TikTok has removed more than 334,000 videos from its platform in Kenya for violating community guidelines, as part of its ongoing efforts to maintain a safer online environment.
TikTok's commitment to content moderation
The revelation came from TikTok’s latest Q3 2024 Community Guidelines Enforcement Report, released on Thursday, which details the company’s proactive approach to content moderation.
The report indicates that 88.9% of the videos were removed before they could be viewed by others, with 93% taken down within 24 hours of publishing.
TikTok stated that the move aligns with its mission to ensure user safety and maintain the integrity of interactions on its platform.
The company has faced growing scrutiny over the nature of content shared, with the Kenyan government previously accusing the app of promoting misinformation, fraud, and explicit material.
Integrity and authenticity lead content violations
The report highlighted that the highest number of removals in Kenya fell under the Integrity and Authenticity category.
TikTok disclosed that 99.7% of videos violating this policy were removed before any user reported them, with 94% taken down within 24 hours.
This reflects TikTok’s focus on countering disinformation and fraudulent activities, ensuring that Kenyan users can engage in a more trustworthy digital space.
Another major category flagged in the report was Mental and Behavioural Health. TikTok revealed that 99.9% of content violating this policy was removed before being reported, with 96.4% taken down within 24 hours.
The platform emphasised that these efforts are crucial in protecting users, particularly younger audiences, from content that could negatively impact their mental well-being.
Youth safety and sensitive content
TikTok also detailed its commitment to youth safety and the removal of inappropriate material.
According to the report, 99.7% of content that violated youth safety policies was removed before any views, with 94.3% taken down within 24 hours.
Similarly, 99.5% of sensitive and mature-themed videos were removed before users could report them, with 95.8% deleted within 24 hours.
These stringent moderation measures come in response to growing concerns from Kenyan authorities about TikTok’s influence.
Government officials have criticised the platform for allegedly facilitating the spread of misleading content and exposing users to explicit material.
TikTok, however, maintains that its enforcement measures are designed to combat such issues effectively.
Strengthening online safety
With the removal of over 334,000 videos, TikTok has reinforced its commitment to creating a safer digital environment in Kenya.