TikTok has introduced updated Community Guidelines designed to keep users safe and protect the integrity of the platform.
The updated guidelines detail the types of content and user behavior that will be removed from the platform or that are not eligible for the recommendation feed.
The main updates, which will roll out over the coming weeks, are:
- Strengthening TikTok's policy on dangerous activities and challenges. Previously, this policy was part of the "Suicide, self-harm and dangerous activities" section of the Community Guidelines; it will now become a standalone category with more detailed information, so users can quickly and easily review TikTok's rules in this area.
- Expanding interventions for eating disorders. The platform already removes content that promotes eating disorders, and will soon also remove content that promotes disordered eating more broadly.
- Increasing transparency around hate speech prohibited on the platform. The Community Guidelines will now specify the types of hateful behavior that are banned, including misogyny and content that supports or promotes conversion therapy programs.
- Extending policies that protect the platform's security, integrity, availability, and reliability. This clause prohibits unauthorized access to TikTok and to the service's content, accounts, systems, or data, as well as the use of TikTok to commit criminal activity.
Transparency toward users is one of the platform's key principles. Users will now be prompted to review the updated Community Guidelines when they launch the TikTok app, according to SearchEngines.
NIXSolutions adds that alongside the updated Community Guidelines, TikTok also released its Community Guidelines Enforcement Report for Q3 2021. According to the report, more than 91 million videos that violated the Community Guidelines were removed during that period, about 1% of all videos uploaded. Of these, 95.1% were removed before users reported a violation, 88.8% were removed before receiving a single view, and 93.9% were removed within 24 hours of posting.
As a reminder, in December TikTok launched a "Transparency Hub" that collects all of its reports on community guidelines enforcement, law enforcement requests for information, government takedown requests, and content takedown requests from copyright holders.