TikTok Updates Community Guidelines In Urdu to Help Maintain a Supportive and Welcoming Environment for Its Pakistan Users

Pakistan is among the top five markets with the largest volume of videos removed from TikTok for violating its Community Guidelines or Terms of Service

KARACHI, August 06, 2020: As the leading global platform for short videos, TikTok has grown increasingly popular in Pakistan by offering a space for fun and creative expression. With that popularity comes the responsibility to keep users safe on the platform. To address this, TikTok has published an updated Urdu translation of its Community Guidelines to help maintain a supportive and welcoming environment for users in Pakistan.

The Community Guidelines provide general guidance on what is and is not allowed on the platform, keeping TikTok a safe place for creativity and joy, and are localized and implemented in accordance with local laws and norms. TikTok's teams remove content that violates the Community Guidelines and suspend or ban accounts involved in severe or repeated violations.
Content is moderated through a combination of policies, technologies, and moderation strategies that detect and review problematic content and accounts and apply appropriate penalties.
Technology: TikTok’s systems automatically flag certain types of content that may violate its Community Guidelines, enabling it to take swift action and reduce potential harm. These systems consider signals such as behavioral patterns to flag potentially violative content.

Content moderation: Technology today is not advanced enough for a platform to rely on it alone to enforce its policies. Context can be important in determining whether certain content, such as satire, is violative. As such, TikTok has a team of trained moderators who help review and remove content. In some cases, this team removes evolving or trending violative content, such as dangerous challenges or harmful misinformation. The team also moderates content based on reports it receives from users: a simple in-app reporting feature lets users flag potentially inappropriate content or accounts to TikTok.

In the recent release of its Transparency Report, TikTok shared the global volume of videos removed for violating its Community Guidelines or Terms of Service, which showed that Pakistan is one of the five markets with the largest volume of removed videos. This demonstrates TikTok's commitment to removing potentially harmful or inappropriate content reported in Pakistan.



