TikTok has deleted more than 2 crore videos of Pakistani users


The short-video-sharing application TikTok deleted 20,207,878 videos from Pakistan and 166,997,307 videos from around the world in the first quarter of 2024.

For the past few years, TikTok has been automatically removing videos around the world for violating its community guidelines, and in the first quarter of 2024 the application deleted more than 20 million videos from Pakistan alone.

According to a statement issued by TikTok, the company, as part of its continuous efforts to promote a safe and positive online environment in Pakistan, removed videos containing inappropriate, obscene and controversial content.

The statement said that during the period from January to March 2024, 166,997,307 videos were deleted worldwide due to TikTok’s proactive measures, which is about 0.9 percent of all videos uploaded to the platform.
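To put these figures in context, here is a rough back-of-the-envelope calculation as a Python sketch using only the numbers quoted above; the implied total upload count is an estimate that assumes the 0.9 percent share refers to all videos uploaded to the platform in the same quarter.

```python
# Context for the Q1 2024 figures quoted in the statement.
# The removal totals come from the article; the implied upload volume
# is an estimate assuming the 0.9% share refers to uploads in the same quarter.

removed_worldwide = 166_997_307   # videos removed globally, Jan-Mar 2024
removed_pakistan = 20_207_878     # videos removed in Pakistan, same period
share_of_uploads = 0.009          # "about 0.9 percent of all videos uploaded"

# Rough estimate of total uploads implied by the 0.9% figure.
implied_total_uploads = removed_worldwide / share_of_uploads

# Pakistan's share of all removals worldwide.
pakistan_share = removed_pakistan / removed_worldwide

print(f"Implied total uploads: ~{implied_total_uploads / 1e9:.1f} billion")   # ~18.6 billion
print(f"Pakistan's share of removals: {pakistan_share:.1%}")                  # ~12.1%
```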

TikTok also deleted or filtered 976,479,946 comments using its comment-safety tools during the same three-month period.

In Pakistan, the application removed 20,207,878 videos for violating community guidelines during the first three months of the year.

Notably, according to the company, nearly 93.9 percent of videos violating the guidelines were deleted within just 24 hours of being posted.

To protect young users, TikTok also deleted 21,639,414 accounts of people under the age of 13 worldwide, the company said.


Interview with Sarah Malik, Digital Media Analyst

Editor: Sarah, thank you for joining us today. TikTok recently reported the deletion of over 20 million videos in Pakistan and nearly 167 million globally in the first quarter of 2024, mainly for violating community guidelines. What do you think this implies about TikTok’s approach to moderation and user safety?

Sarah Malik: Thank you for having me. This massive volume of deletions indicates a serious commitment from TikTok to maintain a safe environment for its users. However, it does raise questions about the effectiveness of their moderation tools. Is automatic deletion sufficient, or do they need more human oversight?

Editor: That’s a good point. The company noted that 93.9% of violating videos were removed within 24 hours. How do you interpret this level of efficiency in content moderation?

Sarah Malik: It’s impressive, but it also leads to a discussion on transparency. Users might feel their content is unjustly censored if they don’t fully understand the moderation criteria. This level of prompt action might also discourage genuine content creation, as it’s hard to navigate the unclear guidelines.

Editor: In the context of user demographics, TikTok deleted over 21 million accounts belonging to users under 13. Do you believe TikTok’s age verification measures are sufficient?

Sarah Malik: The deletion of accounts is a step in the right direction, but it raises concerns. It’s critical to engage and protect younger users, but are we doing enough to educate them about responsible social media use? This is a topic that needs more public discourse.

Editor: With all this in mind, do you think TikTok is succeeding in creating a positive online space, or are they risking alienating their user base?

Sarah Malik: That’s the crux of the debate. While safety is paramount, there must be a balance. If users feel their voices are being stifled, they might seek other platforms that value freedom of expression more.

Editor: Returning to that removal speed for a moment: the company says 93.9% of videos violating the guidelines were deleted within 24 hours of posting. Do you think this rapid response is a sign of effective moderation, or could it also mean that many legitimate videos are being removed erroneously?

Sarah Malik: It’s a mixed bag. On one hand, the swift action suggests that TikTok’s algorithms are effective at identifying potentially harmful content before it spreads. On the other hand, the high rate of deletion also raises concerns about false positives. Creators might feel targeted if their legitimate content is removed due to misunderstandings by the algorithm. It’s crucial for TikTok to refine their systems to minimize these errors.

Editor: On younger users again: TikTok reported deleting over 21 million accounts of users under the age of 13 globally. What are your thoughts on how these measures impact the platform’s younger demographic?

Sarah Malik: Protecting young users is essential, especially on a platform like TikTok where they can be exposed to inappropriate content. However, while deleting accounts under 13 is a step in the right direction, it raises questions about how effectively TikTok can enforce age restrictions. More proactive educational initiatives might help parents better navigate their children’s interactions online, as simply removing accounts doesn’t prevent underage access.

Editor: Absolutely. Lastly, with TikTok purging such a significant number of videos, what implications do you think this has for the creative community and overall content diversity?

Sarah Malik: There’s a clear risk of stifling creativity if users feel that their content is consistently at risk of deletion. It may lead to a more homogenized platform where creators play it safe to avoid triggering removals. To foster a thriving creative community, TikTok must strike a balance between safety and freedom of expression. Investing in better moderation tools and transparent guidelines can help encourage a more diverse range of content while still protecting users.

Editor: Thank you, Sarah, for your insightful analysis of TikTok’s recent actions and the implications for users and creators alike. Your expertise is greatly appreciated! And to our readers: what do you think about TikTok’s approach to video deletion and user account management? Are safety measures overshadowing creative expression on social media? Let’s hear your thoughts!

Sarah Malik: Thank you for having me!
