The warning, which demands that the Chinese-owned TikTok provide details within 24 hours about how it is combating misinformation online, echoes similar warnings that Thierry Breton issued this week to the social network X and to Facebook parent company Meta.
“Given that your platform is widely used by children and teenagers, you have a special responsibility to protect them from violent content depicting hostage-taking and other gruesome videos that are reported to be widely distributed on your platform without adequate safeguards,” Breton wrote in a letter to TikTok chief executive Shou Zi Chew.
Breton stressed that TikTok and other major internet platforms must comply with the Digital Services Act (DSA), which gives the EU the power to fine platforms up to 6 percent of their global turnover if they fail to act against illegal content.
As he had with Elon Musk, the owner of the X platform, and Mark Zuckerberg, the head of Meta, Breton told Shou Zi Chew that his request was urgent and required a response within 24 hours.
The full text of the letter was published on Breton’s X account and on an account he created this week on Bluesky, a new social network that competes with X.
According to Bellingcat, an investigative journalism website that verifies, investigates and debunks information, misinformation surrounding the Hamas attack on Israel last weekend has increased dramatically.
It documented examples of false or misleading videos being posted on X, TikTok and other platforms.
It also found several such videos on Telegram, which is not yet subject to the DSA. The act will come into full force in February, however, at which point it will apply to smaller platforms as well.
**Interview with Social Media Expert Dr. Emily Chen on TikTok’s Responsibility in Combating Misinformation**
**Interviewer:** Thank you for joining us today, Dr. Chen. Recently, European Commissioner Thierry Breton issued a stern warning to TikTok about its handling of misinformation and violent content, particularly given its young user demographic. What are your thoughts on this demand for immediate transparency?
**Dr. Emily Chen:** Thank you for having me. I think Breton’s demand reflects growing concerns about the safety and well-being of young users on platforms like TikTok. With such a large percentage of their audience being children and teenagers, these platforms do indeed carry a special responsibility to monitor content and prevent the spread of harmful misinformation and violent imagery.
**Interviewer:** Right, and this isn’t just a problem for TikTok; Breton also addressed other platforms, such as X and Meta. Why do you think we are seeing this increased scrutiny across social media platforms now?
**Dr. Emily Chen:** The scrutiny arises from a combination of factors. Firstly, we’re seeing a rise in misinformation that can have real-world consequences, especially during crises like the COVID-19 pandemic. Secondly, incidents involving violent or gruesome content can deeply affect young viewers, leading to emotional distress or desensitization. Regulators are realizing that these platforms cannot be self-regulating; there needs to be external accountability to ensure user safety.
**Interviewer:** TikTok has made efforts to counter misinformation, as noted in their “Countering Harmful Misinformation” guide. How effective do you think these measures are, and what improvements could be made?
**Dr. Emily Chen:** TikTok’s initiatives are a good start, but effectiveness hinges on their implementation and continuous evolution. Their guide aims to connect users to factual information, which is crucial. However, they must also enhance their algorithms to detect violent content more reliably and collaborate with fact-checkers in real time. Furthermore, proactive educational measures for users, especially young ones, about misinformation and online safety would also be beneficial [[1](https://www.tiktok.com/safety/en/harmful-misinformation-guide?sc_version=2024)].
**Interviewer:** Do you believe regulations similar to those proposed by Breton are necessary for protecting users on social media?
**Dr. Emily Chen:** Absolutely. Regulatory frameworks can set minimum standards for safety and accountability. It’s essential for platforms to adhere to these guidelines not just to avoid penalties, but to genuinely protect their users. With tech evolving rapidly, regulations will play a crucial role in ensuring that these platforms serve the public interest, especially for vulnerable groups like children and teenagers.
**Interviewer:** Thank you, Dr. Chen, for your insights on this pressing issue. It’s clear that social media platforms have a long way to go in ensuring a safe online environment.
**Dr. Emily Chen:** Thank you for having me. It’s a vital conversation to have as we navigate the complexities of digital safety.