The warning, which demands that Chinese-owned TikTok provide details within 24 hours about how it is combating misinformation online, echoes similar warnings that Thierry Breton sent this week to the social network X and to Facebook parent company Meta.
“Given that your platform is widely used by children and teenagers, you have a special responsibility to protect them from violent content depicting hostage-taking and other gruesome videos that are reported to be widely distributed on your platform without adequate safeguards,” Breton wrote in a letter to TikTok CEO Shou Zi Chew.
Breton stressed that TikTok and other major internet platforms must comply with the Digital Services Act (DSA), which gives the EU the power to fine platforms up to 6 percent of their global annual turnover if they fail to combat illegal content.
As with X owner Elon Musk and Meta chief Mark Zuckerberg, Breton told Shou Zi Chew that his request was urgent and required a response within 24 hours.
The full text of the letter was published on Breton’s X account and on an account he created this week on Bluesky, a new social network that competes with X.
According to Bellingcat, an investigative journalism website that verifies, investigates and debunks information, misinformation surrounding last weekend’s Hamas attack on Israel has increased dramatically.
Bellingcat documented examples of false or misleading videos posted on X, TikTok and other platforms. It also found several such videos on Telegram, which is not yet subject to the DSA; in February, however, the DSA comes into full force and will apply to smaller platforms as well.
What are the main concerns regarding misinformation and harmful content on TikTok, especially for young users?
**Interview with Dr. Emily Roberts, Digital Media Analyst**
**Editor:** Thank you for joining us today, Dr. Roberts. Recent news highlights a letter from Thierry Breton to TikTok demanding greater transparency about how the platform is tackling misinformation and harmful content. What prompted this call for action?
**Dr. Roberts:** Thank you for having me. The growing concerns stem from the influence of social media on younger audiences, particularly on platforms like TikTok, which have a significant number of users who are children and teenagers. Breton’s warning reflects a broader issue regarding the responsibility these platforms have to protect their young users from not just misinformation, but also from violent and disturbing content, such as videos depicting hostage situations, which can be harmful to impressionable minds.
**Editor:** What specific actions is TikTok taking in response to these concerns?
**Dr. Roberts:** TikTok has acknowledged these concerns and, as reported, has increased its investment in countering misinformation, especially with the upcoming UK general election in mind. They’ve added a fact-checking expert and are employing AI technologies to help filter out misinformation. However, the effectiveness of these measures remains to be seen, particularly in real-world scenarios where violent or distressing content can rapidly spread [[1](https://www.bbc.co.uk/news/articles/c1ww6vz1l81o)].
**Editor:** In your view, how can platforms like TikTok improve their safeguards against such content?
**Dr. Roberts:** There are several strategies that would be beneficial. Firstly, enhancing the algorithms to better identify and flag harmful content before it goes viral is crucial. Transparency in how these algorithms work is also important, as is involving independent third-party reviewers to audit their processes. Additionally, engaging directly with educational campaigns for users—especially younger ones—about recognizing misinformation and violence online can help foster a more informed user base.
**Editor:** What implications does this situation have for the overall landscape of social media regulation?
**Dr. Roberts:** This could signal a shift toward stricter regulations in the European Union and beyond. As guardians of young users, platforms may face increased scrutiny and requirements to comply with safety standards. If they fail to address these issues adequately, we could see more regulatory actions, potentially including fines or restrictions on how these platforms operate. The pressure is mounting for these companies to take their responsibilities seriously in safeguarding their users [[1](https://www.bbc.co.uk/news/articles/c1ww6vz1l81o)].
**Editor:** Thank you, Dr. Roberts, for your insights on this pressing issue. It’s clear that navigating the balance between free expression and user safety is becoming increasingly complex for platforms like TikTok.