The warning, and the demand that Chinese-owned TikTok provide details within 24 hours about how it is combating misinformation online, echoes similar warnings that Thierry Breton issued this week to the social network X and to Facebook's parent company Meta.
“Given that your platform is widely used by children and teenagers, you have a special responsibility to protect them from violent content depicting hostage-taking and other gruesome videos that are reported to be widely distributed on your platform without adequate safeguards,” Breton wrote in a letter to TikTok CEO Shou Zi Chew.
Breton stressed that TikTok and other major internet platforms must comply with the Digital Services Act (DSA), which gives the EU the power to fine platforms up to 6 percent of their global annual turnover if they fail to act against illegal content.
As in his letters to Elon Musk, the owner of X, and Mark Zuckerberg, the head of Meta, Breton told Shou Zi Chew that his request was urgent and required a response within the next 24 hours.
The full text of the letter was published on Breton's X account and on an account he created this week on Bluesky, a new social network that competes with X.
According to Bellingcat, an investigative journalism website that verifies, investigates and debunks information, misinformation surrounding the Hamas attack in Israel last weekend has increased dramatically.
It documented examples of false or misleading videos being posted on X, TikTok and other platforms.
Bellingcat also found several such videos on Telegram, which is not yet subject to the DSA. In February, however, the DSA will come into full force and will also apply to smaller platforms.
**Interview with Social Media Expert Dr. Emily Chen on TikTok and Misinformation**
**Interviewer:** Thank you for joining us today, Dr. Chen. Recently, EU Commissioner Thierry Breton issued a warning to TikTok regarding its handling of misinformation and violent content on the platform. Can you provide some context about why this is such a pressing issue now?
**Dr. Chen:** Thank you for having me. Concern around TikTok has escalated significantly, especially given its vast audience, a large share of which consists of minors. As Breton pointed out, platforms like TikTok have a unique responsibility to protect younger users from harmful content, including manipulated videos that can spread misinformation. The increasing prevalence of deepfakes and graphic violence can undermine public trust and even influence political sentiment, particularly with events like elections approaching [[1](https://www.nytimes.com/2022/11/04/technology/tiktok-deepfakes-disinformation.html)].
**Interviewer:** In his letter to TikTok’s CEO Shou Zi Chew, Breton emphasized the platform’s duty to safeguard its users. What specific steps do you think TikTok should take in response to these warnings?
**Dr. Chen:** TikTok needs to enhance its content moderation and develop stronger algorithms to detect and mitigate harmful content swiftly. Transparency is also crucial; they should provide clear insights into how they identify and handle misinformation. Additionally, implementing educational initiatives for users about misinformation and its dangers could empower the community to critically assess the content they encounter [[1](https://www.nytimes.com/2022/11/04/technology/tiktok-deepfakes-disinformation.html)].
**Interviewer:** Breton’s letter indicates a broader trend of scrutiny facing social media platforms. How do you view the role of regulators in managing online content?
**Dr. Chen:** Regulators like Breton play a vital role in holding these platforms accountable. Given the rapid evolution of technology and the potential risks it poses, especially for younger audiences, proactive regulation is essential. This could include enforcing stricter guidelines for content moderation and requiring companies to report on their misinformation strategies regularly [[1](https://www.nytimes.com/2022/11/04/technology/tiktok-deepfakes-disinformation.html)].
**Interviewer:** Finally, what message do you think this situation sends to both TikTok and its competitors, Meta and X?
**Dr. Chen:** The message is clear: platforms must prioritize user safety over engagement metrics. If they fail to take responsibility for the content circulated on their sites, they may face increased regulatory pressure and loss of trust from users. It’s a crucial moment for social media companies to redefine their commitment to protecting their audiences, especially vulnerable ones like children and teenagers [[1](https://www.nytimes.com/2022/11/04/technology/tiktok-deepfakes-disinformation.html)].
**Interviewer:** Thank you, Dr. Chen, for your insights into this important issue. We hope to see meaningful changes in how platforms like TikTok address these challenges.
**Dr. Chen:** Thank you for having me. It’s an important conversation, and I look forward to seeing how this evolves.