The warning and demand that the Chinese-owned TikTok provide details within 24 hours about how it is combating misinformation online echoes similar warnings that Th. Breton issued this week to the social network X and to Facebook parent company Meta.
“Given that your platform is widely used by children and teenagers, you have a special responsibility to protect them from violent content depicting hostage-taking and other gruesome videos that are reported to be widely distributed on your platform without adequate safeguards,” Th. Breton wrote in a letter to the head of TikTok, Shou Zi Chew.
Th. Breton stressed that TikTok and other major internet platforms must comply with the Digital Services Act (DSA), which gives the EU the power to fine platforms up to 6 percent of their global turnover if they fail to combat illegal content.
As with Elon Musk, the owner of the X platform, and Mark Zuckerberg, the head of Meta, Th. Breton told Shou Zi Chew that his request was urgent and required a response within the next 24 hours.
The full text of the letter was published on Th. Breton’s X account and on an account created this week on the new social network Bluesky, which is a competitor to X.
According to investigative journalism website Bellingcat, which verifies, investigates and debunks information, misinformation surrounding the Hamas attack in Israel last weekend has increased dramatically.
It documented examples of false or misleading videos being posted on X, TikTok and other platforms.
It also found several such videos on Telegram, which is not yet subject to the DSA. In February, however, the DSA will come into full force and apply to smaller platforms as well.
**Interview with Dr. Emily Rosen, Digital Media Expert**
**Interviewer:** Thank you for joining us today, Dr. Rosen. Recently, European Commissioner Thierry Breton issued a warning to TikTok, demanding they provide information on combating misinformation, particularly highlighting the platform’s responsibility to protect young users. What are your thoughts on this situation?
**Dr. Rosen:** Thank you for having me. This is a critical moment for social media platforms like TikTok, particularly because of their significant user base among children and teenagers. As we’ve seen, misinformation is rampant on these platforms, affecting how young people perceive reality. Breton’s demands reflect a growing concern that these companies must take their responsibilities seriously, especially in terms of safeguarding young users from harmful content.
**Interviewer:** Breton’s letter emphasized that TikTok has a special responsibility due to its young audience. Can you elaborate on why this demographic is particularly vulnerable?
**Dr. Rosen:** Absolutely. Children and teenagers are in a developmental stage where they are still forming their understanding of the world. They may not have the critical thinking skills to discern fact from fiction effectively. This makes them more susceptible to misinformation and disturbing content, such as violence or graphic depictions that are sometimes shared widely without proper moderation [[1](https://www.cnn.com/2022/09/18/business/tiktok-search-engine-misinformation/index.html)].
**Interviewer:** What specific measures should TikTok implement to address these concerns?
**Dr. Rosen:** TikTok should enhance its content moderation processes significantly. This includes employing advanced AI tools to detect and remove violent or misleading content before it reaches users. Additionally, they should prioritize transparency by providing clear guidelines on how they manage and verify information, as well as collaborating with fact-checking organizations to minimize misinformation [[1](https://www.cnn.com/2022/09/18/business/tiktok-search-engine-misinformation/index.html)].
**Interviewer:** Do you think the pressure from regulators like Thierry Breton will lead to meaningful change in how TikTok operates?
**Dr. Rosen:** Yes, I believe it can lead to significant changes, especially if similar actions are taken by lawmakers worldwide. Strong regulation can compel platforms to be more accountable and proactive in managing content. It’s essential for the future of digital media and for protecting the younger generations who are more vulnerable to its challenges [[1](https://www.cnn.com/2022/09/18/business/tiktok-search-engine-misinformation/index.html)].
**Interviewer:** Thank you, Dr. Rosen, for your insights. It’s clear that the conversation around social media’s responsibilities is more important than ever.