The warning, which demands that the Chinese-owned TikTok provide details within 24 hours on how it combats misinformation online, echoes similar warnings that Thierry Breton issued this week to the social network X and to Facebook’s parent company, Meta.
“Given that your platform is widely used by children and teenagers, you have a special responsibility to protect them from violent content depicting hostage-taking and other gruesome videos that are reported to be widely distributed on your platform without adequate safeguards,” Breton wrote in a letter to TikTok CEO Shou Zi Chew.
Breton stressed that TikTok and other major internet platforms must comply with the Digital Services Act (DSA), which gives the EU the power to fine platforms up to 6 percent of their global annual turnover if they fail to combat illegal content.
As he did with Elon Musk, the owner of the X platform, and Mark Zuckerberg, the head of Meta, Breton told Shou Zi Chew that his request was urgent and required a response within 24 hours.
The full text of the letter was published on Breton’s X account and on an account he created this week on Bluesky, a new social network that competes with X.
According to Bellingcat, an investigative journalism website that verifies, investigates and debunks information, misinformation surrounding Hamas’s attack on Israel last weekend has increased dramatically.
Bellingcat documented examples of false or misleading videos being posted on X, TikTok and other platforms. It also found several such videos on Telegram, which is not yet subject to the DSA; in February, however, the DSA will come fully into force and apply to smaller platforms as well.
**Interview with Dr. Emily Hart, Digital Media Expert**
**Interviewer:** Thank you for joining us today, Dr. Hart. Recent developments have highlighted concerns surrounding TikTok’s handling of misinformation, particularly given its prevalence among younger users. Thierry Breton has issued a warning demanding that TikTok provide details within 24 hours on how it combats misinformation. What is your take on the gravity of this situation?
**Dr. Hart:** Thank you for having me. The situation is indeed serious. Breton’s warning underscores a growing concern not only about misinformation but also about the types of content that children and teenagers are exposed to on platforms like TikTok. As these platforms have become essential tools for entertainment and information, they carry a significant responsibility to ensure the safety of their younger users.
**Interviewer:** In his letter, Breton mentioned the distribution of violent content, such as gruesome hostage-taking videos. How critical is it for TikTok to address both misinformation and violent content?
**Dr. Hart:** It’s absolutely critical. TikTok, like other social media platforms, must balance user engagement with user safety. The proliferation of violent content not only poses immediate risks to impressionable viewers but also shapes the way young people perceive and engage with reality. Addressing misinformation is part of a broader responsibility to create a safe environment where users can learn and express themselves without exposure to harmful content.
**Interviewer:** TikTok has responded to criticism by stating their efforts to combat misinformation. However, their effectiveness has been questioned. What can you tell us about what research suggests regarding misinformation on platforms like TikTok?
**Dr. Hart:** Research indicates that platforms like TikTok have unique challenges when it comes to misinformation. A recent study found that while users are engaging in “debunking” misinformation, such as watching correction videos, the effectiveness of these videos can vary. Users often still struggle to differentiate between credible sources and misinformation unless clear and consistent guidance is provided [[1](https://misinforeview.hks.harvard.edu/article/how-effective-are-tiktok-misinformation-debunking-videos/)]. This suggests that while user-led debunking can be valuable, it may not be sufficient on its own without robust platform policies and educational resources.
**Interviewer:** What measures do you think TikTok should implement to enhance their responsibility regarding misinformation and violent content?
**Dr. Hart:** I believe TikTok should invest in more comprehensive content moderation tools and hire specialists who can identify and address harmful content proactively. Additionally, they could implement educational initiatives that inform users, particularly younger audiences, about how to critically evaluate the information they encounter online. Partnerships with fact-checking organizations could also enhance their credibility and effectiveness in combating misinformation.
**Interviewer:** Thank you for sharing your insights, Dr. Hart. It’s clear that the responsibility of platforms like TikTok is immense, especially in our digital age.
**Dr. Hart:** Absolutely, and I’m hopeful that with ongoing scrutiny and pressure from stakeholders like Commissioner Breton, we can see meaningful improvements in how these platforms operate. Thank you for having me.