The warning, which demands that the Chinese-owned TikTok provide details within 24 hours about how it is combating misinformation online, echoes similar warnings that Thierry Breton sent this week to the social network X and to Facebook parent company Meta.
“Given that your platform is widely used by children and teenagers, you have a special responsibility to protect them from violent content depicting hostage-taking and other gruesome videos that are reported to be widely distributed on your platform without adequate safeguards,” Breton wrote in a letter to TikTok CEO Shou Zi Chew.
Breton stressed that TikTok and other major internet platforms must comply with the Digital Services Act (DSA), which gives the EU the power to fine platforms up to 6 percent of their global annual turnover if they fail to act against illegal content.
As with Elon Musk, the owner of the X platform, and Mark Zuckerberg, the head of Meta, Breton told Shou Zi Chew that his request was urgent and required a response within 24 hours.
The full text of the letter was published on Breton’s X account and on an account he created this week on Bluesky, a new social network that competes with X.
According to Bellingcat, an investigative journalism website that verifies, investigates, and debunks information, misinformation surrounding the Hamas attack in Israel last weekend has increased dramatically. Bellingcat documented examples of false or misleading videos posted on X, TikTok, and other platforms. It also found several such videos on Telegram, which is not yet subject to the DSA; in February, however, the DSA will come into full force and apply to smaller platforms as well.
**Interview with Media Analyst Dr. Laura Green on TikTok’s Responsibility in Combating Misinformation**
**Interviewer:** Thank you for joining us today, Dr. Green. Recently, European Commissioner Thierry Breton issued a warning to TikTok regarding its handling of misinformation and violent content. What are your thoughts on the urgency of this demand?
**Dr. Green:** Thank you for having me. I think the urgency of Commissioner Breton’s demand is very much justified. With platforms like TikTok being immensely popular among children and teenagers, it’s crucial for them to implement robust measures against misinformation and harmful content. The responsibility they bear cannot be overstated, as many young users may not have the sophistication to critically evaluate the content they encounter.
**Interviewer:** Breton emphasized the need for TikTok to provide details on their safeguards within 24 hours. How effective do you think such a short timeframe will be in prompting real change?
**Dr. Green:** A 24-hour deadline is certainly an intense request, and while it may seem rushed, it signals to TikTok that regulators are serious about accountability. However, such a short timeframe may not be sufficient for a comprehensive review of their practices. It may prompt an immediate response, but real change requires ongoing commitment and adequate time for implementation.
**Interviewer:** In his letter, Breton mentioned the distribution of violent content, such as hostage-taking videos. What challenges does TikTok face in moderating such content?
**Dr. Green:** TikTok faces significant challenges in content moderation due to the sheer volume of uploads. Their algorithm, while sophisticated, can struggle to differentiate between harmful content and harmless posts, especially when it comes to nuanced issues like humor or commentary. Moreover, the global nature of the platform means that regulatory standards may vary widely, complicating enforcement efforts.
**Interviewer:** What can TikTok do to improve its response to these concerns?
**Dr. Green:** TikTok needs to invest more heavily in both content moderation technologies and human resources. This includes employing more content reviewers who are well trained to recognize violent content and misinformation. The company should also enhance transparency by clearly communicating its guidelines and the steps taken when content is flagged. Collaborating with independent organizations to audit its systems could further build trust among users and regulators.
**Interviewer:** Do you believe that these warnings from regulators like Breton will help to elevate the standards of content moderation across social media platforms?
**Dr. Green:** Absolutely. When high-profile figures like Commissioner Breton make these demands, it raises awareness and puts pressure on social media companies to act responsibly. This kind of scrutiny can drive improvements not just in TikTok, but across the board, as other platforms will be keen to avoid similar criticisms. It’s about setting a precedent for accountability in the digital space.
**Interviewer:** Thank you, Dr. Green, for sharing your insights on this critical issue.
**Dr. Green:** My pleasure. Thank you for having me.