The warning, and the demand that the Chinese-owned TikTok provide details within 24 hours about how it is combating misinformation online, echoes similar warnings that Thierry Breton issued this week to the social network X and to Facebook parent company Meta.
“Given that your platform is widely used by children and teenagers, you have a special responsibility to protect them from violent content depicting hostage-taking and other gruesome videos that are reported to be widely distributed on your platform without adequate safeguards,” Breton wrote in a letter to TikTok chief executive Shou Zi Chew.
Breton stressed that TikTok and other major internet platforms must comply with the Digital Services Act (DSA), which gives the EU the power to fine platforms up to 6 percent of their global annual turnover if they fail to act against illegal content.
As he had with Elon Musk, the owner of the X platform, and Mark Zuckerberg, the head of Meta, Breton told Shou Zi Chew that his request was urgent and required a response within 24 hours.
The full text of the letter was published on Breton’s X account and on an account he created this week on Bluesky, a new social network that competes with X.
According to Bellingcat, an investigative journalism website that verifies, investigates, and debunks information, misinformation surrounding the Hamas attack on Israel last weekend has increased dramatically.
It documented examples of false or misleading videos being posted on X, TikTok and other platforms.
It also found several such videos on Telegram, which is not yet subject to the DSA; in February, however, the DSA comes into full force and will apply to smaller platforms as well.
How can social media platforms like TikTok improve their content moderation practices to protect young users?
**Interview with Misinformation Expert: Dr. Emily Carter**
**Editor:** Today we’re discussing the recent demands placed on TikTok by EU Commissioner Thierry Breton regarding their handling of misinformation and violent content, particularly in light of their large audience of young users. I’m joined by Dr. Emily Carter, an expert in digital media and misinformation. Dr. Carter, thank you for being here.
**Dr. Carter:** Thank you for having me!
**Editor:** To start, can you summarize the key concerns raised by Thierry Breton in his letter to TikTok?
**Dr. Carter:** Absolutely. Breton expressed significant concern over the presence of violent and disturbing content on TikTok, especially considering its popularity among children and teenagers. He emphasized that platforms like TikTok have a special responsibility to protect vulnerable users from harmful content, specifically the gruesome videos that have been reported to circulate widely without proper safeguards. He demanded that TikTok provide detailed information on how they are addressing misinformation and the dissemination of violent material within a 24-hour timeframe.
**Editor:** What implications do you think this could have for TikTok and its operations?
**Dr. Carter:** This demand could lead to increased scrutiny of TikTok’s content moderation practices and may pressure the platform to enhance its efforts in combating misinformation and harmful content. If TikTok fails to comply or does not clearly demonstrate adequate safeguards, it might face regulatory penalties or lose credibility with users and regulators alike. There’s a broader trend here; as platforms are called to account for user safety, we may see a shift toward stricter regulations across social media.
**Editor:** Have you seen similar actions against other social media platforms, and how do you think they compare to TikTok’s situation?
**Dr. Carter:** Yes, this isn’t an isolated incident. Recently, Breton has also reached out to other platforms like X (formerly Twitter) and Meta, demanding similar transparency and accountability. Each platform is grappling with the challenge of managing vast amounts of user-generated content while ensuring the safety of its audience. Unlike the more established players, TikTok is still carving out its role in content moderation, and the urgency expressed by Breton highlights the unique challenges it faces given its demographic.
**Editor:** What measures do you think TikTok should take to address these concerns effectively?
**Dr. Carter:** TikTok must prioritize developing sophisticated content moderation tools and enhancing their existing algorithms to better detect and filter harmful content. They should also increase human oversight to address nuanced issues that algorithms might miss. Additionally, transparency is crucial; TikTok should regularly publish reports on their content moderation practices and the effectiveness of their measures against misinformation and violent content. Engaging with fact-checkers and collaborating with organizations that specialize in misinformation could also help bolster their credibility.
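*[Editor’s note: Dr. Carter’s point about pairing automated detection with human oversight maps onto a common engineering pattern sometimes called confidence-threshold routing: a model scores each piece of content, high-confidence cases are handled automatically, and the ambiguous middle band is escalated to human moderators. The sketch below is purely illustrative, with hypothetical names and thresholds; it does not describe TikTok’s actual systems.]*

```python
from dataclasses import dataclass
from enum import Enum, auto

class Decision(Enum):
    APPROVE = auto()
    REMOVE = auto()
    HUMAN_REVIEW = auto()

@dataclass
class Video:
    video_id: str
    harm_score: float  # 0.0 (benign) .. 1.0 (clearly violating), from an upstream model

# Hypothetical thresholds: scores in the uncertain middle band are escalated
# to human moderators -- the "nuanced issues that algorithms might miss".
AUTO_REMOVE_THRESHOLD = 0.9
AUTO_APPROVE_THRESHOLD = 0.2

def route(video: Video) -> Decision:
    """Route a scored video to an automated action or to human review."""
    if video.harm_score >= AUTO_REMOVE_THRESHOLD:
        return Decision.REMOVE        # high-confidence violation: act immediately
    if video.harm_score <= AUTO_APPROVE_THRESHOLD:
        return Decision.APPROVE       # high-confidence benign: publish
    return Decision.HUMAN_REVIEW      # ambiguous: defer to a human moderator

if __name__ == "__main__":
    for v in [Video("a", 0.95), Video("b", 0.05), Video("c", 0.55)]:
        print(v.video_id, route(v).name)
```

*[Where the thresholds sit is a policy choice as much as a technical one: lowering the review band’s floor sends more content to humans, which improves accuracy on edge cases at the cost of moderator workload.]*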
**Editor:** It sounds like there’s a lot at stake here. Thank you, Dr. Carter, for your insights on this pressing issue.
**Dr. Carter:** My pleasure! It’s an important conversation that needs ongoing attention as social media continues to evolve.
—
This interview highlights key issues surrounding TikTok’s responsibility for user safety and the broader implications of misinformation management in the digital landscape.