The warning, which demands that the Chinese-owned TikTok provide details within 24 hours about how it is combating misinformation online, echoes similar warnings that Thierry Breton issued this week to the social network X and to Facebook parent company Meta.
“Given that your platform is widely used by children and teenagers, you have a special responsibility to protect them from violent content depicting hostage-taking and other gruesome videos that are reported to be widely distributed on your platform without adequate safeguards,” Breton wrote in a letter to TikTok CEO Shou Zi Chew.
Breton stressed that TikTok and other major internet platforms must comply with the Digital Services Act (DSA), which gives the EU the power to fine platforms up to 6 percent of their global annual turnover if they fail to fight illegal content.
As with Elon Musk, the owner of the X platform, and Mark Zuckerberg, the head of Meta, Breton told Shou Zi Chew that his request was urgent and required a response within 24 hours.
The full text of the letter was published on Breton’s X account and on an account created this week on the new social network Bluesky, a competitor to X.
According to Bellingcat, an investigative journalism website that verifies, investigates and debunks information, misinformation surrounding the Hamas attack on Israel last weekend has increased dramatically.
It documented examples of false or misleading videos being posted on X, TikTok and other platforms.
It also found several such videos on Telegram, which is not yet subject to the DSA. In February, however, the DSA will come into full force and apply to smaller platforms as well.
**Interview with Dr. Lisa Tran, Media Literacy Expert**
**Editor:** Welcome, Dr. Tran. Recent developments have put TikTok in the spotlight regarding its handling of misinformation and violent content. European Commissioner Thierry Breton has issued a formal demand for TikTok to disclose its strategies for combating misinformation. What are your thoughts on this situation?
**Dr. Tran:** Thank you for having me. This situation highlights a significant concern about social media platforms, especially TikTok, which has a substantial user base among children and teenagers. Breton’s request reflects ongoing worries about the potential harm these platforms can cause to young users by exposing them to manipulated videos and harmful content.
**Editor:** We’ve seen similar warnings issued to other major platforms like X and Meta. Why do you think these demands are being increasingly directed at TikTok?
**Dr. Tran:** TikTok has rapidly gained popularity among younger demographics, making it a unique challenge for regulators. Its algorithm promotes content virally, which can inadvertently spread misinformation and violent imagery without adequate checks. This makes it crucial for TikTok to be transparent about its measures to protect users, as it has a moral responsibility given its audience.
**Editor:** Breton’s letter emphasizes that TikTok has “a special responsibility” in this regard. Can you elaborate on what you believe these responsibilities entail?
**Dr. Tran:** Absolutely. Platforms like TikTok must implement robust content moderation practices, utilize advanced AI tools to identify and filter out violent or misleading content, and create educational initiatives to inform young users about misinformation. There should also be clear reporting mechanisms for harmful content that are accessible to all users.
**Editor:** The insistence on a 24-hour response time is quite pressing. What impact do you think this urgency will have on TikTok’s operations?
**Dr. Tran:** This kind of urgency can often catalyze action. TikTok may need to quickly assemble data and strategies to demonstrate responsiveness. However, it also risks leading to rushed solutions that might not effectively address the underlying issues. The focus should be on sustainable practices rather than just meeting short-term compliance demands.
**Editor:** How do you foresee the future of TikTok and similar platforms if they fail to address these concerns adequately?
**Dr. Tran:** If TikTok and others do not take significant steps to combat misinformation and protect their users, we could see increased regulatory pressures, potential bans in certain regions, or a decline in user trust. Over time, this could change how younger audiences engage with these platforms, redirecting them to safer, more moderated spaces.
**Editor:** Thank you, Dr. Tran, for sharing your insights on this pressing issue. It’s clear that the topic of misinformation on social media remains critical as platforms evolve.
**Dr. Tran:** Thank you for having me. I hope this conversation encourages more users and parents to engage critically with the content on social media platforms.