The warning, and the demand that the Chinese-owned TikTok provide details within 24 hours about how it combats misinformation online, echoes similar warnings that Thierry Breton issued this week to the social network X and to Facebook parent company Meta.
“Given that your platform is widely used by children and teenagers, you have a special responsibility to protect them from violent content depicting hostage-taking and other gruesome videos that are reported to be widely distributed on your platform without adequate safeguards,” Breton wrote in a letter to TikTok CEO Shou Zi Chew.
Breton stressed that TikTok and other major internet platforms must comply with the Digital Services Act (DSA), which gives the EU the power to fine platforms up to 6 percent of their global annual turnover if they fail to fight illegal content.
As he did with Elon Musk, the owner of the X platform, and Mark Zuckerberg, the head of Meta, Breton told Shou Zi Chew that his request was urgent and required a response within 24 hours.
The full text of the letter was published on Breton’s X account and on an account he created this week on Bluesky, a new social network that competes with X.
According to Bellingcat, an investigative journalism website that verifies, investigates and debunks information, misinformation surrounding the Hamas attack in Israel last weekend has increased dramatically.
It documented examples of false or misleading videos being posted on X, TikTok and other platforms.
It also found several such videos on Telegram, which is not yet subject to the DSA. In February, however, the DSA will come into full force and apply to smaller platforms as well.
**Interview with Dr. Emily Carter, Digital Media Expert**
**Interviewer:** Thank you for joining us today, Dr. Carter. Recently, European Commissioner Thierry Breton issued a stern warning to TikTok, demanding transparency about its measures to combat misinformation and protect younger users. Can you explain why this statement is particularly significant?
**Dr. Carter:** Absolutely, thank you for having me. Commissioner Breton’s warning highlights the serious responsibilities that platforms like TikTok have, especially when their primary users are children and teenagers. The call for transparency is crucial as it illustrates broader concerns about how social media handles misinformation, particularly violent content, which can have severe implications for young users’ mental health and safety.
**Interviewer:** Breton mentioned that TikTok should provide details within 24 hours about its actions against misinformation. How realistic is this demand, and what implications might it have?
**Dr. Carter:** The 24-hour timeframe is quite ambitious, but it signals an urgent need for accountability in how these platforms manage content. If TikTok is unable to provide clear strategies or mechanisms for mitigating misinformation, it raises questions about their operational integrity and potentially their compliance with regulations like the EU’s Digital Services Act. This could lead to stricter regulations or penalties from regulators.
**Interviewer:** TikTok has been under scrutiny for various issues, including privacy and data protection concerns. How does Breton’s letter fit into this larger narrative?
**Dr. Carter:** Breton’s letter is part of an ongoing dialogue about the responsibility of social media companies in safeguarding users. With increasing evidence of harmful content being widely shared, it reflects growing impatience from regulators for more proactive measures from companies like TikTok. These demands are not just about misinformation; they also intersect with broader concerns regarding user safety and mental health, which are particularly important for young audiences who may be more impressionable.
**Interviewer:** What steps could TikTok and similar companies take to better protect their users?
**Dr. Carter:** Transparency is a good start, but TikTok needs to enhance its content moderation strategies. This could involve investing in better AI tools to detect and limit harmful content, increasing human oversight in content moderation, and collaborating with child psychologists to understand the impacts of violent content on younger users. Additionally, they should engage with parents and educators to improve communication about online safety.
**Interviewer:** Thank you, Dr. Carter, for your insights on this pressing issue.
**Dr. Carter:** Thank you for the opportunity to discuss this important topic!