The warning, and the demand that the Chinese-owned TikTok provide details within 24 hours about how it combats misinformation online, echoes similar warnings that Thierry Breton issued this week to the social network X and to Facebook parent company Meta.
“Given that your platform is widely used by children and teenagers, you have a special responsibility to protect them from violent content depicting hostage-taking and other gruesome videos that are reported to be widely distributed on your platform without adequate safeguards,” Breton wrote in a letter to TikTok CEO Shou Zi Chew.
Breton stressed that TikTok and other major internet platforms must comply with the Digital Services Act (DSA), which gives the EU the power to fine platforms up to 6 percent of their global annual turnover if they fail to combat illegal content.
As with Elon Musk, the owner of the X platform, and Mark Zuckerberg, the head of Meta, Breton told Shou Zi Chew that his request was urgent and required a response within 24 hours.
The full text of the letter was published on Breton’s X account and on an account he created this week on Bluesky, a new social network that competes with X.
According to Bellingcat, an investigative journalism website that verifies, investigates and debunks information, misinformation surrounding Hamas’s attack on Israel last weekend has increased dramatically.
It documented examples of false or misleading videos being posted on X, TikTok and other platforms.
It also found several such videos on Telegram, which is not yet subject to the DSA; the law comes into full force in February, when it will also apply to smaller platforms.
### Interview with Dr. Emily Carter on TikTok’s Responsibility in Combating Misinformation
**Editor:** Today, we are joined by Dr. Emily Carter, a social media expert and child psychology researcher, to discuss the recent warnings issued to TikTok and other social media platforms regarding misinformation and the safety of young users. Thank you for being here, Dr. Carter.
**Dr. Carter:** Thank you for having me!
**Editor:** Recently, European Commissioner Thierry Breton demanded that TikTok provide clarity on how it addresses misinformation, particularly in light of concerns about violent content circulating on the platform. What are your thoughts on this demand?
**Dr. Carter:** It’s a critical step. Social media platforms, especially ones like TikTok that are heavily used by children and teenagers, must take their responsibility seriously. With engaging short videos, it can be easy for harmful content to slip through the cracks. Breton’s call for transparency is essential for holding these platforms accountable.
**Editor:** Breton emphasized that TikTok must protect its young users from violent content. How significant is the presence of such content on platforms typically favored by younger audiences?
**Dr. Carter:** The presence of violent or disturbing content on platforms like TikTok can have long-lasting effects on young minds. Seeing graphic imagery can desensitize children or, conversely, induce fear and anxiety. It’s crucial for TikTok to implement robust safeguards to filter out harmful material effectively. The public and regulatory insistence on these protections reflects a growing awareness of the psychological impacts of media consumption.
**Editor:** TikTok has highlighted its efforts to combat misinformation through various initiatives, including transparency reports. Do you think these measures are sufficient?
**Dr. Carter:** While initiatives like transparency reports are a step in the right direction, they can’t be the only solution. TikTok needs to be more proactive in curbing misinformation and ensuring that users, especially minors, are shielded from potentially harmful content. Regular updates and transparency are important, but they need to be backed by effective action.
**Editor:** Given the rapid spread of misinformation on social media, what measures would you recommend TikTok and similar platforms take to improve user safety?
**Dr. Carter:** First, they need to enhance content moderation, utilizing both human oversight and AI tools to identify and remove harmful content swiftly. Secondly, educating users about misinformation—how to recognize it and report it—can empower the community. Lastly, collaborating with child safety organizations to develop best practices and protocols for protecting younger users is vital.
**Editor:** Thank you, Dr. Carter, for your insights on this pressing issue in social media safety. It’s clear that while TikTok has made efforts, much work remains to be done to ensure the safety and well-being of its young users.
**Dr. Carter:** Absolutely, and thank you for bringing attention to it! It’s an important conversation that needs ongoing dialog and action.
---
This interview underlines the critical discussions surrounding TikTok’s responsibility in safeguarding its young audience while addressing the increasing scrutiny from regulators like Thierry Breton.