EU warns TikTok over illegal content and misinformation

The warning, together with a demand that Chinese-owned TikTok provide details within 24 hours about how it is combating misinformation online, echoes similar warnings that Thierry Breton issued this week to the social network X and Facebook parent company Meta.

“Given that your platform is widely used by children and teenagers, you have a special responsibility to protect them from violent content depicting hostage-taking and other gruesome videos that are reported to be widely distributed on your platform without adequate safeguards,” Breton wrote in a letter to TikTok CEO Shou Zi Chew.

Breton stressed that TikTok and other major internet platforms must comply with the Digital Services Act (DSA), which gives the EU the power to fine platforms up to 6 percent of their global turnover if they fail to combat illegal content.

As with Elon Musk, the owner of X, and Mark Zuckerberg, the head of Meta, Breton told Shou Zi Chew that his request was urgent and required a response within 24 hours.

The full text of the letter was published on Breton’s X account and on an account he created this week on Bluesky, a new social network that competes with X.

According to Bellingcat, an investigative journalism website that verifies, investigates and debunks information, misinformation surrounding the Hamas attack on Israel last weekend has increased dramatically.

It documented examples of false or misleading videos being posted on X, TikTok and other platforms.

Bellingcat also found several such videos on Telegram, which is not yet subject to the DSA. In February, however, the DSA will come into full force and apply to smaller platforms as well.


What are the main challenges TikTok faces in combating misinformation and protecting user safety?

**Interview with Dr. Emily Chen, Digital Media Analyst**

**Interviewer**: Thank you for joining us today, Dr. Chen. Recently, EU Commissioner Thierry Breton issued a warning to TikTok regarding its handling of misinformation and violent content. What are your thoughts on this situation?

**Dr. Chen**: Thank you for having me. It’s a pressing issue. Breton’s letter to TikTok’s CEO, Shou Zi Chew, highlights a crucial point about the platform’s responsibility, especially considering its significant user base of children and teenagers. The demand for specific details on how TikTok combats misinformation and protects users from violent content is not just a regulatory step; it reflects growing concerns about the safety of young users online.

**Interviewer**: Indeed, TikTok has been under scrutiny for various reasons. How does this warning relate to the broader issue of misinformation across social media platforms?

**Dr. Chen**: Misinformation is a widespread challenge for all social media platforms. TikTok, like Facebook and X (formerly Twitter), faces pressure to ensure that harmful content doesn’t spread unchecked. Breton’s warning parallels concerns raised about those platforms as well, emphasizing that they all have a unique responsibility to mitigate risks posed to vulnerable groups. The internet, with its vast reach, allows misinformation and violent content to proliferate more easily, making oversight more critical.

**Interviewer**: What steps do you think TikTok could take to effectively address these concerns?

**Dr. Chen**: TikTok has already started initiatives to tackle misinformation, as noted in their recent updates about their ongoing efforts [[1](https://newsroom.tiktok.com/en-us/an-update-on-our-work-to-counter-misinformation)]. However, they should enhance transparency by regularly publishing reports about their moderation strategies and the effectiveness of their measures. Collaborating with experts in child safety and digital literacy, along with implementing more robust filtering systems for violent content, could also improve their protective measures.

**Interviewer**: It sounds like there’s a path forward for TikTok in addressing these issues. Should the public be concerned about the efficacy of such platforms in protecting their users?

**Dr. Chen**: Absolutely. The public should remain vigilant. While regulatory pressures can lead to improvements, the responsibility ultimately falls on the platforms to prioritize user safety. Vigilant users and parents should educate themselves on the risks associated with these platforms and advocate for more stringent measures to protect young users. Engagement from the community can drive platforms to take more decisive actions.

**Interviewer**: Thank you, Dr. Chen, for your insights on this important topic. It’s clear that while TikTok and its peers face challenges, there are ways to improve the digital landscape for users.

**Dr. Chen**: Thank you for having me. It’s vital that these discussions continue as we navigate the complexities of digital media.
