EU warns TikTok over illegal content and misinformation

The warning, along with a demand that the Chinese-owned TikTok provide details within 24 hours about how it is combating misinformation online, echoes similar warnings that Thierry Breton issued this week to the social network X and to Facebook parent company Meta.

“Given that your platform is widely used by children and teenagers, you have a special responsibility to protect them from violent content depicting hostage-taking and other gruesome videos that are reported to be widely distributed on your platform without adequate safeguards,” Breton wrote in a letter to TikTok chief executive Shou Zi Chew.

Breton stressed that TikTok and other major internet platforms must comply with the Digital Services Act (DSA), which gives the EU the power to fine platforms up to 6 percent of their global turnover if they fail to combat illegal content.

As with Elon Musk, the owner of X, and Meta chief Mark Zuckerberg, Breton told Shou Zi Chew that his request was urgent and required a response within 24 hours.

The full text of the letter was published on Breton’s X account and on an account he created this week on Bluesky, a new social network competing with X.

According to Bellingcat, an investigative journalism website that verifies, investigates and debunks information, misinformation surrounding the Hamas attack on Israel last weekend has increased dramatically.

It documented examples of false or misleading videos being posted on X, TikTok and other platforms.

Bellingcat also found several such videos on Telegram, which is not yet subject to the DSA. In February, however, the DSA will come into full force and apply to smaller platforms as well.


What strategies can TikTok implement to effectively combat misinformation and protect young users from harmful content?

**Interview with Digital Safety Expert: Addressing TikTok’s Misinformation Challenges**

**Editor:** Welcome, Dr. Emily Carter, a digital safety expert with extensive experience in social media regulation and user protection. We’re here to discuss the recent demands aimed at TikTok regarding misinformation and content safety, particularly following the letter issued by Thierry Breton.

**Dr. Carter:** Thank you for having me. I’m glad to be part of this important conversation.

**Editor:** Breton has called on TikTok to provide details within 24 hours about how it is addressing misinformation on its platform. What are your thoughts on the urgency of this demand?

**Dr. Carter:** The urgency is warranted, especially considering TikTok’s popularity among younger users. Misinformation can have serious implications for their safety and mental health. When platforms like TikTok are used by children and teenagers, accountability becomes crucial. Breton’s demand highlights the need for transparency in how these platforms manage harmful content [[1](https://www.tiktok.com/safety/en/harmful-misinformation-guide?sc_version=2024)].

**Editor:** You mentioned the safety of young users. What particular risks do platforms like TikTok face with respect to violent content and misinformation?

**Dr. Carter:** The primary risks include exposure to disturbing content, such as violent or graphic videos, which can be traumatizing for young viewers. Additionally, misinformation related to health, safety, and social issues can lead to harmful behaviors or beliefs. The challenge is that algorithms can inadvertently promote such content when they prioritize engagement over safety [[1](https://www.tiktok.com/safety/en/harmful-misinformation-guide?sc_version=2024)].

**Editor:** TikTok has introduced measures to counter misinformation, as highlighted in its harmful-misinformation guide. How effective do you think these measures are?

**Dr. Carter:** TikTok’s efforts to counter harmful misinformation, such as connecting users to verified facts and warning them about inauthentic content, are a step in the right direction. However, their effectiveness largely depends on implementation. The platform needs robust systems to monitor and control the spread of harmful content proactively, not just reactively [[1](https://www.tiktok.com/safety/en/harmful-misinformation-guide?sc_version=2024)].

**Editor:** In your opinion, what should TikTok and similar platforms prioritize to enhance user safety and combat misinformation?

**Dr. Carter:** First, they should invest heavily in improving their content moderation systems, possibly incorporating better AI tools to detect and flag inappropriate content. Second, they must ensure that educational resources on how to recognise misinformation are easily accessible to users. Lastly, robust communication with regulatory bodies and the public about their findings and improvements is critical to rebuilding trust [[1](https://www.tiktok.com/safety/en/harmful-misinformation-guide?sc_version=2024)].

**Editor:** Thank you, Dr. Carter, for sharing your insights on this urgent issue. It’s clear that while there are measures in place, much work remains to protect users, especially the younger audience, from the pitfalls of misinformation and harmful content.

**Dr. Carter:** Thank you for having me. It’s vital that we continue to advocate for safer digital spaces for everyone, particularly vulnerable groups.
