EU warns TikTok over illegal content and misinformation

The warning, and the demand that the Chinese-owned TikTok provide details within 24 hours about how it is combating misinformation online, echo similar warnings that EU Internal Market Commissioner Thierry Breton issued this week to the social network X and to Facebook's parent company Meta.

“Given that your platform is widely used by children and teenagers, you have a special responsibility to protect them from violent content depicting hostage-taking and other gruesome videos that are reported to be widely distributed on your platform without adequate safeguards,” Breton wrote in a letter to TikTok chief executive Shou Zi Chew.

Breton stressed that TikTok and other major internet platforms must comply with the Digital Services Act (DSA), which gives the EU the power to fine platforms up to 6 percent of their global turnover if they fail to combat illegal content.

As with Elon Musk, the owner of X, and Mark Zuckerberg, the head of Meta, Breton told Shou Zi Chew that his request was urgent and required a response within 24 hours.

The full text of the letter was published on Breton's X account and on an account he created this week on Bluesky, a new social network that competes with X.

According to Bellingcat, an investigative journalism website that verifies, investigates and debunks online information, misinformation surrounding the Hamas attack on Israel last weekend has increased dramatically.

It documented examples of false or misleading videos being posted on X, TikTok and other platforms.

It also found several such videos on Telegram, which is not yet subject to the DSA. In February, however, the DSA will come into full force and apply to smaller platforms as well.


What recommendations does Dr. Emily Davis suggest for TikTok to enhance its content moderation and misinformation policies?

**Interview with Dr. Emily Davis, a Digital Media Scholar**

**Interviewer:** Welcome, Dr. Davis. Today we’re discussing the recent actions taken by Thierry Breton concerning misinformation on TikTok and other social media platforms. What’s your take on his demand that TikTok disclose how it combats misinformation, especially given its popularity among children and teenagers?

**Dr. Davis:** Thank you for having me. Breton’s letter to TikTok’s CEO is really significant, particularly because of the platform’s vast reach among younger users. It highlights a growing concern that social media companies must prioritize user safety, especially in light of the complex challenges posed by misinformation and violent content.

**Interviewer:** Absolutely. Breton emphasized the responsibility platforms have to protect young users from harmful content. Do you think TikTok is adequately equipped to handle these challenges?

**Dr. Davis:** TikTok has faced scrutiny for its moderation practices in the past, especially regarding the rapid spread of misinformation and alarming content such as violent videos. While the platform has implemented some measures, the effectiveness and enforcement of these safeguards remain questionable. This is further compounded by the platform’s algorithm, which can inadvertently promote sensational content because of user engagement metrics [[1](https://www.nytimes.com/2022/11/04/technology/tiktok-deepfakes-disinformation.html)].

**Interviewer:** In light of these challenges, what specific steps do you think TikTok should take to improve its content moderation and misinformation policies?

**Dr. Davis:** Firstly, transparency is crucial. TikTok should clearly outline its content moderation policies and the technologies it employs to identify and combat misinformation. Moreover, investing in advanced AI tools that can detect harmful content more effectively could make a significant difference. Collaborating with independent fact-checking organizations would also enhance credibility and trust in its efforts.

**Interviewer:** What implications does this situation have for the other social media platforms, like Meta and X, that Breton has also warned?

**Dr. Davis:** Breton’s correspondence signifies a wider call for accountability across all social media platforms. Each platform has a responsibility to protect its users from harmful content. As regulators increasingly scrutinize social media, companies that fail to address these issues may face stricter regulations or even penalties. Ultimately, it underscores the necessity for a cultural shift within these companies towards prioritizing user safety over engagement metrics.

**Interviewer:** Thank you, Dr. Davis, for your insights. It’s clear that the conversation surrounding misinformation and user safety will continue to evolve in the coming months.

**Dr. Davis:** Thank you for having me. It’s an essential discussion that affects us all.
