EU warns TikTok over illegal content and misinformation

The warning, together with a demand that Chinese-owned TikTok provide details within 24 hours on how it combats misinformation online, echoes similar warnings that Thierry Breton issued this week to the social network X and to Facebook's parent company, Meta.

“Given that your platform is widely used by children and teenagers, you have a special responsibility to protect them from violent content depicting hostage-taking and other gruesome videos that are reported to be widely distributed on your platform without adequate safeguards,” Breton wrote in a letter to TikTok CEO Shou Zi Chew.

Breton stressed that TikTok and other major internet platforms must comply with the Digital Services Act (DSA), which gives the EU the power to fine platforms up to 6 percent of their global annual turnover if they fail to tackle illegal content.

As with Elon Musk, the owner of X, and Meta chief Mark Zuckerberg, Breton told Shou Zi Chew that his request was urgent and required a response within the next 24 hours.

The full text of the letter was published on Breton's X account and on an account he created this week on Bluesky, a new social network that competes with X.

According to the investigative journalism website Bellingcat, which verifies, investigates and debunks online information, misinformation surrounding last weekend's Hamas attack on Israel has increased dramatically.

It documented examples of false or misleading videos being posted on X, TikTok and other platforms.

Bellingcat also found several such videos on Telegram, which is not yet subject to the DSA. In February, however, the DSA will come into full force and will also apply to smaller platforms.


**What are the psychological effects of violent content on young users of social media platforms like TikTok?**

**Interview with Dr. Emily Carter, Social Media Analyst**

**Interviewer:** Thank you for joining us today, Dr. Carter. Recently, EU Commissioner Thierry Breton issued a stern warning to TikTok, demanding details on how the platform addresses misinformation and violent content, particularly given its young user base. What are your thoughts on this demand?

**Dr. Emily Carter:** Thank you for having me. I believe this warning is crucial, especially considering that platforms like TikTok are immensely popular among children and teenagers. The responsibility to protect young users from harmful content is paramount, and regulatory authorities are realizing the urgency of implementing robust measures to address these issues.

**Interviewer:** Breton's letter highlighted the distribution of violent content, including gruesome videos. Why is that particularly concerning?

**Dr. Emily Carter:** Violent content can have severe psychological impacts on young users. Exposure to such material can desensitize children and teens to violence, foster unhealthy coping mechanisms, and even contribute to anxiety and fear. Moreover, these platforms often serve as primary news sources for younger demographics, making it critical that they are safeguarded against misinformation and harmful narratives.

**Interviewer:** Speaking of misinformation, research has examined TikTok's citizen-led approach to debunking false information. How effective do you think this is in combating online misinformation?

**Dr. Emily Carter:** The citizen-led debunking method can be quite effective, especially because it involves users directly. A study found that when individuals watched a misinformation video followed by a correction, their perception of its credibility shifted positively, benefiting the overall understanding of the issue [[1](https://misinforeview.hks.harvard.edu/article/how-effective-are-tiktok-misinformation-debunking-videos/)]. However, this approach needs to be supplemented by more systematic efforts from TikTok itself, such as algorithm adjustments and more stringent content moderation policies.

**Interviewer:** What steps should TikTok take in response to Breton's warning to enhance user safety?

**Dr. Emily Carter:** TikTok must implement comprehensive content moderation strategies, including better detection algorithms for violent and misleading content. Additionally, enhancing transparency by sharing data on how misinformation is handled can help build trust. Engaging with fact-checkers and providing clearer guidelines for users on how to report harmful content would be vital as well.

**Interviewer:** Thank you, Dr. Carter, for your insights into this pressing issue. It's clear that as social media continues to evolve, so must the measures to protect its users, especially the youngest ones.

**Dr. Emily Carter:** Thank you for having me. It’s essential we continue to prioritize the safety and well-being of young users online.
