EU warns TikTok over illegal content and misinformation

The warning, along with a demand that the Chinese-owned TikTok explain within 24 hours how it is combating misinformation online, echoes similar warnings that Thierry Breton issued this week to the social network X and to Facebook parent company Meta.

“Given that your platform is widely used by children and teenagers, you have a special responsibility to protect them from violent content depicting hostage-taking and other gruesome videos that are reported to be widely distributed on your platform without adequate safeguards,” Breton wrote in a letter to TikTok chief executive Shou Zi Chew.

Breton stressed that TikTok and other major internet platforms must comply with the Digital Services Act (DSA), which gives the EU the power to fine platforms up to 6 percent of their global turnover if they fail to combat illegal content.

As with Elon Musk, the owner of X, and Meta chief Mark Zuckerberg, Breton told Shou Zi Chew that his request was urgent and required a response within the next 24 hours.

The full text of the letter was published on Breton’s X account and on an account he created this week on Bluesky, a new social network that competes with X.

According to Bellingcat, an investigative journalism website that verifies, investigates and debunks information, misinformation surrounding the Hamas attack on Israel last weekend has increased dramatically.

It documented examples of false or misleading videos being posted on X, TikTok and other platforms.

It also found several such videos on Telegram, which is not yet subject to the DSA. In February, however, the DSA will come into full force and apply to smaller platforms as well.


What strategies can TikTok implement to effectively combat misinformation on its platform, especially for younger users?

**Interview with Dr. Emily Chen, Media Ethics Expert**

**Interviewer:** Thank you for joining us today, Dr. Chen. Recently, European Commissioner Thierry Breton issued a warning to TikTok regarding its handling of misinformation and violent content, particularly given its young audience. What are your thoughts on this situation?

**Dr. Chen:** Thank you for having me. It’s a significant issue that highlights the responsibilities social media platforms have, especially when their primary users are children and teenagers. The demand for TikTok to disclose its strategies for combating misinformation is crucial, especially in light of growing concerns about the accuracy of information shared on the platform and the potential exposure of young users to disturbing content.

**Interviewer:** Breton described the content circulating on TikTok as “gruesome,” referencing incidents like hostage-taking. How serious is the impact of such content on impressionable users?

**Dr. Chen:** The impact can be profound. Young users are particularly susceptible to harmful content, which can normalize violence or create a distorted perception of reality. Research has shown that exposure to violent imagery can lead to heightened anxiety, desensitization, and even behavioral issues. Platforms have a duty to implement safeguards that not only filter misinformation but also protect users from distressing content.

**Interviewer:** Considering TikTok’s reported issues with misinformation, what steps should it take to address these concerns effectively?

**Dr. Chen:** First, TikTok needs to enhance its content moderation policies to better identify and mitigate both misinformation and violent content. Greater transparency in its algorithms could help users understand how content is selected and displayed. Additionally, collaboration with fact-checking organizations could play a crucial role in combating misinformation effectively.

**Interviewer:** What additional responsibilities do social media platforms like TikTok have toward their younger audience?

**Dr. Chen:** Beyond content moderation, these platforms must prioritize educational initiatives. They should provide resources that help users critically evaluate the information they encounter. Implementing features that allow parents to monitor and control what their children are exposed to can also help create a safer online environment. Ultimately, it’s about creating a culture of responsibility and awareness among users and the platform itself.

**Interviewer:** Lastly, how do you see the regulatory landscape evolving around platforms like TikTok in light of these challenges?

**Dr. Chen:** We’re likely to see increased scrutiny from regulators. As European and other global entities push for stricter controls, platforms will need to adapt quickly. Compliance with new regulations regarding data protection and user safety will be essential. The ongoing conversation around misinformation and content safety is shaping an era where accountability for online platforms is not just expected but demanded.

**Interviewer:** Thank you, Dr. Chen, for your insights into this critical issue.

**Dr. Chen:** Thank you for having me. It’s important that we continue discussing these challenges as technology and social media evolve.
