EU warns TikTok over illegal content and misinformation

The warning, and the demand that the Chinese-owned TikTok provide details within 24 hours about how it is combating misinformation online, echoes similar warnings that Thierry Breton issued this week to the social network X and to Facebook's parent company Meta.

“Given that your platform is widely used by children and teenagers, you have a special responsibility to protect them from violent content depicting hostage-taking and other gruesome videos that are reportedly widely distributed on your platform without adequate safeguards,” Breton wrote in a letter to TikTok chief executive Shou Zi Chew.

Breton stressed that TikTok and other major internet platforms must comply with the Digital Services Act (DSA), which gives the EU the power to fine platforms up to 6 percent of their global turnover if they fail to combat illegal content.

As with Elon Musk, the owner of X, and Mark Zuckerberg, the head of Meta, Breton told Shou Zi Chew that his request was urgent and required a response within the next 24 hours.

The full text of the letter was published on Breton's X account and on an account he created this week on Bluesky, a new social network that competes with X.

According to Bellingcat, an investigative journalism website that verifies, investigates and debunks claims, misinformation surrounding the Hamas attack on Israel last weekend has increased dramatically.

It documented examples of false or misleading videos being posted on X, TikTok and other platforms.

It also found several such videos on Telegram, which is not yet subject to the DSA. In February, however, the DSA will come into full force and apply to smaller platforms as well.


How can TikTok enhance its content moderation to better protect young users from misinformation and harmful content?

**Interview with Dr. Emily Carter, Digital Media Expert**

**Interviewer:** Thank you for joining us today, Dr. Carter. Recently, European Commissioner Thierry Breton issued a stark warning to TikTok regarding its handling of misinformation, especially given the platform's significant youth user base. What are your initial thoughts on this situation?

**Dr. Carter:** Thank you for having me. This warning underscores a critical issue we're facing in the digital age. TikTok's rapid rise as a primary social media platform, especially among younger users, demands heightened accountability. Commissioner Breton's emphasis on the platform's responsibility reflects growing concerns that many platforms aren't doing enough to protect young users from harmful content, including misinformation and graphic violence.

**Interviewer:** Breton's letter highlighted the responsibility to protect children and teenagers from violent content. How do you see TikTok's role in this regard compared to other platforms like X or Meta?

**Dr. Carter:** Each platform has unique challenges and strengths when it comes to content moderation. TikTok's format, which relies heavily on rapid video consumption, can exacerbate the spread of manipulated or misleading content. As we've seen, platforms like X and Meta have also faced scrutiny. However, TikTok's algorithm prioritizes engagement, which can lead to the circulation of sensational but harmful content. It's crucial for TikTok to leverage its technology to prioritize user safety more effectively.

**Interviewer:** The commissioner has given TikTok a 24-hour ultimatum to detail its measures against misinformation. How effective do you think such pressure can be in driving change?

**Dr. Carter:** While a 24-hour ultimatum might sound reasonable, the effectiveness of such pressure largely depends on the follow-up actions. Regulatory pressure can lead to immediate changes, but for long-term impact there needs to be a sustained dialogue between regulators and platforms. It's about establishing a framework for accountability and transparency that evolves as technology changes.

**Interviewer:** Considering the growing concern over deepfakes and misinformation on social media, what steps can TikTok take to improve its safeguards?

**Dr. Carter:** TikTok can implement a multi-faceted approach. First, enhancing its content moderation technologies to swiftly detect and manage harmful content is vital. Engaging with fact-checkers, particularly for videos that feature newsworthy content, and increasing user education about misinformation are also crucial. Furthermore, greater transparency about its algorithms and how content is promoted could help build trust among its users.

**Interviewer:** How do you see the future of regulation impacting platforms like TikTok?

**Dr. Carter:** The future of regulation is likely to be more proactive and comprehensive. We are already witnessing a shift towards stricter guidelines on how platforms manage content. The challenge will be balancing regulation with innovation, ensuring platforms can still provide creative and engaging content while making user safety a priority. It's a complex but necessary evolution that we need to embrace.

**Interviewer:** Thank you, Dr. Carter, for sharing your insights on this critical issue.

**Dr. Carter:** Thank you for having me!
