Facebook Content Moderators in Kenya Suffer from PTSD, Lawsuit Alleges

## Facebook Moderators in Kenya Face Severe Mental Health Impacts

More than 140 Facebook content moderators in Kenya have been diagnosed with PTSD and other mental health conditions, according to medical reports filed in court. The diagnoses were made by Dr. Ian Kanyanya, the head of mental health services at Kenyatta National Hospital in Nairobi, and are part of an ongoing lawsuit against Facebook’s parent company, Meta, and Samasource Kenya, an outsourcing firm contracted by Meta to review content.

Content moderators play a critical role by screening social media platforms for disturbing and harmful content. Often employed by third-party firms in developing countries, these workers are routinely exposed to graphic and disturbing material, raising concerns about the long-term impact on their mental well-being.

The moderators Dr. Kanyanya assessed encountered “extremely graphic content on a daily basis which included videos of gruesome murders, self-harm, suicides, attempted suicides, sexual violence, explicit sexual content, child physical and sexual abuse, horrific violent actions just to name a few,” according to Nzili and Sumbi Associates, the legal firm representing the moderators. Dr. Kanyanya found that 81% of the 144 moderators who volunteered for psychological assessments suffered from “severe” PTSD.

While Meta declined to comment on the medical reports due to the ongoing litigation, a spokesperson emphasized that the company takes the well-being of moderators seriously. They stated that Meta’s contracts with third-party firms include expectations regarding counseling, training, and fair pay. The spokesperson also mentioned that moderators can customize their “content review tool” to blur graphic content or display it in black and white. Samasource, now known as Sama, did not respond to requests for comment.

Lawyer Mercy Mutemi, from Nzili and Sumbi Associates, alongside fellow counsel during a pre-trial consultation with Meta’s legal counsel and a judge on April 12, 2023. - Tony Karumba/AFP/Getty Images

This case highlights the urgent need for greater transparency and accountability in the tech industry regarding the mental health of content moderators. As social media platforms continue to grow and expand, it is crucial to ensure that the individuals tasked with keeping these platforms safe are adequately supported and protected.

Content moderation, the often-thankless task of filtering harmful content from online platforms, has come under scrutiny once again. A lawsuit filed in Kenya alleges that Facebook’s content moderation practices have inflicted severe and lasting psychological trauma on moderators. Led by the UK non-profit Foxglove, the lawsuit cites the experiences of former moderators who worked at Samasource Kenya, a company contracted by Facebook to handle content moderation. According to Foxglove, all 260 moderators employed at Samasource Kenya’s Nairobi hub were made redundant in 2022, allegedly as punishment for raising concerns about their pay and working conditions. This legal action follows an earlier case launched in 2022 by a former Samasource Kenya employee who was reportedly dismissed for organizing protests.

The lawsuit draws on medical records and testimonies from former moderators. One individual described waking up from nightmares caused by the graphic content they were exposed to, leading to breakdowns, flashbacks, and paranoia. Another former moderator developed trypophobia, a fear of clustered patterns, after encountering a disturbing image of maggots.

Martha Dark, co-executive director of Foxglove, states, “Moderating Facebook is dangerous, even deadly work that inflicts lifelong PTSD on almost everyone who moderates it. In Kenya, it traumatized 100% of hundreds of former moderators tested for PTSD… Facebook is responsible for the potentially lifelong trauma of hundreds of people, usually young people who have only just finished their education.” Dark contends that if such trauma were inflicted in any other industry, those responsible would face serious consequences.

This legal challenge is not an isolated incident. In 2021 and 2022, separate lawsuits were filed against TikTok by content moderators who alleged psychological harm caused by their work.
## The Hidden Toll: Inside Kenya’s Facebook Moderation Crisis



**[INT. ARCHYDE STUDIO – DAY]**



**HOST:** Welcome back to Archyde Live. Today, we delve into a deeply troubling issue – the mental health crisis facing Facebook content moderators in Kenya. Over 140 moderators have been diagnosed with PTSD and other mental health conditions, according to a shocking new report. Joining us to shed light on this situation is [Alex Reed NAME], a leading expert on digital ethics and online worker rights. Thank you for being here today.



**Alex Reed:** Thanks for having me.



**HOST:** Let’s start with the basics. Can you tell our viewers what these moderators actually do and why their work is so crucial?



**Alex Reed:** Absolutely. These men and women are on the front lines of the internet. They sift through massive amounts of content on Facebook, identifying and removing harmful material – everything from hate speech and violent threats to child exploitation and graphic violence. It’s a vital job that keeps our online spaces safer, but it comes at a tremendous personal cost.



**HOST:** The report mentions some truly disturbing examples of the content these moderators are exposed to daily. Can you elaborate on the kind of psychological impact this can have?



**Alex Reed:** Imagine witnessing horrific acts of violence, sexual abuse, and suicide day in and day out. The constant exposure to such trauma can lead to severe PTSD, anxiety, depression, and even suicidal thoughts. We’re seeing a devastating pattern of mental health issues directly linked to this work.



**HOST:** Meta, Facebook’s parent company, says they take the well-being of moderators seriously. They claim to have policies in place for counseling, training, and fair pay. Do you believe these measures are sufficient?



**Alex Reed:** While I applaud any efforts to improve conditions for moderators, the reality is that these measures often fall short. Many moderators are employed by third-party companies, sometimes in precarious work settings with limited access to support. We need much stronger regulations and oversight to ensure these vulnerable workers are adequately protected.



**HOST:** This lawsuit against Meta and Sama, the outsourcing firm, represents a significant step towards accountability. What kind of justice are these moderators seeking?



**Alex Reed:** They are seeking recognition, compensation for their suffering, and systemic changes to prevent future harm. They want Meta to take responsibility for the mental health consequences of its content moderation policies and to create a safer, more sustainable working environment for those who keep our online world safe.





**HOST:** A powerful message. Thank you, [Alex Reed NAME], for sharing your expertise and insights on this crucial issue.



**[END INTERVIEW]**
