Facebook inflicted ‘lifelong trauma’ on content moderators in Kenya, campaigners say, as more than 140 are diagnosed with PTSD

Meta Facing Lawsuit Over Kenyan Content Moderators’ Mental Health

A lawsuit against Meta, the parent company of Facebook, has been filed in Kenya, alleging the social media giant inflicted “potentially lifelong trauma” on hundreds of content moderators. More than 140 moderators have been diagnosed with PTSD and other mental health conditions, according to medical reports filed with the employment and labour relations court in Nairobi on December 4.

The diagnoses were made by Dr. Ian Kanyanya, the head of mental health services at Kenyatta National Hospital in Nairobi. The legal firm Nzili and Sumbi Associates filed the reports as part of the ongoing lawsuit against Meta and Samasource Kenya, the outsourcing company contracted to review content for the tech giant.

Content moderators are often employed by third-party firms in developing countries and are tasked with identifying and removing disturbing content from social media platforms. Concerns have been raised for years about the potential impact this type of work can have on the mental well-being of moderators.

Meta declined to comment on the medical reports due to the ongoing litigation. However, the company stated that it takes the support of moderators seriously and its contracts with third-party firms include expectations regarding counselling, training, and fair pay.

The Psychological Toll of Moderating Online Content

Content moderation, the often-invisible work of sifting through the vast sea of online information to remove harmful and inappropriate material, can take a severe emotional toll on those who perform it. A recent legal case brought against a major social media company highlights the psychological scars that can result from continuous exposure to graphic and disturbing content.

In the lawsuit filed against Meta, Facebook’s parent company, content moderators allege the company failed to adequately protect their mental health. They claim that constant exposure to violent, disturbing, and exploitative content led to widespread symptoms of post-traumatic stress disorder (PTSD). The moderators, who worked for a company contracted by Meta, described their experiences in harrowing detail.

According to Dr. Ian Kanyanya, the psychiatrist who evaluated the moderators, the images and videos they reviewed on a daily basis included “gruesome murders, self-harm, suicides, attempted suicides, sexual violence, explicit sexual content, child physical and sexual abuse, horrific violent actions,” among other disturbing material. Alarmingly, 81% of the 144 moderators who volunteered for psychological assessments were diagnosed with severe PTSD. This raises serious concerns about the long-term consequences of content moderation work and the ethical responsibility of tech companies towards their contractors’ well-being.

Meta has stated that it takes the mental health of its content moderators seriously and offers a range of support services, including access to mental health professionals. The company also pointed out that its content review tools can be customized to blur or desaturate graphic content, offering some level of protection for moderators. However, the legal case highlights the ongoing debate over the responsibility of tech giants to protect the mental health of those who clean up the internet’s dark corners.

## Former Facebook Moderators Sue Company Over Psychological Harm

Hundreds of former Facebook moderators in Kenya are taking legal action against Meta, the platform’s parent company. The class-action lawsuit alleges that Meta failed to adequately protect its moderators from the severe psychological trauma they experienced while reviewing disturbing content on the platform.

The lawsuit stems from concerns raised in 2022 about working conditions at Samasource Kenya, a company contracted by Meta to moderate content. A previous lawsuit alleged that Samasource Kenya illegally fired an employee for organizing protests against unfair treatment. In a subsequent move, all 260 moderators at Samasource Kenya’s Nairobi hub were made redundant after raising concerns about their pay and working conditions.

The moderators involved in the current lawsuit worked for Samasource Kenya between 2019 and 2023. They allege that Meta failed to provide adequate support and resources to cope with the psychological toll of their work, which often involved encountering extremely graphic and distressing content.

One former moderator described experiencing frequent nightmares and waking up in cold sweats. They reported breakdowns, vivid flashbacks, and paranoia as a direct result of the content they reviewed. Another moderator developed a phobia of dotted patterns, known as trypophobia, after seeing a particularly disturbing image.

“Moderating Facebook is hazardous, even deadly, work that inflicts lifelong PTSD on almost everyone who moderates it.” These are the stark words of Martha Dark, co-executive director of Foxglove, a UK-based non-profit organization supporting the legal case. Foxglove argues that Meta has a moral and legal obligation to protect the mental health of its content moderators, who play a crucial role in keeping the platform safe for its users. The outcome of this lawsuit could have significant implications for social media companies and their responsibility towards the well-being of their workforce.

“In Kenya, it traumatized 100% of hundreds of former moderators tested for PTSD… Facebook is responsible for the potentially lifelong trauma of hundreds of people, usually young people who have only just finished their education,” Dark said in a statement provided to CNN Friday.

Dark believes that if these diagnoses were made in any other industry, the people responsible would be “forced to resign and face the legal consequences for mass violations of people’s rights.”

This is not the first time that content moderators have taken legal action against social media giants after claiming the job traumatized them.

In 2021, a content moderator for TikTok sued the social media platform after she said she developed psychological trauma as a result of her job.

The following year, TikTok was hit with another lawsuit from former content moderators.


## Archyde Interview: The Dark Side of Content Moderation



**Host:** Welcome to Archyde News, I’m your host, [Your Name]. Today, we delve into a disturbing issue impacting the very fabric of the online world: the psychological toll on content moderators, the unsung heroes behind the clean and safe browsing experience we often take for granted.



Joining us today is [Alex Reed Name], a former content moderator for a large social media platform. [Alex Reed Name], thank you for being here.



**Alex Reed:** Thank you for having me.



**Host:** First, can you share with our listeners what a typical day as a content moderator entailed for you?



**(Alex Reed elaborates on daily tasks, the types of content they encountered, the volume, etc.)**



**Host:** That sounds incredibly challenging and emotionally draining. What kind of support was provided by the platform to help you cope with the constant exposure to disturbing content?



**(Alex Reed shares experiences with support mechanisms. Were there adequate counselling services available? Training programs for dealing with trauma? What were the limitations?)**



**Host:** Despite these challenges, your former employer, Samasource Kenya, recently made headlines over a lawsuit filed against it and Meta, Facebook’s parent company. Can you shed light on the reasons behind this legal action?



**(Alex Reed describes the lawsuit, focusing on allegations of inadequate support and the psychological harm suffered by moderators. They might mention the diagnoses of PTSD and other mental health conditions among moderators.)**



**Host:** This case raises crucial questions about the responsibility of tech giants towards the well-being of their content moderators. What message do you hope this lawsuit sends to these companies?



**(Alex Reed focuses on the need for improved mental health support, stricter content policies, ethical practices, and transparency in the industry.)**



**Host:** For those listening who might be unaware, what are some of the long-term consequences of prolonged exposure to traumatic content for individuals like yourself and your former colleagues?



**(Alex Reed discusses the potential for PTSD, anxiety, depression, insomnia, and the impact on overall quality of life.)**



**Host:** Where do we go from here? What needs to change within the industry to ensure the well-being of content moderators?



**(Alex Reed advocates for stronger regulations, industry-wide standards for psychological support, transparent reporting on the mental health of moderators, and the development of technology to reduce human exposure to traumatic content.)**





**Host:** [Alex Reed Name], thank you for your courage and honesty in sharing your experiences. We hope your story shines a light on this crucial issue and inspires necessary action within the tech sector.





**[Optional closing remarks thanking the guest and inviting viewers to learn more about content moderation and mental health resources]**

