Facebook Moderators in Kenya Face Severe Mental Health Impacts
More than 140 Facebook content moderators in Kenya have been diagnosed with PTSD and other mental health conditions, according to medical reports filed in court. The diagnoses were made by Dr. Ian Kanyanya, the head of mental health services at Kenyatta National Hospital in Nairobi, and are part of an ongoing lawsuit against Facebook's parent company, Meta, and Samasource Kenya, an outsourcing firm contracted by Meta to review content.

Content moderators play a critical role by screening social media platforms for disturbing and harmful content. Often employed by third-party firms in developing countries, these workers are routinely exposed to graphic and disturbing material, raising concerns about the long-term impact on their mental well-being.

"The moderators he assessed encountered 'extremely graphic content on a daily basis which included videos of gruesome murders, self-harm, suicides, attempted suicides, sexual violence, explicit sexual content, child physical and sexual abuse, horrific violent actions just to name a few,'" stated Nzili and Sumbi Associates, the legal firm representing the moderators. Dr. Kanyanya found that 81% of the 144 moderators who volunteered for psychological assessments suffered from "severe" PTSD.

While Meta declined to comment on the medical reports due to the ongoing litigation, a spokesperson emphasized that the company takes the well-being of moderators seriously. They stated that Meta's contracts with third-party firms include expectations regarding counseling, training, and fair pay. The spokesperson also mentioned that moderators can customize their "content review tool" to blur graphic content or display it in black and white. Samasource, now known as Sama, did not respond to requests for comment.

*Lawyer Mercy Mutemi, from Nzili and Sumbi Associates, alongside fellow counsel during a pre-trial consultation with Meta's legal counsel and a judge on April 12, 2023. - Tony Karumba/AFP/Getty Images*
## The Hidden Toll: Inside Kenya’s Facebook Moderation Crisis
**[INT. ARCHYDE STUDIO – DAY]**
**HOST:** Welcome back to Archyde Live. Today, we delve into a deeply troubling issue – the mental health crisis facing Facebook content moderators in Kenya. Over 140 moderators have been diagnosed with PTSD and other mental health conditions, according to a shocking new report. Joining us to shed light on this situation is [Alex Reed NAME], a leading expert on digital ethics and online worker rights. Thank you for being here today.
**Alex Reed:** Thanks for having me.
**HOST:** Let’s start with the basics. Can you tell our viewers what these moderators actually do and why their work is so crucial?
**Alex Reed:** Absolutely. These men and women are on the front lines of the internet. They sift through massive amounts of content on Facebook, identifying and removing harmful material – everything from hate speech and violent threats to child exploitation and graphic violence. It's a vital job that keeps our online spaces safer, but it comes at a tremendous personal cost.
**HOST:** The report mentions some truly disturbing examples of the content these moderators are exposed to daily. Can you elaborate on the kind of psychological impact this can have?
**Alex Reed:** Imagine witnessing horrific acts of violence, sexual abuse, and suicide day in and day out. The constant exposure to such trauma can lead to severe PTSD, anxiety, depression, and even suicidal thoughts. We're seeing a devastating pattern of mental health issues directly linked to this work.
**HOST:** Meta, Facebook’s parent company, says they take the well-being of moderators seriously. They claim to have policies in place for counseling, training, and fair pay. Do you believe these measures are sufficient?
**Alex Reed:** While I applaud any efforts to improve conditions for moderators, the reality is that these measures often fall short. Many moderators are employed by third-party companies, sometimes in precarious work settings with limited access to support. We need much stronger regulations and oversight to ensure these vulnerable workers are adequately protected.
**HOST:** This lawsuit against Meta and Sama, the outsourcing firm, represents a significant step towards accountability. What kind of justice are these moderators seeking?
**Alex Reed:** They are seeking recognition, compensation for their suffering, and systemic changes to prevent future harm. They want Meta to take responsibility for the mental health consequences of its content moderation policies and to create a safer, more sustainable working environment for those who keep our online world safe.
**HOST:** A powerful message. Thank you, [Alex Reed NAME], for sharing your expertise and insights on this crucial issue.
**[END INTERVIEW]**