A group of people formerly employed as “content moderators” for the social media giant Facebook is taking legal action against their former employers and Facebook/Meta, alleging significant psychological trauma caused by prolonged exposure to disturbing and violent material.
The lawsuits have been filed in the High Court in Dublin because Meta, Facebook’s parent company, has its European headquarters there, making it the appropriate venue for the proceedings.
Among the plaintiffs is a man who worked as a content moderator in Essen for the German company CCC Essen Digital GmbH from 2018 to 2019. Meta was not a party to his employment contract; it outsourced the moderation work to CCC.
The High Court is treating this case, together with two similar claims brought by plaintiffs who were not employed in Ireland, as lead cases for deciding whether it has jurisdiction to hear these personal injury claims.
The three cases were designated as lead cases not to bind the other lawsuits directly but to guide the resolution of the remaining claims before the court.
Each of the plaintiffs worked as a content moderator, a role now referred to as “subject-matter expert,” tasked with ensuring that user-generated content complied with Meta/Facebook’s stringent implementation standards.
In a recent ruling on whether CCC should produce certain documents needed for the court to decide the jurisdictional question, Mr. Justice Conleth Bradley refused the plaintiff’s discovery application.
Addressing the first plaintiff’s case, the judge noted that the lawsuit names both the employing company (CCC) and Meta, which is alleged to be the “anchor defendant” for the purposes of the EU rules governing jurisdiction in certain civil cases.
It is contended that Meta/Facebook, through the extensive and comprehensive control it exerted over the work systems operated by CCC Essen, is liable for alleged negligence linked to the psychological injuries the plaintiff suffered during his employment.
The plaintiff asked the court to order CCC to produce documents, including the agreements governing its relationship with Meta and the policy and procedural documents relating to content moderation. CCC opposed the application.
The judge said that, having regard to the legal principles governing jurisdiction, the documents sought by the plaintiff were not necessary at this stage for a fair determination of the jurisdictional question.
The full hearing concerning the jurisdictional matter has yet to be scheduled.
**Interview with Emily Turner, Legal Analyst**
**Interviewer:** Thank you for joining us today, Emily. We’re discussing a significant lawsuit where former content moderators of Facebook are claiming psychological trauma due to their exposure to disturbing material. What can you tell us about the background of these claims?
**Emily Turner:** Thanks for having me. These lawsuits reflect a growing concern about the mental health toll on content moderators. These individuals review a massive volume of user-generated content, which often includes graphic violence and other disturbing imagery. The former moderators’ claims suggest that companies like Meta failed to adequately warn them about the mental health risks involved in this type of work.
**Interviewer:** You mentioned that some of these claims are being filed in Dublin. Why is that significant?
**Emily Turner:** Dublin is significant because it is the headquarters for Meta’s European operations. The jurisdiction issue is complex: many plaintiffs worked outside Ireland, but because Meta is based there, the High Court may have the authority to hear these cases. The situation raises questions about how companies handle employee welfare, especially when operations are outsourced.
**Interviewer:** One of the plaintiffs, a former employee of CCC Essen Digital GmbH, claims that their contract didn’t involve Meta directly. How could this impact the case?
**Emily Turner:** That’s a critical point. While Meta outsourced moderation to CCC, the plaintiffs argue that Meta still holds a responsibility for the working conditions and the potential harm experienced by these moderators. The court will need to assess the extent to which Meta’s role as a parent company extends to the welfare of employees of its contractors.
**Interviewer:** What are the broader implications of these lawsuits for content moderation practices in the tech industry?
**Emily Turner:** If the court finds in favor of the plaintiffs, it could set a significant precedent. It might compel tech companies to provide better support for their moderators, including mental health resources and clearer warnings about the content they will review. That could ultimately change not only company policies but how content moderation is conducted across the industry.
**Interviewer:** Thank you, Emily. It’s certainly a complex issue that highlights the often-overlooked challenges faced by those behind the scenes in tech companies. We’ll keep following this story closely.
**Emily Turner:** Absolutely, thanks for having me.