Understanding the Impact of Online Abuse on Social Media Moderators

Unless you have moderated even a local community group, fielding debates over garbage collection procedures or the finer points of dog park etiquette, you may be oblivious to the overwhelming volume and scale of abuse directed at people online, particularly those who moderate social media.

When social media moderation and community management are core parts of your daily job, the toll on individuals and their families can be profound. This is particularly true for journalists, especially those early in their careers, who often face a barrage of harassment and abuse across online platforms.

Staff from culturally or linguistically diverse backgrounds can be even more reluctant than their peers to report such incidents, further complicating the landscape of online abuse.

Employers are increasingly concerned about the severe impact that moderating distressing content can have on the wellbeing of their staff. They have a duty to keep employees safe in the workplace, and that duty now extends to the online realm.

To understand the realities moderators face, the ABC conducted an extensive internal survey. The data revealed the extent of the challenges moderators encounter while working to ensure that audience members can engage safely in online discussions.

What did the ABC find?

In 2022, the ABC asked 111 staff members whose roles include online moderation to self-report how often they encountered potentially harmful content.

A key question was how much time moderation demands. Among those who moderate daily, 63% spent less than an hour and a half on it, while 88% moderated content for less than three hours each day.

Most staff reported exposure to harmful content on a weekly basis, underscoring the persistent nature of the problem.

An alarming 71% of moderators said their work is criticized and denigrated on a weekly basis, and 25% face such negativity daily.

In terms of the type of harmful content observed, half of the respondents encounter misogynistic content weekly, and more than half report exposure to racist content in the same timeframe.

Roughly one-third of moderators witness homophobic content each week, and 20% encounter abusive language weekly.

The concerning reality is that many moderators do not experience these negative encounters in isolation; they often face multiple types of abusive content simultaneously, compounding the psychological toll.

Notably, the survey did not define what constituted racist, homophobic, or misogynistic content, leaving interpretation largely to the moderators themselves.

A global issue

The mental health challenges faced by moderators around the world have been documented and acknowledged for several years.

Some moderators employed by Facebook to screen out the most egregious material have resorted to legal action against the company.

In one notable case in the United States, Facebook agreed to a US$52 million settlement to fund mental health treatment for more than 10,000 content moderators.

In Kenya, 184 moderators who worked for Facebook under contract have sued the company over inadequate working conditions and a lack of mental health support, seeking US$1.6 billion in compensation.

The ongoing case points to larger issues at play; other separate cases against Meta are also progressing in Kenya.

During the peak of the COVID pandemic, moderators in Australia described the emotional strain of dealing with misinformation and threats from social media users.

A 2023 report published by Australian Community Managers, the leading organization representing online moderators, found that 50% of respondents identified maintaining mental health as a significant challenge in their roles.

What’s being done?

While it faces its own set of challenges, the ABC is taking pioneering steps to protect its moderators from experiencing harm.

The organization has long run a variety of programs designed to shield staff from trauma exposure, including a comprehensive peer support program for journalists. The initiative has received support from the Dart Centre for Journalism and Trauma Asia Pacific.

As the level of abuse aimed at staff has intensified, the national broadcaster has appointed a dedicated Social Media Wellbeing Advisor, believed to be the first in the world to hold such a position. Nicolle White is tasked with managing workplace health and safety risks associated with social media interactions.

Participants in the survey shared their thoughts on enhancing support systems for moderators.

Turning off comments was rated the most effective strategy to enhance wellbeing, followed closely by support from management, peer support, and preparation of responses to expected audience feedback.

However, disabling comments often draws complaints from people who feel their voices are being censored, even though a 2021 High Court decision left media publishers legally liable for comments attached to their content.

Educating staff about why people comment on news content has emerged as a critical way to reduce harm.

The peer support program also serves as a valuable connection point for staff with others who have relevant moderation experience.

Managers have been encouraged to have their staff complete self-care plans to prepare for challenging moderation days, such as major national events. These plans emphasize identifying positive coping strategies, setting boundaries after a shift, and debriefing to reflect on the meaningfulness of the work.

Evidence suggests that one of the most significant protective factors for journalists is a reminder of the importance of their work.

The resounding message is that giving moderators clear guidance on addressing wellbeing impacts, and normalizing the act of seeking support, is crucial to a healthy work environment.

Lessons for others

While these findings are specific to the ABC, it’s evident that the experiences reported resonate across the broader news industry and in various forums where individuals bear the responsibility of moderating communities.

This issue extends beyond paid employees; volunteer moderators at youth radio stations and Facebook group admins also face substantial online hostility.

Clearly, any organization—whether a business or a volunteer-based entity—engaging with a social media audience must consider the health and safety implications for those entrusted with managing those platforms and embed robust support strategies.

In response to these challenges, Australia’s eSafety commissioner has developed a wide array of publicly accessible resources designed to assist organizations in navigating these complex dynamics.

**Interview with Nicolle White, Social Media Wellbeing Advisor at ABC**

**Editor**: Thank you for joining us today, Nicolle. Your role as the Social Media Wellbeing Advisor is groundbreaking. Can you tell us about the reasons behind the establishment of this position at the ABC?

**Nicolle White**: Thank you for having me. The role was created in response to the increasing volume of online abuse faced by our staff, particularly moderators. We recognized that the emotional and psychological toll of moderating distressing content is significant and can affect not just the individuals but also their families. This position aims to address those challenges proactively.

**Editor**: The internal survey highlighted alarming statistics about the harsh realities moderators face. From your perspective, why is it critical to focus on their mental health?

**Nicolle White**: The mental health of our moderators is paramount because they are on the front lines, ensuring that our audiences can engage safely online. When moderators experience harassment or witness harmful content, it can lead to burnout, anxiety, and depression. We want to create an environment where they feel supported and valued, as this ultimately reflects on the quality of the dialogue we facilitate.

**Editor**: The survey revealed that many moderators encounter harmful content on a regular basis. What measures is the ABC taking to help them manage this exposure?

**Nicolle White**: We have implemented several programs, one of which is a comprehensive peer support system. This allows moderators to connect with colleagues who understand their experiences. We also provide training that educates staff on the motivations behind online comments and how to cope with negative interactions. Importantly, we're exploring options like turning off comments on certain posts as a way to reduce immediate stressors.

**Editor**: The survey indicated a wide variety of abuse moderators face, including misogynistic and racist comments. How does cultural diversity affect moderators' experience of online abuse?

**Nicolle White**: Moderators from culturally or linguistically diverse backgrounds may face additional layers of complexity when it comes to reporting abuse. There can be a feeling of isolation or a reluctance to speak out about mistreatment, often rooted in cultural experiences. It's essential that we recognize these differences and build tailored support mechanisms that resonate with all our staff.

**Editor**: It's clear that this issue isn't confined to Australia. How are these trends manifesting globally?

**Nicolle White**: Indeed, the challenges moderators face are a global concern. As seen with Facebook content moderators in the U.S. and Kenya, many are taking legal action due to inadequate support. It highlights a broader need for organizations worldwide to prioritize the mental health and safety of their online moderators. We're monitoring these developments closely and learning from other organizations' experiences.

**Editor**: Lastly, what do you think the media industry as a whole should focus on to improve conditions for moderators everywhere?

**Nicolle White**: The industry needs to acknowledge the extensive strain that moderation roles can impose. Before any policy changes can be made, there should be a strong commitment to mental health and wellness initiatives. This could involve investment in robust support systems, research into best practices, and fostering an organizational culture that values and prioritizes the wellbeing of all staff.

**Editor**: Thank you, Nicolle, for shedding light on this critical issue. Your work is incredibly important, and we look forward to seeing how the ABC continues to lead the way in promoting the wellbeing of its moderators.

**Nicolle White**: Thank you for having me! It's a crucial conversation, and I appreciate the opportunity to discuss it.
