Former Facebook Content Moderator Files Lawsuit for PTSD


Screening “Highly Toxic” Content

For a company as big as Facebook, a lawsuit is just part of doing business. Usually, however, the charges come from outside the company.

The latest class action lawsuit brought to its doors comes from a former employee: a content moderator by the name of Selena Scola. She alleges that while filtering content for the social media platform, she was “exposed to highly toxic, unsafe and injurious content”.

Scola’s lawyers claim that she developed Post-Traumatic Stress Disorder as a result of the job. The work involves not just screening and deleting hate speech, but also vetting offensive images and videos, which can range from relatively innocuous suggestive photos to extreme violence. According to the lawsuit, this even includes exposure to “millions of videos, images and broadcasts of child sex abuse, rape, torture, bestiality, beheadings, suicides and murder”.

The lawsuit alleges that the company does not provide its content moderators with “sufficient training” for handling this traumatic content. Her lawyers claim that Ms. Scola’s PTSD can be triggered whenever she touches a computer mouse, hears loud noises or enters a cold building, among other things.

Scola is the only employee named as plaintiff, but the class action could represent thousands of Facebook content moderators in California. Facebook’s content moderation operation involves thousands more workers worldwide.


What is Facebook’s Position Regarding this Issue?

A Facebook spokesperson told Vice that the company is currently “reviewing this claim”, adding that it recognizes that content moderation is “difficult” and that it offers psychological support and wellness resources to its workers.

The spokesperson also added that the company has specific training protocols for would-be content moderators. The suit, however, alleges that this preparation is insufficient for the volume of traumatic content the employees see.

The social media giant has also been increasingly working on AI to filter disturbing content. These systems screen potentially offensive material before it ever reaches human moderators, thus minimizing the trauma it can inflict.
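Facebook has not published the details of its own pipeline, but the general idea is to let a classifier auto-handle the clear-cut cases so that only borderline material reaches a human. A minimal sketch of such a routing stage, with entirely hypothetical names and thresholds, might look like this:

    # Illustrative sketch of an AI pre-screening stage ahead of human review.
    # All names and thresholds are assumptions for demonstration only; this
    # is not Facebook's actual system.
    from dataclasses import dataclass
    from enum import Enum, auto

    class Route(Enum):
        AUTO_REMOVE = auto()   # classifier is confident enough to delete outright
        HUMAN_REVIEW = auto()  # ambiguous; queue for a human moderator
        ALLOW = auto()         # classifier is confident the content is benign

    @dataclass
    class Screening:
        score: float  # hypothetical "offensiveness" score in [0, 1]

    def route_content(screening: Screening,
                      remove_threshold: float = 0.95,
                      review_threshold: float = 0.40) -> Route:
        """Route content so only borderline items reach human moderators."""
        if screening.score >= remove_threshold:
            return Route.AUTO_REMOVE
        if screening.score >= review_threshold:
            return Route.HUMAN_REVIEW
        return Route.ALLOW

    # Example: a score of 0.97 is removed automatically, sparing a moderator
    # from ever seeing it; a score of 0.55 is queued for human judgement.
    print(route_content(Screening(score=0.97)))  # Route.AUTO_REMOVE
    print(route_content(Screening(score=0.55)))  # Route.HUMAN_REVIEW

The trade-off in any design like this is where the thresholds sit: the more aggressively content is auto-removed or auto-allowed, the fewer traumatic items moderators see, but the greater the risk of wrongly deleting benign posts or missing harmful ones.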
