Former Facebook content moderator sues company for giving her PTSD

2025-04-27 04:05:35

The mental toll of moderating disturbing content on the internet is well documented, and now it's the subject of a lawsuit against Facebook.

Former content moderator Selena Scola has lodged a suit against the social media company, claiming "constant and unmitigated exposure to highly toxic and extremely disturbing images" left her with post-traumatic stress disorder.

According to the filing in the Superior Court of California, the company is accused of ignoring workplace safety standards that help protect employees from psychological harm.

As part of her job, Scola reviewed "thousands of images, videos, and livestreamed broadcasts of graphic violence."

She was employed by a staffing agency, Pro Unlimited Inc., and began work at Facebook's offices in June 2017. Pro Unlimited was also named as a defendant in the case.

During her nine months at the company, Scola developed symptoms of fatigue, insomnia, and social anxiety before being formally diagnosed with PTSD.

The suit claims that her PTSD symptoms may be triggered "when she touches a computer mouse, enters a cold building, watches violence on television, hears loud noises, or is startled."

"Her symptoms are also triggered when she recalls or describes graphic imagery she was exposed to as a content moderator," the claim reads.

Scola's lawyer Steve Williams, of Joseph Saveri Law Firm, said in a statement that his client wants Facebook to set up a medical monitoring fund to provide testing and care to content moderators with PTSD.

"Facebook needs to mitigate the harm to content moderators today and also take care of the people that have already been traumatized," he added.

Another lawyer on the case, Korey Nelson of the law firm of Burns Charest LLP, added, "Facebook is ignoring its duty to provide a safe workplace and instead creating a revolving door of contractors who are irreparably traumatized by what they witnessed on the job."

As of June, Facebook employed 7,500 content reviewers around the world, a number the company plans to more than double to 20,000 this year.

The company uses a mix of full-time employees, contractors, and outside firms to handle the thousands of posts that need to be reviewed every day.

Facebook's director of corporate communications, Bertie Thomson, said in a statement via email that the company is "currently reviewing this claim," and that it recognizes that "this work can often be difficult."

"That is why we take the support of our content moderators incredibly seriously, starting with their training, the benefits they receive, and ensuring that every person reviewing Facebook content is offered psychological support and wellness resources," the statement added.

Thomson also pointed to the in-house counsellors and other wellness resources the company provides to employees, which it detailed in a post on its blog.
