Facebook to pay $52 million to moderators who developed PTSD on job

Facebook has agreed to pay $52 million to content moderators whose jobs require them to view graphic and disturbing posts and videos on its platforms.

In a 2018 lawsuit, third-party contractors for the company said that Facebook failed to properly protect them against severe psychological and other injuries that can result from repeated exposure to graphic material such as child sexual abuse, beheadings, terrorism, animal cruelty and other disturbing images.

The settlement grants US moderators who were part of the class-action lawsuit $1,000 each.

But those who had been diagnosed with conditions related to their work will be able to get medical treatment and damages of up to $50,000, according to the preliminary settlement filed in the Superior Court of California for the County of San Mateo.

“We are so pleased that Facebook worked with us to create an unprecedented program to help people performing work that was unimaginable even a few years ago,” said Steve Williams, a lawyer for the plaintiffs, in a statement. “The harm that can be suffered from this work is real and severe.”

In a statement, Facebook said it was “grateful to the people who do this important work to make Facebook a safe environment for everyone”.

“We’re committed to providing them additional support through this settlement and in the future,” the company said.

In addition to payment for treatment, moderators with a qualifying diagnosis will be eligible to submit evidence of other injuries they suffered during their time at Facebook and could receive up to $50,000 in damages.

The exact amount each moderator receives depends on how many members of the class apply for benefits, and individual payouts could shrink significantly if a majority of the class is found to be eligible.

Under the settlement, Facebook also agreed to roll out changes to its content moderation tools designed to reduce the impact of viewing harmful images and videos. The tools, which include muting audio by default and changing videos to black and white, will be rolled out to 80 percent of moderators by the end of this year and to all moderators by 2021.

Moderators who view graphic and disturbing content on a daily basis will also get access to weekly one-on-one coaching sessions with a licensed mental health professional. Workers experiencing a mental health crisis will get access to a licensed counselor within 24 hours, and Facebook will also make monthly group therapy sessions available to moderators.
