A Facebook moderator court case has been filed in Dublin by a former external contractor for Facebook who was employed, through an agency, to review “extremely disturbing, graphic and violent content” on a daily basis.
The man, 53-year-old Chris Gray, is seeking compensation for the psychological injuries he claims he suffered as a result of his work duties. Mr Gray today filed his legal action in the High Court against the Irish subsidiary of Facebook and the agency that employed him, CPL Solutions.
He alleges that he suffered psychological injuries as a direct result of the “very disturbing” photographs and videos, “including executions, lethal beatings, stonings, whippings, the abuse of children, animal torture and extreme sexual content”, that he had to view while moderating Facebook content. Examples of the content cited include video of the large-scale and coordinated abuse and murder of the Rohingya people in Myanmar, massacres in the Middle East and the torture of migrants in Libya.
Facebook’s network of content moderators comprises 15,000 individuals based around the globe. This network must filter through all content published on the platform in order to remove inappropriate graphic content, and moderators are expected to achieve a 98% accuracy rating in their decisions.
Represented by Coleman Legal Partners, and supported by UK not-for-profit group Foxglove, Mr Gray claims he identified a “slow creep” whereby his “personal and political views were becoming increasingly influenced by the insidious content he was required to view”.
Part of the suffering he experienced included trouble sleeping, resulting from the nature of what he had viewed as part of his working day and the pressure to make the correct decision about the suitability of the content for publication. He said he would often wake during the night “with a fright, concerned not by the content, but by whether or not he had marked it correctly during his shift”.
The training provided was criticised as inadequate, compounded by a lack of support to help moderators deal with “what seemed like a relentless flow of extreme and graphic material”. Mr Gray said that he was unable to communicate his distress to his superiors due to his irrational mood, caused by viewing the content.
Lawyer Cori Crider, a director of Foxglove, said: “In a few years’ time we are going to look back on these conditions and see them the way that we now see early unsafe factory work in a steel mill or a meat-packing plant in the early 20th century.”
CPL was unavailable for comment today, but a spokeswoman for Facebook said that the social media giant is providing training and support while remaining conscious of the fact that moderating “certain types of content can sometimes be difficult”. The company is, she said, providing thorough training and full-time support to moderators, along with technical solutions to limit the amount of graphic material they must view.