Facebook content moderators work under shameful conditions – Government must act

12 May 2021
  • Facebook content moderators continue to work under shameful conditions.

Labour employment affairs spokesperson Marie Sherlock said there is an urgent need to extend health and safety legislation to cover content moderation activity in Ireland. Following representations to the Committee on Enterprise, Trade and Employment by Facebook content moderators, their union the CWU, and Foxglove, a content moderation legal advocacy group, Senator Sherlock said the Government must protect the workers who are the backbone of the social media giants.

Senator Sherlock said:

“I want to thank the witnesses for their powerful testimony in Committee today. Ireland has staked out a global reputation by attracting many social media giants to locate their EMEA HQs here. With this comes a great responsibility to protect all those working in the tech sector.

“Content moderators are the backbone of social media companies. They are on the frontline, protecting us from graphic and violent images, videos and content. It is their work that enables a safe experience for all users. However, these workers operate under harrowing conditions, and today we heard once again about the psychological and psychiatric impact on content moderators of the vile and abusive content that many have to deal with. That impact must be addressed.

“These workers are low paid and earn a fraction of what direct Facebook employees are paid. On top of this, they have to work in a culture of fear, instigated by the outsourcing company that employs the content moderators.

“It is absolutely shameful that a situation has been allowed to develop where Facebook can outsource a fundamental activity like content moderation to an external company. These workers are made to sign non-disclosure agreements that prevent them from discussing their work even with their own families. Witnesses told us that these documents are then taken away from them – an apparent breach of the Terms of Employment (Information) Act 1994.

“What is troubling beyond belief is that the outsourcing company has failed to provide appropriate clinical support for the very serious content that moderators have to deal with. These people are looking at traumatic posts, posts that Facebook does not deem safe for any user of its site. Now, as a matter of urgency, we need to extend health and safety legislation to cover content moderation activity in Ireland. Content moderation needs to be recognised as a hazardous activity, it needs to be directly regulated by the Health and Safety Authority, and it needs to be brought in-house within the social media giants.

“Ultimately, it is simply unacceptable that workers are sustaining very serious impacts on their mental and physical health arising from the work they do and that there is no regulation in place. The Government must respond urgently with legislation and with adequate resources for the Health and Safety Authority to enforce it. Ireland must lead the charge in protecting all workers in Silicon Docks.”
