The current and former content moderators described stressful hours spent focused on torrents of hateful, disturbing posts with little regard given to their feedback or their well-being.
They called for Facebook to find a way to make them and their colleagues full-time employees, complete with the benefits for which tech companies are renowned, instead of keeping them at arm's length by outsourcing the work.
“Facebook could fix most of its problems if it would move away from outsourcing, value its moderators, and build them into its policy processes,” said former content moderator Allison Trebacz.
“Moderators are the heart of Facebook’s business; that’s how they should be treated.”
Zuckerberg has pushed back against concerns about hateful or violent posts by saying the social network has invested heavily in artificial intelligence and human reviewers to take down content violating its policies.
The bulk of that army of content moderators is contracted, and their viewpoints, hard-won on the front lines of the battle, are typically ignored, according to those who took part in the press briefing.
“I became a Facebook content moderator because I believed I could help make Facebook safer for my community and other communities who use it,” said Viana Ferguson, who left the job last year.
“But again and again, when I tried to address content that dripped with racism, or was a clear threat, I got told to get in line, our job was to agree.”
Zuckerberg and Twitter chief executive Jack Dorsey are to testify Wednesday before a Senate committee exploring whether to weaken the legal protections given to online platforms over what users post there.