Facebook's largest content moderator has reportedly struggled with the ethics of its work for the company, which requires contractors to sift through violent, graphic content


Insiders at Facebook's largest content moderator have questioned whether working for the company is ethical, as criticism of the taxing and violent nature of the job intensifies.

In a sweeping investigation published Tuesday, the New York Times spoke to current and former employees at Accenture, Facebook's single biggest partner charged with reviewing toxic material on its platform.

Many expressed concern about the work they conducted for Facebook — thousands of contractors around the world are tasked with viewing violent, graphic, and sexual posts and videos and deciding whether they should be removed or left up. The report corroborates multiple accounts that have surfaced over the years of the work causing depression, anxiety, and other negative mental health effects.

Three sources told The Times that even former Accenture CEO Pierre Nanterme was skeptical that the firm's work with Facebook was ethical back in 2017. And current CEO Julie Sweet also raised questions to company executives in 2019 about some of Accenture's involvement with Facebook, citing in part how Accenture's reputation could be harmed, per the report.

The execs reportedly said the concerns could be addressed — and that Facebook was too valuable of a client to lose.

The report highlights how even Accenture has had doubts about the role it plays in Facebook's business. How to manage the toxic content on its platform has been a hot-button issue for Facebook, and critics have condemned the company for skirting that responsibility by outsourcing it to firms like Accenture.

Representatives for Facebook did not immediately respond to Insider's request for comment. Drew Pusateri, a Facebook spokesman, told the NYT that the company knows these "jobs can be difficult, which is why we work closely with our partners to constantly evaluate how to best support these teams."

An Accenture spokesperson told Insider that "content moderation is essential to protecting our society by keeping the internet safe—and it is even more critical as the pandemic has rapidly accelerated internet use."

The companies' relationship has remained largely secret since it began in 2010, even as the contractor has grown its workforce to meet Facebook's content-moderation demands. The consulting firm created a team, code-named Honey Badger, specifically for that mission, according to the NYT.

But the partnership — worth $500 million in contracts with the Silicon Valley giant — is too valuable to end, insiders said. They told the NYT that Facebook is known as a "diamond client" within Accenture.

Read the full report at The New York Times here.