Who moderates Facebook content and how much do they earn from it

Facebook pays one of the world's largest outsourcing companies to moderate its content. Although Accenture earns an estimated $422 million a year from the deal, in 2019 the company's board of directors seriously debated whether to continue the collaboration. The top managers' reservations reportedly stemmed from concern for the mental health of the employees doing the work.

As reported by TheHustle.co, Facebook faced a new challenge in 2007: a new regulation required pornographic content to be removed from the portal within 24 hours of publication. Zuckerberg's giant responded quickly by delegating the task to its own employees, but it soon became apparent that they could not moderate every piece of content appearing on the site. Faced with this, Facebook decided to hire an external company to handle the moderation work. The choice fell on Accenture Services.

What is Accenture?

On its website, Accenture describes itself as a company focused on developing artificial intelligence in response to its clients' expectations. Its goal is to build systems that automate processes and support both employees and the business itself. Accenture works with every industry around the world; its core competencies lie in business and strategy consulting, interactive media and marketing, cybersecurity and technology implementation, and business model transformation.

"Accenture employees around the world share similar values: we take very seriously the impact of our work on our clients and on the communities in which we work and live. We treat our work personally," says Julie Sweet, Chief Executive Officer of Accenture.

No wonder Facebook chose Accenture as its silent helper, but for the outsourcing company itself, the deal was not without consequences.

Treating work personally? Very personally indeed…

An article on IrishTimes.com reports that Accenture earns nearly $500 million through its partnership with Facebook. As the article notes, however, revenue is not the only thing the contract brings the partner; it carries costs as well. TheHustle.co pointed out that while 90% of harmful content is removed by artificial-intelligence algorithms, that still leaves 10% of posts to be handled by humans. Moderators can review over 700 posts during a single shift. The Verge published a report in 2019 highlighting the kind of content moderators face, including murders and animal abuse. Violence, pornography, hate, and stupidity make the job closer to browsing the darknet than to the platform we all know, and that takes a significant toll on the well-being of Accenture employees.

ChicagoTribune.com writes about the huge psychological cost of the job and the rise of depression and anxiety disorders among these rank-and-file workers. The portal lists harrowing accounts from individual Accenture employees, including one man's suicide attempt in Dublin. Izabela Dziugiel, who worked at the Warsaw branch, recalled how employees there had to be prepared to face the macabre: her team received pictures from the war in Syria that were drastic, to say the least. Another example is Joshua Sklar of Austin, who had to view dead bodies, accidents, rape, torture, and animal abuse.

Employees, know that you bear the “possible” risk!

After a high-profile 2019 lawsuit and the $52 million settlement Facebook signed with former content moderators in May of last year, Accenture decided to add a two-page clause informing employees of possible mental-health risks. The New York Times also writes about an October 2020 report that officially identified content moderation as a risk factor for the company. Such controversial content and its potential impact on moderators are attracting the attention of regulators and the media. A few changes were made to the Facebook contract, though, unfortunately for employees, not everything changed: the partnership proved too profitable to walk away from.
