“Child pornography, graphic violence … you have to be prepared to see anything”: the experience of a Facebook moderator


“It’s mostly pornography,” says Sarah Katz, recalling the eight months she worked as a Facebook moderator.

“The agency was very direct about what kind of content we would see and how graphic it was, so we knew what we were facing,” she clarifies.

Katz is referring to a California-based agency of human moderators subcontracted by Facebook and other companies. She worked there in 2016.

Her task was to review complaints about inappropriate content submitted by Facebook users.

And it was not easy, as she told Emma Barnett, a BBC Radio 5 Live journalist.

“They gave us about a minute per post to decide whether it was spam, and if it was, we had to remove it,” she begins to explain.

“Sometimes we also deleted the associated account,” she adds.

“Management wanted us to work no more than eight hours a day and to review an average of 8,000 posts per day, 1,000 per hour,” she continues.

She acknowledged that she learned a lot, but said that if she had to describe her work in a single word, it would be “exhausting”.

Image caption: Facebook says its reviewers play a crucial role in making the social network a safe and open environment. (Photo: AFP/Getty Images)

Illegal images

“You definitely have to be prepared to see anything with just a click, the images come quickly, without warning,” she says.

And the image that struck her most was a photograph that suggested child pornography.

“It was a boy and a girl. The boy was about 12 years old and the girl about eight or nine, and they stood facing each other,” she describes.

“They were not wearing pants and they were touching each other. It looked like an adult was telling them what to do. It was very disturbing, especially because it was real.”

Recirculating posts

“Many of these explicit posts circulated continuously; we often saw them coming from six different accounts a day, so it was quite difficult to find the original source,” she recalls.

“At that time there were no counseling or psychological support services. They may exist today; I’m not sure.”

The ex-moderator acknowledges that, had it been offered, she would have accepted the help.

“They definitely warn you, but being warned and seeing it are two different things,” she stresses.

Image caption: There are more than 7,000 people reviewing content on Facebook. (Photo: AFP)

“Some think they can handle it and it turns out they cannot, because the reality is worse than what they expected.”

Graphic violence

Katz acknowledges that moderators often become “quite desensitized” over time. “I would not say it gets easier, but you definitely get used to it,” she says.

“Obviously, there was a lot of generic pornography between adults, which was not so disturbing,” she explains.

Some of this content involved animals. “There was an image with a horse that circulated continuously,” she recalls.

And she also encountered a lot of graphic violence. “I remember a post showing a woman whose head had been blown apart,” she says.

“Half of her body was on the floor and the other half, her torso, was on a chair,” she describes.

The policy for removing pornography was stricter than the one for graphic violence.

Fake news

“I think Facebook was caught by surprise by fake news,” says Katz. “During the US election campaign it was something that was off the radar, at least during the time I worked there.”

“I really do not remember ever hearing the term ‘fake news’,” she says.

“There were a lot of articles that were reported by users, but I do not remember management asking us to verify that the facts in those texts were accurate,” she says.

In general, the work was monotonous. “You really get used to identifying what is spam and what is not. It just becomes a lot of clicking.”

Asked if she would recommend the job, she is blunt: “If you can do anything else, I would say no.”

Facebook’s response

The BBC shared Katz’s story with Facebook.

In response, a spokesperson for the company founded by Mark Zuckerberg said: “Our reviewers play a crucial role in making Facebook a safe and open environment.”

“This can be a very challenging job, and we want to make sure they feel properly supported,” he added.

“That is why we offer regular training, counseling and psychological support to all our employees and all those who work for us through our partners,” he said.

“Although we use artificial intelligence where we can, there are now more than 7,000 people reviewing content on Facebook, and taking care of their well-being is a real priority for us.”

About the author

Rava Desk

Rava is an online news portal providing recent news, editorials, opinions and advice on day to day happenings in Pakistan.

