Facebook, without the filter: three months as a content moderator

In Berlin and Essen, more than a thousand people work as Facebook content moderators. They filter the continuous stream of filth and violence. How? That is strictly confidential. For the first time, a former employee gives us a look behind the computer screens. "It was as if I was working in the darkest ward of a psychiatric hospital."

In July 2017, a young German woman in Berlin started as a moderator on Facebook's Community Operations team, one of thousands worldwide. She worked there for three months. In dS Weekblad, she tells her story; here is a preview.

Like most social media platforms, Facebook runs on a user and community regulation system: no content is screened before it is uploaded. Users can report messages they find inappropriate. It is the job of the content moderator to process those reports, known as "tickets".

Describing my role is not simple. Was I a censor who curtailed free speech? I don't think so. I respected the freedom to offend and to use the most far-reaching forms of expression. Often it was more about behaviour than about language: images and texts as the expression of an attitude rather than self-expression. There was a lot of violence and cruelty, ranging from hate messages to sadism, from bullying to self-harm. Extremely violent content.

Some of the messages were also downright criminal. If you are untrained and come into contact with that kind of expression or behaviour, you can only conclude one thing: we have gone completely mad. I saw things without the least respect for even a single social norm, regurgitated in a world where any notion of intimacy and decency is gone.

Have social media unconsciously encouraged people to abandon their last inhibitions, to sail past all social filters and lift every moral barrier? What would happen if people behaved in a similar way in public space, on public transport, in cafés or in the park? I wondered about that several times.

Digital proletariat

To what extent could I adapt to such a rigid, oppressive work environment? At first I saw it as a personal challenge, a test of my endurance. I was in a factory world, part of the global digital proletariat. A garrison of 700 people in a closed environment without contact with the outside world. Working hours and breaks were calculated to the second; I sat glued to my computer, leaving the production line for only a few minutes at a time.

The content moderator, or agent, carries out serious analysis. The task is difficult and context-sensitive. He not only has to decide whether the reported messages need to be removed; he must also navigate a highly complex hierarchy of actions. The mental operations are as complex as those of algorithms. Nevertheless, moderators are expected to act like computers. The penchant for uniformity and standardization leaves no room for human judgement and intuition.

1,300 reports per day

After his training, a moderator is expected to handle about 1,300 reports per day, which forces him to take a decision every few seconds. The intellectually challenging task turns into an automated action, almost a reflex. The repetitiveness reinforces the frustration and the alienation.

Thinking is not encouraged; the employee can hardly take any initiative and can only collect examples in order to identify loopholes in the policy. The standardization is intended to contribute to objectivity. The moderator should not make assumptions; the intentions of the person behind the message are not in question. The abuse must be clear and unambiguous.

On Saturday, read the full story of the former Facebook moderator in dS Magazine. "Videos of beheadings affected me less, because so much blood comes with them and it is a quick death. I had to fight off nausea."
