Content moderation internally acknowledged as a hazardous job, workers must sign PTSD clause

Casey Newton has been working through the labor conditions of content moderators at The Verge for years now. In interviews he found that this work, now outsourced to consulting firms, has caused mental health issues in countless cases, that moderators work under presumably illegal conditions, and that in the past they had to endure this psychological strain with practically no meaningful support from psychologists or any mental health program. It's a bit as if the Zuckerbergs of the world had planned their new digital cities (a.k.a. platforms) without a sewage system and then outsourced the plumbing, leaving the plumber to pick up mental health issues working overtime for minimum wage.

Now the consulting firm Accenture has handed its YouTube content moderators a document to sign in which the employees acknowledge the hazards of the workplace and the possibility of developing post-traumatic stress disorder (PTSD). The document was distributed to moderators shortly after Newton's articles were published, and in it the company shifts responsibility for mental health onto its underpaid workers while at the same time formally acknowledging the emotional toll of the job.

In a newsletter, Newton now writes that this document was handed not only to YouTube moderators but also to moderators for other platforms in the US and Europe; apparently Accenture is using the agreement to prepare for upcoming court cases. The first moderators have already filed lawsuits against their former employers. As a possibly unintended side effect, Accenture practically concedes with this document that moderating online content is, in general, a dangerous occupation that can impair mental health.

I consider content moderation the most important job of our time. Moderators are the plumbers who make sure the internet, at least on the surface, is not flooded with violence and porn. Just as the invention of sewage systems during industrialization curbed disease and enabled an enormous societal leap, it is content moderation systems that make the net a little safer for all of us. Jobs in this field need to be massively expanded and better paid, and the companies must guarantee their workers' safety and provide adequate protective measures at sufficient scale.

The Verge: YouTube moderators are being forced to sign a statement acknowledging the job can give them PTSD

“I understand the content I will be reviewing may be disturbing,” reads the document, which is titled “Acknowledgement” and was distributed to employees using DocuSign. “It is possible that reviewing such content may impact my mental health, and it could even lead to Post Traumatic Stress Disorder (PTSD). I will take full advantage of the weCare program and seek additional mental health services if needed. I will tell my supervisor/or my HR People Adviser if I believe that the work is negatively affecting my mental health.”

The PTSD statement comes at the end of the two-page acknowledgment form, and it is surrounded by a thick black border to signify its importance. It may be the most explicit acknowledgment yet from a content moderation company that the job now being done by tens of thousands of people around the world can come with severe mental health consequences.

“The wellbeing of our people is a top priority,” an Accenture spokeswoman said in an email. […]

The PTSD form describes various support services available to moderators who are suffering, including a “wellness coach,” a hotline, and the human resources department. (“The wellness coach is not a medical doctor and cannot diagnose or treat mental health disorders,” the document adds.)

It also seeks to make employees responsible for monitoring changes in their mental health and orders them to disclose negative changes to their supervisor or HR representative. It instructs employees to seek outside help if necessary as well. “I understand how important it is to monitor my own mental health, particularly since my psychological symptoms are primarily only apparent to me,” the document reads. “If I believe I may need any type of healthcare services beyond those provided by [Accenture], or if I am advised by a counselor to do so, I will seek them.”

The document adds that “no job is worth sacrificing my mental or emotional health” and that “this job is not for everyone” — language that suggests employees who experience mental health struggles as a result of their work do not belong at Accenture.

From Casey Newton's newsletter: How tech companies should address their workers' PTSD.

First, invest in research. We know that content moderation leads to PTSD, but we don’t know the frequency with which the condition occurs, or the roles most at risk for debilitating mental health issues. Nor have they investigated what level of exposure to disturbing content might be considered “safe.” It seems likely that those with sustained exposure to the most disturbing kind of photos and videos — violence and child exploitation — would be at the highest risk for PTSD. But companies ought to fund research into the issue and publish it. They’ve already confirmed that these jobs make the workforce ill — they owe it to their workforce to understand how and why that happens.

Second, properly disclose the risk. Whenever I speak to a content moderator, I ask what the recruiter told them about the job. The results are all over the map. Some recruiters are quite straightforward in their explanations of how difficult the work is. Others actively lie to their recruits, telling them that they’re going to be working on marketing or some other more benign job. It’s my view that PTSD risk should be disclosed to workers in the job description. Companies should also explore suggesting that these jobs are not suitable for workers with existing mental health conditions that could be exacerbated by the work. Taking the approach that Accenture has — asking workers to acknowledge the risk only after they start the job — strikes me as completely backwards.

Third, set a lifetime cap for exposure to disturbing content. Companies should limit the amount of disturbing content a worker can view during a career in content moderation, using research-based guides to dictate safe levels of exposure. Determining those levels is likely going to be difficult — but companies owe it to their workforces to try.
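To make that concrete: enforcing such a cap would presumably mean the moderation queue itself keeps a per-worker exposure ledger and stops routing disturbing material once a threshold is reached. What follows is only a minimal sketch of that idea; the categories, severity weights, cap value, and all names in it are invented for illustration, not anything Accenture or the platforms actually use.

```python
from dataclasses import dataclass

# Hypothetical severity weights per content category. Real values would have
# to come from the kind of research Newton calls for above; these are made up.
SEVERITY = {"graphic_violence": 3.0, "child_exploitation": 5.0, "other": 1.0}

# Hypothetical lifetime cap, measured in abstract "exposure units".
LIFETIME_CAP = 10_000.0

@dataclass
class ExposureLedger:
    """Tracks one moderator's cumulative exposure across their career."""
    total: float = 0.0

    def record(self, category: str, items: int) -> None:
        """Add the weighted exposure of a reviewed batch to the ledger."""
        self.total += SEVERITY.get(category, SEVERITY["other"]) * items

    def may_review(self, category: str) -> bool:
        """Queue check: stop routing disturbing content once the cap is hit."""
        return self.total + SEVERITY.get(category, SEVERITY["other"]) <= LIFETIME_CAP

# Usage: the assignment system consults the ledger before handing out a ticket.
ledger = ExposureLedger()
ledger.record("graphic_violence", items=200)
print(ledger.may_review("child_exploitation"))  # True until the cap is reached
```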

Fourth, develop true career paths for content moderators. If you’re a police officer, you can be promoted from beat cop to detective to police chief. But if you’re policing the internet, you might be surprised to learn that content moderation is often a dead-end career. Maybe you’ll be promoted to “subject matter expert” and be paid a dollar more an hour. But workers rarely make the leap to other jobs they might be qualified for — particularly staff jobs at Facebook, Google, and Twitter, where they could make valuable contributions in policy, content analysis, trust and safety, customer support, and more.

If content moderation felt like the entry point to a career rather than a cul-de-sac, it would be a much better bargain for workers putting their health on the line. And every tech company would benefit from having workers at every level who have spent time on the front lines of user-generated content.

Fifth, offer mental health support to workers after they leave the job. One reason content moderation jobs offer a bad bargain to workers is that you never know when PTSD might strike. I’ve met workers who first developed symptoms after a year, and others who had their first panic attacks during training. Naturally, these employees are among the most likely to leave their jobs — either because they found other work, or because their job performance suffered and they were fired. But their symptoms will persist indefinitely — in December I profiled a former Google moderator who still had panic attacks two years after quitting. Tech companies need to treat these workers like the US government treats veterans, and offer them free (or heavily subsidized) mental health care for some extended period after they leave the job.

Not all will need or take advantage of it. But by offering post-employment support, these companies will send a powerful signal that they take the health of all their employees seriously. And given that these companies only function — and make billions — on the backs of their outsourced content moderators, taking good care of them during and after their tours of duty strikes me as the very least that their employers can do.

Previously on Nerdcore:

Facebook's content moderators are getting infected with conspiracy theories
Content moderators and community managers are the most important jobs on the internet
The psychological consequences of Facebook's content moderation

