More than 140 Facebook content moderators have been diagnosed with severe post-traumatic stress disorder caused by exposure to graphic social media content including murders, suicides, child sexual abuse and terrorism.

The moderators worked eight- to 10-hour days at a facility in Kenya for a company contracted by the social media firm. They were found to have PTSD, generalised anxiety disorder (GAD) and major depressive disorder (MDD) by Dr Ian Kanyanya, the head of mental health services at Kenyatta National hospital in Nairobi.

The mass diagnoses have been made as part of a lawsuit being brought against Facebook’s parent company, Meta, and Samasource Kenya, an outsourcing company that carried out content moderation for Meta using workers from across Africa.

The images and videos, which included necrophilia, bestiality and self-harm, caused some moderators to faint, vomit, scream and run away from their desks, the filings allege.

top 7 comments
[-] thefartographer@lemm.ee 36 points 6 days ago

Companies keep talking about replacing employees with AI yet they keep up this fuckery. Y'all's AI models are either good enough to handle this shit or shouldn't be used as a bad-faith bargaining chip. If there were ever a job that should be eliminated from human labor, NSFL content moderating seems like the perfect contender.

[-] YarHarSuperstar@lemmy.world 9 points 6 days ago

I have heard that folks from African countries who are hired to train those AI models are also reporting abuses. So imo that's not really a solution either

[-] thefartographer@lemm.ee 1 point 6 days ago

Right, riiiiiight... I forgot about that part. Make AIs train each other. What could go wrong?!

[-] TrickDacy@lemmy.world 2 points 6 days ago

I'm pretty sure this is actually referring to work done by humans long before the "ai" fad

[-] thefartographer@lemm.ee 1 point 6 days ago

I think you're right. I thought this was a new story making the rounds

[-] digdilem@lemmy.ml 1 points 5 days ago

Rather a cynical take here, but perhaps that's what's coming and these jobs are going to be made redundant shortly, so they're filing a claim while they still can.

[-] ZeroHora@lemmy.ml 16 points 6 days ago* (last edited 6 days ago)

The mass diagnoses have been made as part of a lawsuit being brought against Facebook’s parent company, Meta, and Samasource Kenya, an outsourcing company that carried out content moderation for Meta using workers from across Africa.

I tried to write different things about this but that shit speaks for itself, fuck this world.

this post was submitted on 18 Dec 2024
137 points (97.9% liked)

Technology
