I watched a young woman dressed in a yellow top and denim shorts being dragged into a field by two young men who were stabbing her multiple times while attempting to use a blunt knife to chop her head off.
Her screams slowly faded as she drowned in her own blood and died.
When I was employed to do this job, I had no idea the level of violence that I would be exposed to. Nobody had even taken a minute to explain the horrific and gut-wrenching things I would see.
This is the testimony of a Facebook moderator who says she was never told that her work would entail sitting behind a computer, watching horrific videos and deleting them from the systems of the social media giant Meta to keep interactions "clean and safe".
Kiana Monique Arendse, a South African national, came to Kenya in 2021 after receiving a job offer from Samasource, an outsourcing company.
Today, she is before the Labour Court fighting Samasource and Meta over a mass sacking. In her case, she claims that Meta has since dropped Samasource for Majorel, and that Samasource's moderators have been issued with redundancy notices.
In her affidavit, Kiana, who is a mother of one, claims that Samasource gave her a Sh60,000 basic salary and a Sh20,000 out-of-country allowance, alongside health insurance.
She says she took the offer to fend for her son. “I did not dare to ask any questions lest I lose an opportunity to financially provide for my child,” she narrates in her affidavit filed before Labour Court judge Nduma Nderi.
Kiana states that she came to learn of the horrors of being a moderator as she was being trained. According to her, she was expected to watch the first 15 seconds and the last 15 seconds of each video posted and decide whether the video was against Meta’s policies.
She was to decide whether the video remains or is taken down.
After being trained for three weeks, her work began. Her first encounter was with the girl who was being killed.
“The very first ticket that I encountered, which I have not forgotten to this day… I was so shaken and traumatised by this video that I immediately changed my ticket and stood up to call my father to try and calm me down,” she claims, adding that trainers told them to detach their emotions while working.
She states that each day at work was a puzzle, with no way of knowing whether the next thing coming up would be match fixing, a vile beheading or the mutilation of children.
She continues: “Even the tickets labelled as nudity many times turned to horrendous graphic violent content such as people being skinned alive, burnt alive, people being run over amongst other violent content without any kind of warning or preparation.”
According to her, the only way for a moderator to dull the reality of the footage was to switch the videos to black and white. She alleges that they were not given counsellors.
“With every ticket we action, our mental health is irreversibly damaged. Sh60,000 is certainly not commensurate with the amount of mental wellness and psychosocial support we shall need to return to some kind of normalcy after years of doing this job,” Kiana says.
Mahlet Yilma Lemma, another moderator, is from Ethiopia.
She narrates that she was required to handle between 500 and 1,000 pieces of content a day. “I have witnessed the vilest things humanly possible. In addition to the nature of the job, the workplace is incredibly toxic,” she claims, adding that the top bosses enjoy good perks while moderators break their backs and minds to keep Meta running.
Yilma says she suffers from insomnia.
Fasica Berhane, also an Ethiopian, narrates that she too was tasked with moderating Facebook content. She claims that her daily routine at Samasource was to watch graphic videos back-to-back from the Ethio-Tigray war, which broke out in November 2021.
“I remember my first experience witnessing a slaughter on a live video in which I unconsciously stood up and screamed. For a minute I almost forgot where I was and who I was. Everything went blank. Coworkers who saw my reaction reported it to the team leader on shift, after which I was told to go see a counsellor.”
The claims about working for Meta are contained in a new case before the court.
She claims moderators sign non-disclosure agreements that bar them from telling anyone about their work.
“This solitude has made me rather avoidant and made it easier to be alone which is extremely dangerous when one’s mental health is not intact. All of the above existing problems were made worse by the redundancy.”