‘No job for humans’: the harrowing work of content moderators in Kenya

Simon VALMARY

Trevin Brownie’s first day as a content moderator for Facebook, working out of a subcontractor’s nondescript office in the Kenyan capital Nairobi, is etched in his memory.

“My first video, it was a man committing suicide… there was a two- or three-year-old kid playing next to him. After the guy hanged himself, after about two minutes, the child notices something is wrong,” said the 30-year-old South African, recalling the youngster’s heart-wrenching response.

“It made me sick… But I kept on working.”

For three years he watched hundreds of violent, hateful videos every day and removed them from Facebook.

Brownie and more than 180 of his former colleagues are now suing Meta, Facebook’s parent company, for the harm they suffered in the first major class action over content moderation since 2018.

He worked in Nairobi for Sama, a Californian company subcontracted by Meta to moderate Facebook content for sub-Saharan Africa between 2019 and 2023.

Sama has since announced it will be closing its content moderation hub in Nairobi, which employed people from a number of African countries recruited in particular for their knowledge of local languages.

Brownie said he watched all manner of horrors — “more than 100 beheadings”, “organs being ripped out of people”, “rapes and child pornography”, “child soldiers being prepared for war”.

“Humans do things to humans that I would never have even imagined. People have no idea of the sick videos that are posted, what they are escaping.”

– Legal battles –

Today, Brownie is involved in one of three cases against Meta in Kenya related to content moderation.

He and another 183 sacked Sama employees are contesting their “unlawful” dismissal and seeking compensation, saying their salaries failed to account for the risks they were exposed to and the damage to their mental health.

Up to 260 moderators are losing their jobs as a result of the Sama closure in Nairobi, according to the petition.

The legal offensive began with a lawsuit filed in May 2022 in a Nairobi court by a former content moderator, Daniel Motaung, complaining about poor working conditions, deceptive hiring methods, insufficient pay and a lack of mental health support.

Meta said it did not want to comment on the details of the cases but told AFP it demands that its subcontractors make psychological support available 24/7.

Asked by AFP to respond to the claims, Sama said it was “not able to comment” on ongoing cases.

– ‘Downplayed the content’ –

Testimonies collected by AFP in April from several former Sama content moderators — who are among the plaintiffs in the dismissal case — support Motaung’s claims.

Two of them, hired in 2019 by Sama, then called Samasource, said they had responded to offers to work in call centres passed on by acquaintances or recruitment centres.

They say they didn’t find out until they signed their contracts — which included confidentiality clauses — that they were going to work as content moderators.

Despite this, Amin and Tigist (whose names have been changed) did not question their new roles, or consider quitting.

“I had no idea of what a content moderator is, I had never heard about it,” said Tigist, an Ethiopian recruited for her knowledge of the Amharic language.

“Most of us had no knowledge of the difference between a call centre and a content moderation centre,” confirmed Amin, who worked in the Somali “market”.

But the next batch of recruits, he said, received offer letters clearly specifying it was a content moderation job.

On their first day of training, even before they were shown the images to be reviewed, the moderators were reminded they had signed non-disclosure agreements (NDAs).

“During the training, they downplayed the content, what we were going to see… What they showed us in training was nothing compared to what we were going to see,” said Amin.

Once they began work, “the problems started”.

– ‘My heart became a stone’ –

Glued to their screens for eight hours a day, the moderators scrolled through hundreds of posts, each more shocking than the last.

“We don’t choose what to see, it just comes in randomly: suicide videos, graphic violence, child sexual exploitation, nudity, violent incitement… They flood into the system,” said Amin.

The moderators AFP spoke to claimed an “average handling time” of 55 to 65 seconds per video was imposed on them, or between 387 and 458 “tickets” viewed per day.

If they were too slow, they risked a warning, or even termination, they said.

Meta said in an email to AFP that content reviewers “are not required to evaluate any set number of posts, do not have quotas and aren’t pressured to make hasty decisions.

“We both allow and encourage the companies we work with to give their employees the time they need to make a determination when reviewing a piece of content,” it added.

None of the content moderators AFP spoke to imagined the adverse effects such work would have on them.

They say they have not consulted psychologists or psychiatrists because of a lack of money, but they recount symptoms of post-traumatic stress disorder.

Brownie said he is now “afraid of kids because of the child soldiers, the brutality I have seen children doing”.

He is also uncomfortable in crowded places “because of all the suicide videos I’ve seen”.

“I used to be a party freak… I haven’t been to a club for three years now. I can’t, I’m afraid.”

Amin said there have been physical effects too — his weight dropped from 96 kilos (212 pounds) when he started to around 70 kilos today.

The moderators say they have become numb to death or horror. “My heart… became a stone. I don’t feel anything,” said Tigist.

– ‘Needed the money’ –

Meta told AFP it has “clear contracts with each of our partners that detail our expectations in a number of areas, including availability of one-to-one counselling, extra support for those that are exposed to more challenging content”.

“We require all the companies we work with to provide 24/7 on-site support with trained practitioners, an on-call service and access to private healthcare from the first day of employment.”

But the content moderators claim the support offered by Sama through “wellness counsellors” was not up to par, with vague interviews, little follow-up and concerns about the confidentiality of their exchanges.

“The counselling sessions were not helpful at all. I don’t say they were not qualified, but I think they weren’t qualified enough to handle people doing content moderation,” said Amin.

Despite their traumas, those employed by Sama say they stayed on because they needed the money.

Paid 40,000 shillings ($285) a month, with an extra 20,000 shillings for non-Kenyans, the moderators earned more than double the minimum wage.

“From 2019 until today, I haven’t had the chance to get another job anywhere, even though I’ve tried applying a lot. I had no other option but to stay here and work, that’s why I stayed for so long,” said Amin.

– ‘Frontline of defence’ –

Brownie said the moderators turned to “coping mechanisms”; some used drugs such as cannabis, according to those who spoke to AFP.

Once a fan of comedies, Brownie immersed himself in horror films, saying it was a way to blur reality.

“It made me try and imagine that what I was dealing with wasn’t real — although it is real,” he says, adding that he also developed an addiction to watching violent imagery.

“But one of the biggest coping mechanisms was that we are convinced that this job is so important.”

“I felt like I was beating myself up but for the right reasons… that the sacrifice was worth it for the good of the community.

“We are the frontline of defence for Facebook… like the police of social networking,” he says — pointing to work including stopping advertisements for illegal drugs and “removing targets” on people facing death threats or harassment.

“Without us, social networks cannot exist,” he adds. “Nobody is going to open Facebook when it’s just full of graphic content, selling narcotics, blackmail, harassment…”

– ‘We deserve better’ –

“It is damaging and we are sacrificing (ourselves) for our community and for the world… We deserve better treatment,” says Tigist.

None of them said they would sign up for the job again.

“My personal opinion is that no human should be doing this. This job is not for humans,” says Brownie, adding that he wished the task could be done by artificial intelligence.

For its part, Meta said: “Technology has and will continue to play a central role in our content enforcement operations.”

None of these content moderators have so far spoken about their work, even to their families — not only because of the NDAs but also because no one “can understand what we are going through”.

“For example, if people know that I’ve seen pornography, they will judge me,” says Tigist.

She has been vague with her husband about the work.

From her children, she concealed everything: “I don’t want them to know what I was doing. I don’t even want them to imagine what I’ve seen.”

US Judge Blocks Montana’s Effort to Ban TikTok

TikTok has won another reprieve in the U.S., with a district judge blocking Montana’s effort to ban the app for all users in the state.

Back in May, Montana Governor Greg Gianforte signed legislation to ban TikTok outright from operating in the state, in order to protect residents from alleged intelligence gathering by China. There’s no definitive evidence that TikTok has engaged in such intelligence gathering, but Gianforte opted for a full ban, going further than the government device bans issued in other regions.

As explained by Gianforte at the time:

“The Chinese Communist Party using TikTok to spy on Americans, violate their privacy, and collect their personal, private, and sensitive information is well-documented. Today, Montana takes the most decisive action of any state to protect Montanans’ private data and sensitive personal information from being harvested by the Chinese Communist Party.”

In response, a collection of TikTok users challenged the proposed ban, arguing that it violated their First Amendment rights, which led to this latest ruling: District Court Judge Donald Molloy’s decision to stop Montana’s ban effort.

Montana’s TikTok ban had been set to go into effect on Jan. 1, 2024.

In issuing a preliminary injunction to stop Montana from imposing a full ban on the app, Molloy said that Montana’s legislation does indeed violate the Constitution and “oversteps state power.”

Molloy’s judgment is primarily centered on the fact that Montana essentially sought to exercise foreign policy authority, which rests with federal authorities alone, in enacting a TikTok ban. Molloy also noted that there was a “pervasive undertone of anti-Chinese sentiment” within Montana’s proposed legislation.

TikTok has welcomed the ruling, issuing a brief statement in response.

The Montana attorney general’s office, meanwhile, has said that it’s considering next steps to advance the state’s proposed TikTok ban.

The news is a win for TikTok, though the Biden Administration is still weighing a full TikTok ban in the U.S., which may still happen, even though the process has been delayed by legal and legislative challenges.

As I’ve noted previously, my sense here would be that TikTok won’t be banned in the U.S. unless there’s a significant shift in U.S.-China relations, a relationship that is always somewhat tense and volatile.

If the U.S. government has new reason to be concerned, it may well move to ban the app. But doing so would be a significant step, and would prompt further response from the C.C.P.

Which is why I suspect that the U.S. government won’t act, unless it feels that it has to. And right now, there’s no clear impetus to implement a ban, and stop a Chinese-owned company from operating in the region, purely because of its origin.

Which is the real crux of the issue here. A TikTok ban is not just a ban on a social media company; it’s a block on cross-border commerce, because the company is Chinese-owned. That will remain the logic unless clear evidence arises that TikTok has been used as a vector for gathering information on U.S. citizens.

Banning a Chinese-owned app simply because it is Chinese-owned is a statement that goes beyond concerns about a social app, and the U.S. is right to tread carefully in considering how such a move might impact other industries.

So right now, TikTok is not going to be banned in Montana, or anywhere else in the U.S. But that could still change, very quickly.



EU wants to know how Meta tackles child sex abuse


The EU on Friday demanded Instagram-owner Meta provide more information about measures taken by the company to address child sexual abuse online.

The request for information focuses on Meta’s risk assessment and mitigation measures “linked to the protection of minors, including regarding the circulation of self-generated child sexual abuse material (SG-CSAM) on Instagram”, the European Commission said.

Meta must also give information about “Instagram’s recommender system and amplification of potentially harmful content”, it added.

The investigation is the first step in procedures launched under the EU’s Digital Services Act (DSA), but does not itself constitute an indication of legal violations or a move towards punishment.

Meta must respond by December 22.

A report by Stanford University and the Wall Street Journal in June this year said Instagram is the main platform used by paedophile networks to promote and sell content showing child sexual abuse.

Meta at the time said it worked “aggressively” to fight child exploitation.

The commission has already started a series of investigations against large digital platforms seeking information about how they are complying with the DSA.

It sought more information from Meta in October about the spread of disinformation, and sent a further request for information last month about how the company protects children online.

The DSA is part of the European Union’s powerful regulatory armoury to bring big tech to heel, and requires digital giants to take more aggressive action to counter the spread of illegal and harmful content as well as disinformation.

Platforms face fines that can go up to six percent of global turnover for violations.

The North Face Delivered Jacket Via Helicopter After Viral TikTok Complaint


  • Popular apparel brand The North Face recently posted a video of an elaborate marketing stunt on TikTok. 
  • In the video, the brand delivered a rain jacket via helicopter to a woman at the top of a mountain in New Zealand. 
  • The woman had complained in a viral TikTok that her waterproof jacket got soaked in the rain. 

The North Face pulled an elaborate marketing stunt on TikTok and delivered some rain gear via helicopter to a woman in New Zealand, whose complaint about the brand went viral on the platform. 

Jenn Jensen posted a TikTok video on November 17 showing herself soaked on a rainy hiking trail despite wearing a rain jacket bearing The North Face logo. 

“I’ve got a bone to pick with North Face,” Jensen says in the video which has racked up over 11 million views. “I bought this ‘rain jacket’ a couple of days ago and the tag for the advertising said that it’s waterproof. Well listen, I’m 100% sure that it’s raining outside and I’m soaking wet.” 

She added: “Listen… I don’t want a refund. I want you to redesign this rain coat to make it waterproof and express deliver it to the top of Hooker Valley Lake in New Zealand where I will be waiting.” 

She tagged The North Face’s TikTok page in her caption. In one comment a user named @timbrodini wrote: “*Northface has left the conversation.” 

The popular outdoor clothing brand made its own TikTok video in response to @timbrodini’s comment, saying: “We were busy express delivering @Jenn her jacket at the top of mountain.”

In the TikTok video, a North Face employee can be seen grabbing a red jacket from one of the brand’s physical stores and hopping onto a helicopter, which flies him out to New Zealand. He then jumps out at the top of the mountain and runs to throw the jacket to a waiting Jensen. 

She says “thank you” at the end of the video, which has also gone viral and gained 4.1 million views. 

Jensen then made a follow-up video on her page explaining that The North Face’s marketing team had seen her video and wanted to make “amends.” She said they flew her by helicopter to the top of a mountain in New Zealand to give her new rain gear. 

“At this point the ultimate test will be if the new rain gear they gave me at the top of that mountain will hold up to the very high bar that North Face has now set for themselves,” she concluded. 

Some users speculated that her original video was also part of the marketing stunt, but Jensen responded that she “turned down” the opportunity to be paid for the company’s follow-up video. 

“I’m not an influencer, I was just a disappointed customer.” 

The marketing strategy appears to be a new way for brands to connect with customers by showing they care, whilst also providing an entertaining video on social media. 

The North Face seems to be following in the footsteps of the Stanley cup brand, which recently went viral after gifting a woman a new car. The woman’s own car had burnt down, but in a TikTok video she showed that her insulated Stanley cup had survived the fire and that the ice inside hadn’t even melted. 


