‘No job for humans’: the harrowing work of content moderators in Kenya

Image: © AFP / Anatolii Stepanov

Simon VALMARY

Trevin Brownie’s first day as a content moderator for Facebook is etched in his memory, working out of a subcontractor’s nondescript office in the Kenyan capital Nairobi.

“My first video, it was a man committing suicide… there was a two- or three-year-old kid playing next to him. After the guy hanged himself, after about two minutes, the child notices something is wrong,” said the 30-year-old South African, recalling the youngster’s heart-wrenching response.

“It made me sick… But I kept on working.”

For three years he watched hundreds of violent, hateful videos every day and removed them from Facebook.

Brownie and more than 180 of his former colleagues are now suing Meta, Facebook’s parent company, for the harm they suffered in the first major class action over content moderation since 2018.

He worked in Nairobi for Sama, a Californian company subcontracted by Meta to moderate Facebook content for sub-Saharan Africa between 2019 and 2023.

Sama has since announced it will be closing its content moderation hub in Nairobi, which employed people from a number of African countries recruited in particular for their knowledge of local languages.

Brownie said he watched all manner of horrors — “more than 100 beheadings”, “organs being ripped out of people”, “rapes and child pornography”, “child soldiers being prepared for war”.

“Humans do things to humans that I would never have even imagined. People have no idea of the sick videos that are posted, what they are escaping.”

– Legal battles –

Today, Brownie is involved in one of three cases against Meta in Kenya related to content moderation.

He and another 183 sacked Sama employees are contesting their “unlawful” dismissal and seeking compensation, saying their salaries failed to account for the risks they were exposed to and the damage to their mental health.

Up to 260 moderators are losing their jobs as a result of the Sama closure in Nairobi, according to the petition.

The legal offensive began with a lawsuit filed in May 2022 in a Nairobi court by a former content moderator, Daniel Motaung, complaining about poor working conditions, deceptive hiring methods, insufficient pay and a lack of mental health support.

Meta said it did not want to comment on the details of the cases but told AFP it required its subcontractors to make psychological support available 24/7.

Asked by AFP to respond to the claims, Sama said it was “not able to comment” on ongoing cases.

– ‘Downplayed the content’ –

Testimonies collected by AFP in April from several former Sama content moderators — who are among the plaintiffs in the dismissal case — support Motaung’s claims.

Two of them hired in 2019 by Sama, then called Samasource, said they had responded to offers to work in call centres passed on from acquaintances or recruitment centres.

They say they didn’t find out until they signed their contracts — which included confidentiality clauses — that they were going to work as content moderators.

Despite this, Amin and Tigist (whose names have been changed) did not question their new roles, or consider quitting.

“I had no idea of what a content moderator is, I had never heard about it,” said Tigist, an Ethiopian recruited for her knowledge of the Amharic language.

“Most of us had no knowledge of the difference between a call centre and a content moderation centre,” confirmed Amin, who worked in the Somali “market”.

But the next batch of recruits, he said, received offer letters clearly specifying it was a content moderation job.

On their first day of training, even before they were shown the images to be reviewed, the moderators were reminded they had signed non-disclosure agreements (NDAs).

“During the training, they downplayed the content, what we were going to see… What they showed us in training was nothing compared to what we were going to see,” said Amin.

Once they began work “the problems started”.

– ‘My heart became a stone’ –

Glued to their screens for eight hours a day, the moderators scrolled through hundreds of posts, each more shocking than the last.

“We don’t choose what to see, it just comes in randomly: suicide videos, graphic violence, child sexual exploitation, nudity, violent incitement… They flood into the system,” said Amin.

The moderators AFP spoke to claimed an “average handling time” of 55 to 65 seconds per video was imposed on them, or between 387 and 458 “tickets” viewed per day.

If they were too slow, they risked a warning, or even termination, they said.

Meta said in an email to AFP that content reviewers “are not required to evaluate any set number of posts, do not have quotas and aren’t pressured to make hasty decisions.

“We both allow and encourage the companies we work with to give their employees the time they need to make a determination when reviewing a piece of content,” it added.

None of the content moderators AFP spoke to imagined the adverse effects such work would have on them.

They say they have not consulted psychologists or psychiatrists, because of a lack of money, but recount symptoms of post-traumatic stress disorder.

Brownie said he is now “afraid of kids because of the child soldiers, the brutality I have seen children doing”.

He is also uncomfortable in crowded places “because of all the suicide videos I’ve seen”.

“I used to be a party freak… I haven’t been to a club for three years now. I can’t, I’m afraid.”

Amin said there have been physical effects too — his weight dropped from 96 kilos (212 pounds) when he started to around 70 kilos today.

The moderators say they have become numb to death or horror. “My heart… became a stone. I don’t feel anything,” said Tigist.

– ‘Needed the money’ –

Meta told AFP it has “clear contracts with each of our partners that detail our expectations in a number of areas, including availability of one-to-one counselling, extra support for those that are exposed to more challenging content”.

“We require all the companies we work with to provide 24/7 on-site support with trained practitioners, an on-call service and access to private healthcare from the first day of employment.”

But the content moderators claim the support offered by Sama through “wellness counsellors” was not up to par, with vague interviews, little follow-up and concerns about the confidentiality of their exchanges.

“The counselling sessions were not helpful at all. I don’t say they were not qualified, but I think they weren’t qualified enough to handle people doing content moderation,” said Amin.

Despite their traumas, those employed by Sama say they stayed on because they needed the money.

Their salary of 40,000 shillings ($285) a month, plus a further 20,000 shillings for non-Kenyans, is more than double Kenya’s minimum wage.

“From 2019 until today, I haven’t had the chance to get another job anywhere, even though I’ve tried applying a lot. I had no other option but to stay here and work, that’s why I stayed for so long,” said Amin.

– ‘Frontline of defence’ –

Brownie said the moderators turned to “coping mechanisms”; some used drugs such as cannabis, according to those who spoke to AFP.

Once a fan of comedies, Brownie immersed himself in horror films, saying it was a way to blur reality.

“It made me try and imagine that what I was dealing with wasn’t real — although it is real,” he says, adding that he also developed an addiction to watching violent imagery.

“But one of the biggest coping mechanisms was that we are convinced that this job is so important.”

“I felt like I was beating myself up but for the right reasons… that the sacrifice was worth it for the good of the community.

“We are the frontline of defence for Facebook… like the police of social networking,” he says — pointing to work including stopping advertisements for illegal drugs and “removing targets” on people facing death threats or harassment.

“Without us, social networks cannot exist,” he adds. “Nobody is going to open Facebook when it’s just full of graphic content, selling narcotics, blackmail, harassment…”

– ‘We deserve better’ –

“It is damaging and we are sacrificing (ourselves) for our community and for the world… We deserve better treatment,” says Tigist.

None of them said they would sign up for the job again.

“My personal opinion is that no human should be doing this. This job is not for humans,” says Brownie, adding that he wished the task could be done by artificial intelligence.

For its part, Meta said: “Technology has and will continue to play a central role in our content enforcement operations.”

None of these content moderators have so far spoken about their work, even to their families — not only because of the NDAs but also because no one “can understand what we are going through”.

“For example, if people know that I’ve seen pornography, they will judge me,” says Tigist.

She has been vague with her husband about the work.

From her children, she concealed everything: “I don’t want them to know what I was doing. I don’t even want them to imagine what I’ve seen.”

Snapchat Explores New Messaging Retention Feature: A Game-Changer or Risky Move?

In a recent announcement, Snapchat revealed an update that challenges its traditional design ethos. The platform is experimenting with an option that allows users to override the 24-hour auto-delete rule, a feature synonymous with Snapchat’s ephemeral messaging model.

The proposed change aims to introduce a “Never delete” option in messaging retention settings, aligning Snapchat more closely with conventional messaging apps. While this move may blur Snapchat’s distinctive selling point, Snap appears convinced of its necessity.

According to Snap, the decision stems from user feedback and a commitment to innovation based on user needs. The company aims to provide greater flexibility and control over conversations, catering to the preferences of its community.

Currently undergoing trials in select markets, the new feature empowers users to adjust retention settings on a conversation-by-conversation basis. Flexibility remains paramount, with participants able to modify settings within chats and receive in-chat notifications to ensure transparency.

Snapchat underscores that the default auto-delete feature will persist, reinforcing its design philosophy centered on ephemerality. However, with the app gaining traction as a primary messaging platform, the option offers users a means to preserve longer chat histories.

The update marks a pivotal moment for Snapchat, renowned for its disappearing message premise, especially popular among younger demographics. Retaining this focus has been pivotal to Snapchat’s identity, but the shift suggests a broader strategy aimed at diversifying its user base.

This strategy may appeal particularly to older demographics, potentially extending Snapchat’s relevance as users age. By emulating features of conventional messaging platforms, Snapchat seeks to enhance its appeal and broaden its reach.

Yet, the introduction of message retention poses questions about Snapchat’s uniqueness. While addressing user demands, the risk of diluting Snapchat’s distinctiveness looms large.

As Snapchat ventures into uncharted territory, the outcome of this experiment remains uncertain. Will message retention propel Snapchat to new heights, or will it compromise the platform’s uniqueness?

Only time will tell.

Catering to specific audience boosts your business, says accountant turned coach

While it is tempting to try to appeal to a broad audience, the founder of alcohol-free coaching service Just the Tonic, Sandra Parker, believes the best thing you can do for your business is focus on your niche. Here’s how she did just that.

When running a business, reaching out to as many clients as possible can be tempting. But it also risks making your marketing “too generic,” warns Sandra Parker, the founder of Just The Tonic Coaching.

“From the very start of my business, I knew exactly who I could help and who I couldn’t,” Parker told My Biggest Lessons.

Parker struggled with alcohol dependence as a young professional. Today, her business targets high-achieving individuals who face challenges similar to those she had early in her career.

“I understand their frustrations, I understand their fears, and I understand their coping mechanisms and the stories they’re telling themselves,” Parker said. “Because of that, I’m able to market very effectively, to speak in a language that they understand, and am able to reach them.” 

“I believe that it’s really important that you know exactly who your customer or your client is, and you target them, and you resist the temptation to make your marketing too generic to try and reach everyone,” she explained.

“If you speak specifically to your target clients, you will reach them, and I believe that’s the way that you’re going to be more successful.”

Watch the video for more of Sandra Parker’s biggest lessons.

Instagram Tests Live-Stream Games to Enhance Engagement

Instagram’s testing out some new options to help spice up your live-streams in the app, with some live broadcasters now able to select a game that they can play with viewers in-stream.

In example screens posted by Ahmed Ghanem, some creators now have the option to play either “This or That”, a question-and-answer prompt that you can share with your viewers, or “Trivia”, to generate more engagement within your IG live-streams.

That could be a simple way to spark more conversation and interaction, which could then lead into further engagement opportunities from your live audience.

Meta’s been exploring more ways to make live-streaming a bigger consideration for IG creators, with a view to live-streams potentially catching on with more users.

That includes the gradual expansion of its “Stars” live-stream donation program, giving more creators in more regions a means to accept donations from live-stream viewers, while back in December, Instagram also added some new options to make it easier to go live using third-party tools via desktop PCs.

Live streaming has become a major force in China, where shopping live-streams in particular have created massive opportunities for streaming platforms. The format hasn’t caught on in the same way in Western regions, but as TikTok and YouTube push live-stream adoption, there is still a chance it becomes a much bigger element in future.

That’s why IG is also trying to keep pace, adding more ways for its creators to engage via streams. Live-stream games are another element of this, which could make streaming a better community-building, and potentially sales-driving, option.

We’ve asked Instagram for more information on this test, and we’ll update this post if/when we hear back.
