Leaked memo excoriates Facebook’s ‘slapdash and haphazard’ response to global political manipulation


A former Facebook data scientist, Sophie Zhang, dropped a detailed, damning memo on her last day at the company, calling the social network out for what she describes as an arbitrary, slow, and generally inadequate response to fake accounts and activity affecting politics worldwide.
BuzzFeed News obtained the full memo and has published excerpts in this report, which is well worth reading in its entirety.
Zhang was reportedly fired earlier in September for, as she describes it, ongoing disagreement with management about the company’s priorities and response to widespread manipulation.
In the 6,600-word memo, Zhang describes a system whose focus is very much on ordinary spam, which is of course a major problem for the platform, while “coordinated inauthentic behavior” (CIB) attempting to influence elections is given far less priority and fewer resources. The exception is when action becomes politically expedient, for example when a botnet needs to be rolled up ahead of congressional testimony or under pressure from the press.
As the memo reported by BuzzFeed News reads:
It’s an open secret within the civic integrity space that Facebook’s short-term decisions are largely motivated by PR and the potential for negative attention… It’s why I’ve seen priorities of escalations shoot up when others start threatening to go to the press, and why I was informed by a leader in my organization that my civic work was not impactful under the rationale that if the problems were meaningful they would have attracted attention, became a press fire, and convinced the company to devote more attention to the space.
Overall, the focus of my organization – and most of Facebook – was on large-scale problems, an approach which fixated us on spam. The civic aspect was discounted because of its small volume, its disproportionate impact ignored.
I’ve asked Facebook for comment on the memo, and specifically on the following claims reportedly made by Zhang:
- Large-scale political manipulation returned in Honduras weeks after Facebook made attempts to stop it
- Her report of coordinated manipulation campaigns in Azerbaijan went uninvestigated for a year
- More than 10 million fake reactions and accounts were removed during the 2018 elections in the U.S. and Brazil, and never disclosed
- A major political influence campaign in Delhi, India, this February was never reported
- Some 672,000 accounts were removed this spring from COVID-related misinformation campaigns in Spain and the U.S., also never disclosed
- Whether to pursue a misinformation campaign at all is often left to mid-level employees like Zhang, who claimed she had “no oversight whatsoever”
- Zhang’s push to dedicate more resources to civic platform problems led to her dismissal
Guy Rosen, Facebook’s VP of Integrity, attempted to play down the memo in a tweet, saying that Zhang was describing “fake likes”: “Like any team in the industry or government, we prioritize stopping the most urgent and harmful threats globally. Fake likes is not one of them,” he wrote. Certainly some of what Zhang describes is fake engagement, but far from all of it, and at any rate Facebook’s judgment in assigning priority is part of what the memo takes issue with.
The memo states outright what many have long suspected: that Facebook “projects an image of strength and competence… but the reality is that many of our actions are slapdash and haphazard accidents.” Not only that, but the picture of Facebook’s efforts to combat this sort of thing is highly tailored by the company itself and is not, it seems, in any way complete or accurate.
This post will be updated if we receive any substantial comment from Facebook. (It has been updated with Rosen’s tweet.)
Kenya labor court rules that Facebook can be sued

NAIROBI, Kenya (AP) — A judge in Kenya has ruled that Facebook’s parent company, Meta, can be sued in the East African country.
Meta tried to have the case dropped, arguing that Kenyan courts do not have jurisdiction over its operations, but the labor court judge dismissed that argument in a ruling on Monday.
A former Facebook moderator in Kenya, Daniel Motaung, is suing the company over what he describes as poor working conditions.
Motaung said that while working as a moderator he was exposed to gruesome content, such as rape, torture and beheadings, that put his and his colleagues’ mental health at risk.
He said Meta did not offer mental health support to employees, required unreasonably long working hours, and offered minimal pay. Motaung worked in Facebook’s African hub in Kenya’s capital, Nairobi, which is operated by Samasource Ltd.
Following the judge’s decision that Meta can be sued in Kenya, the next step in the case will be considered by the court on March 8.
Meta is facing a separate court case in which two Ethiopians say hate speech was allowed and even promoted on Facebook amid heated rhetoric over their country’s deadly Tigray conflict.
That lawsuit alleges that Meta hasn’t hired enough content moderators to adequately monitor posts, that it uses an algorithm that prioritizes hateful content, and that it responds more slowly to crises in Africa than elsewhere in the world.
The Associated Press and more than a dozen other media outlets last year reported that Facebook had failed to quickly and effectively moderate hate speech in several places around the world, including in Ethiopia. The reports were based on internal Facebook documents leaked by former employee and whistleblower Frances Haugen.
Mayor Woodards to Present 2023 State of the City Address

This year’s theme is “Building Tomorrow Together.”