Reducing the spread of misinformation on social media: What would a do-over look like?

The news is awash with stories of platforms clamping down on misinformation and the angst involved in banning prominent members. But these are Band-Aids over a deeper issue — namely, that the problem of misinformation is one of our own design. Some of the core elements of how we’ve built social media platforms may inadvertently increase polarization and spread misinformation.
If we could teleport back in time to relaunch social media platforms like Facebook, Twitter and TikTok with the goal of minimizing the spread of misinformation and conspiracy theories from the outset … what would they look like?
This is not an academic exercise. Understanding these root causes can help us develop better prevention measures for current and future platforms.
As one of the Valley’s leading behavioral science firms, we’ve helped brands like Google, Lyft and others understand human decision-making as it relates to product design. We recently collaborated with TikTok to design a new series of prompts (launched this week) to help stop the spread of potential misinformation on its platform.

Image Credits: Irrational Labs
The intervention successfully reduces shares of flagged content by 24%. While TikTok is unique amongst platforms, the lessons we learned there have helped shape ideas on what a social media redux could look like.
Create opt-outs
We can take much bigger swings at reducing the views of unsubstantiated content than labels or prompts.
In the experiment we launched together with TikTok, people saw an average of 1.5 flagged videos over a two-week period. Yet in our qualitative research, many users said they were on TikTok for fun; they didn’t want to see any flagged videos whatsoever. In a recent earnings call, Mark Zuckerberg likewise spoke of Facebook users tiring of hyperpartisan content.
We suggest giving people an “opt-out of flagged content” option — remove this content from their feeds entirely. To make this a true choice, this opt-out needs to be prominent, not buried somewhere users must seek it out. We suggest putting it directly in the sign-up flow for new users and adding an in-app prompt for existing users.
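As an illustration only, the opt-out described above amounts to a simple filter applied when a user’s feed is assembled. The names here (`Post`, `feed_for`) are hypothetical and not any platform’s real API:

```python
# Hypothetical sketch of an "opt out of flagged content" setting.
# A user who opts out never has flagged posts placed in their feed.
from dataclasses import dataclass

@dataclass
class Post:
    id: int
    flagged: bool  # marked as unsubstantiated by moderation review

def feed_for(posts, opted_out: bool):
    """Return the feed; opted-out users see no flagged posts at all."""
    if opted_out:
        return [p for p in posts if not p.flagged]
    return list(posts)

posts = [Post(1, False), Post(2, True), Post(3, False)]
assert [p.id for p in feed_for(posts, opted_out=True)] == [1, 3]
assert [p.id for p in feed_for(posts, opted_out=False)] == [1, 2, 3]
```

The design point is where the choice lives, not the filter itself: surfacing it in the sign-up flow makes it a real decision rather than a buried setting.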
Shift the business model
There’s a reason false news spreads six times faster on social media than real news: Information that’s controversial, dramatic or polarizing is far more likely to grab our attention. And when algorithms are designed to maximize engagement and time spent on an app, this kind of content is heavily favored over more thoughtful, deliberative content.
The ad-based business model is at the core of the problem; it’s why making progress on misinformation and polarization is so hard. One internal Facebook team tasked with looking into the issue found that, “our algorithms exploit the human brain’s attraction to divisiveness.” But the project and the proposed work to address the issues were nixed by senior executives.
Essentially, this is a classic incentives problem. If business metrics that define “success” are no longer dependent on maximizing engagement/time on site, everything will change. Polarizing content will no longer need to be favored and more thoughtful discourse will be able to rise to the surface.
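The incentive problem can be made concrete with a toy ranking function. This is a deliberately simplified sketch, not any platform’s real ranking system, and all scores are invented for illustration:

```python
# Hypothetical sketch: a feed ranker that maximizes predicted engagement
# favors divisive posts; changing the objective changes what surfaces.
def rank(posts, divisiveness_penalty=0.0):
    # score = predicted engagement minus a penalty on divisive content
    return sorted(
        posts,
        key=lambda p: p["engagement"] - divisiveness_penalty * p["divisiveness"],
        reverse=True,
    )

posts = [
    {"id": "thoughtful", "engagement": 0.4, "divisiveness": 0.1},
    {"id": "polarizing", "engagement": 0.9, "divisiveness": 0.8},
]

# Pure engagement ranking puts the polarizing post on top...
assert rank(posts)[0]["id"] == "polarizing"
# ...while penalizing divisiveness lets the thoughtful post rise.
assert rank(posts, divisiveness_penalty=1.0)[0]["id"] == "thoughtful"
```

The point is that no individual post changes between the two rankings; only the definition of “success” does.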
Design for connection
A primary driver of the spread of misinformation is the feeling of being marginalized and alone. Humans are fundamentally social creatures who look to be part of an in-group, and partisan groups frequently provide that sense of acceptance and validation.
We must therefore make it easier for people to find their authentic tribes and communities in other ways (versus those that bond over conspiracy theories).
Mark Zuckerberg says his ultimate goal with Facebook was to connect people. To be fair, in many ways Facebook has done that, at least on a surface level. But we should go deeper. Here are some ways:
We can design for more active one-on-one communication, which has been shown to increase well-being. We can also nudge offline connection. Imagine two friends are chatting on Facebook Messenger or via comments on a post. How about a prompt to meet in person when they live in the same city (post-COVID, of course)? Or if they’re not in the same city, a nudge to hop on a call or video.
In the scenario where they’re not friends and the interaction is more contentious, platforms can play a role in highlighting not only the humanity of the other person, but things one shares in common with the other. Imagine a prompt that showed, as you’re “shouting” online with someone, everything you have in common with that person.
Platforms should also disallow anonymous accounts, or at minimum encourage the use of real names. Clubhouse has good norm-setting on this: In the onboarding flow they say, “We use real names here.” Connection is based on the idea that we’re interacting with a real human. Anonymity obfuscates that.
Finally, help people reset
We should make it easy for people to get out of an algorithmic rabbit hole. YouTube has been under fire for its rabbit holes, but all social media platforms have this challenge. Once you click a video, you’re shown videos like it. This may help sometimes (getting to that perfect “how to” video sometimes requires a search), but for misinformation, this is a death march. One video on flat earth leads to another, as well as other conspiracy theories. We need to help people eject from their algorithmic destiny.
With great power comes great responsibility
More and more people now get their news from social media, and those who do are less likely to be correctly informed about important issues. It’s likely that this trend of relying on social media as an information source will continue.
Social media companies are thus in a unique position of power and have a responsibility to think deeply about the role they play in reducing the spread of misinformation. They should absolutely continue to experiment and run tests with research-informed solutions, as we did together with the TikTok team.
This work isn’t easy. We knew that going in, but we have an even deeper appreciation for this fact after working with the TikTok team. There are many smart, well-intentioned people who want to solve for the greater good. We’re deeply hopeful about our collective opportunity here to think bigger and more creatively about how to reduce misinformation, inspire connection and strengthen our collective humanity all at the same time.
Top CIA agent shared pro-Palestinian images to Facebook after Hamas attack: report

A high-ranking CIA official boldly shared multiple pro-Palestinian images on her Facebook page just two weeks after Hamas launched its bloody surprise attack on Israel — while President Biden was touring the Jewish state to pledge the US’s allegiance to the nation.
The CIA’s associate deputy director for analysis changed her cover photo on Oct. 21 to a shot of a man wearing a Palestinian flag around his neck and waving a larger flag, the Financial Times reported.
The image — taken in 2015 during a surge in the long-running conflict — has been used in various news stories and pieces criticizing Israel’s role in the violence.
The CIA agent also shared a selfie with a superimposed “Free Palestine” sticker, similar to those being plastered on businesses and public spaces across the nation by protesters calling for a cease-fire.
The Financial Times did not name the official after the intelligence agency expressed concern for her safety.
“The officer is a career analyst with extensive background in all aspects of the Middle East and this post [of the Palestinian flag] was not intended to express a position on the conflict,” a person familiar with the situation told the outlet.
The individual added that the sticker image was initially posted years before the most recent crisis between the two nations and emphasized that the CIA official’s Facebook account was also peppered with posts taking a stand against antisemitism.
The latest post of the man waving the flag, however, was shared as Biden shook hands with Israeli leaders on their own soil in a show of support for the Jewish state in its conflict with the terrorist group.
Biden has staunchly voiced support for the US ally since the Oct. 7 surprise attack that killed more than 1,300 people, making the CIA agent’s posts in dissent an unusual move.
In her role, the associate deputy director is one of three people, including the deputy CIA director, responsible for approving all analyses disseminated inside the agency.
She had also previously overseen the production of the President’s Daily Brief, the highly classified compilation of intelligence that is presented to the president most days, the Financial Times said.
“CIA officers are committed to analytic objectivity, which is at the core of what we do as an agency. CIA officers may have personal views, but this does not lessen their — or CIA’s — commitment to unbiased analysis,” the CIA said in a statement to the outlet.
Neither the Office of the Director of National Intelligence nor the White House responded to The Post’s request for comment.
All of the official’s pro-Palestinian images and other, unrelated posts have since been deleted, the outlet reported.
The report comes as CIA Director William Burns arrived in Qatar, where he was due to meet with his Israeli and Egyptian counterparts and the Gulf state’s prime minister to discuss the possibility of extending the pause in fighting between Israeli forces and Hamas terrorists in the Gaza Strip for a second time.
Israel and Hamas agreed Monday to an additional two-day pause in fighting, meaning combat would likely resume Thursday morning Israel time if no additional halt is brokered.
Both sides agreed to release a portion of their hostages under the arrangement.
More than 14,000 Palestinians in Gaza, including many women and children, have been killed in the conflict, according to data from the Hamas-controlled Ministry of Health.
Lee Hsien Yang faces damages for defamation of two Singapore ministers over Ridout Road rentals

SINGAPORE — The High Court in Singapore has directed Lee Hsien Yang to pay damages to ministers K. Shanmugam and Vivian Balakrishnan for defamatory statements made in Facebook comments regarding their rental of black-and-white bungalows on Ridout Road.
The court issued a default judgment favouring the two ministers after Lee – the youngest son of Singapore’s founding prime minister Lee Kuan Yew and brother of current Prime Minister Lee Hsien Loong – failed to address the defamation lawsuits brought against him. Lee had, among other claims, insinuated that the ministers engaged in corrupt practices and received preferential treatment from the Singapore Land Authority for their bungalow rentals.
The exact amount of damages will be evaluated in a subsequent hearing.
Restricted from spreading defamatory claims against ministers
Not only did Justice Goh Yi Han grant the default judgment on 2 November, but he also imposed an injunction to prohibit Lee from further circulating false and defamatory allegations.
In a written judgment released on Monday (27 November), the judge highlighted “strong reasons” to believe that Lee might persist in making defamatory statements, noting his refusal to remove the contentious Facebook post of 23 July despite receiving a letter of demand from the ministers on 27 July.
Among other things, Lee stated in the post that “two ministers have leased state-owned mansions from the agency that one of them controls, felling trees and getting state-sponsored renovations.”
A report released by the Corrupt Practices Investigation Bureau in June concluded that no wrongdoing or preferential treatment had occurred concerning the two ministers. However, Lee continued referencing this post and the ongoing lawsuits, drawing attention to his remarks under legal scrutiny.
Justice Goh emphasised that the ministers met the prerequisites for a default judgment against Lee. The suits, separately filed by Shanmugam, the Law and Home Affairs Minister, and Dr Balakrishnan, the Foreign Affairs Minister, were initiated in early August.


He failed to respond within 21 days
Lee and his wife, Lee Suet Fern, had left Singapore in July 2022, after declining to attend a police interview for potentially giving false evidence in judicial proceedings over the late Lee Kuan Yew’s will.
His absence from Singapore prompted the court to permit Shanmugam and Dr Balakrishnan to serve him legal documents via Facebook Messenger in mid-September. Although the court did not require proof that Lee saw these documents, his subsequent social media post on 16 September confirmed his awareness of the served legal papers.
Although Lee had the opportunity to respond within 21 days, he chose not to do so. Additionally, the judge noted the novelty of the ministers’ request for an injunction during this legal process, highlighting updated court rules allowing such measures since April 2022.
Justice Goh clarified that despite the claimants’ application for an injunction, the court needed independent validation for its appropriateness, considering its potentially severe impact on the defendant. He reiterated being satisfied with the circumstances and granted the injunction, given the continued accessibility of the contentious Facebook post.
Lee acknowledges court order and removes allegations from Facebook
Following the court’s decision, Lee acknowledged the court order on 10 November and removed the statements in question from his Facebook page.
In the judgment, Justice Goh noted that there were substantial grounds to anticipate Lee’s repetition of the “defamatory allegations by continuing to draw attention to them and/or publish further defamatory allegations against the claimants.”
The judge mentioned that if Lee had contested the ministers’ claims, there could have been grounds for a legally enforceable case under defamation law.
According to Justice Goh, a reasonable reader would interpret Lee’s Facebook post as insinuating that the People’s Action Party’s trust had been squandered due to the ministers’ alleged corrupt conduct, from which they gained personally.
While Shanmugam and Dr Balakrishnan were not explicitly named, the post made it evident that it referred to them, and these posts remained accessible to the public, as noted by the judge.
Justice Goh pointed out that by choosing not to respond to the lawsuits, Lee prevented the court from considering any opposing evidence related to the claims.
Tauranga judge orders Team Chopper Facebook pages taken down due to ‘threatening’ online communications

Helen Fraser’s son Ryan Tarawhiti-Brown with Chopper, the dog at the centre of an attack on Tauranga vet Dr Liza Schneider.
The son of the woman whose Rottweiler dog attacked and seriously injured a Tauranga vet has been ordered to disable two Facebook pages that contained threats towards the vet and her business.
Ryan Tarawhiti-Brown (AKA Ryan Brown) ran and promoted a Facebook page called Team Chopper in support of his mother Helen Fraser’s legal battle to save her dog Chopper.
Chopper was euthanised following a court order handed down on August 21 by Judge David Cameron after he convicted Fraser of being the owner of a dog that attacked and seriously injured Holistic Vets co-owner Dr Liza Schneider.
The attack happened in the carpark of her Fraser St practice on October 14, 2022.
Schneider was left with serious injuries after Chopper bit her arm, including a broken bone in her forearm, and deep tissue damage and nerve damage.
She required surgery and her arm took several months to heal.
Following Fraser’s conviction, Schneider sought a takedown order after she told the court she and her practice had been the subject of constant online harassment and threats since October 2021.
Schneider said comments posted on the Team Chopper Facebook page included threats, harassment and derogatory and abusive comments.
In an affidavit, Schneider said her Google account had also been bombarded with fake reviews which she alleged were incited by the Team Chopper page.
Court documents obtained by the Bay of Plenty Times confirm an interim judgment was made by Judge Lance Rowe on August 30 which ordered the page be taken down and any references to Schneider removed. She also asked for a written apology. This order was previously suppressed.
During a second court hearing on October 25, Tarawhiti-Brown’s lawyer Bev Edwards told Judge Cameron it was accepted her client had not complied with this order to take down the page.
Edwards said her client had instead changed the nature of the page to help promote the rights of cats and dogs, and no criticism or abuse of Schneider or Holistic Vets was made by her client in those posts.
Tarawhiti-Brown had filed an affidavit to similar effect, court documents show.
Schneider argued the change in tone had not prevented others from posting derogatory comments about her.
This included posts on September 23, which stated she should be “prosecuted for negligence”, “sucked” at her job and should lose her licence.
Edwards also submitted that Schneider was prepared to use social media to her own advantage when it suited her, and cited an online article published in June.
In Judge Cameron’s written judgement, dated November 13, Tarawhiti-Brown, who lives in Australia, was ordered to immediately disable or take down his two Facebook pages.
The judge ruled the digital communications on the Facebook pages had been “threatening” to Schneider and “amount to harassment of her”, and also caused her “ongoing psychological harm”.
Judge Cameron also ordered Tarawhiti-Brown to refrain from making any digital communications about Schneider or identifying her or her business directly or indirectly, and not to encourage any other person to do so.
The judge said it was accepted by Schneider that removal orders against Facebook/Meta were “fraught with difficulties”, including jurisdictional ones, and discontinued the takedown application against those organisations.
The judge did not order Tarawhiti-Brown to apologise to Schneider and lifted the suppression orders by consent of both parties, who had to pay their own legal costs.
Schneider and the NZ Veterinary Association, which has been supporting her, declined to comment on these court orders.
Tarawhiti-Brown was also approached for comment.
Sandra Conchie is a senior journalist at the Bay of Plenty Times and Rotorua Daily Post who has been a journalist for 24 years. She mainly covers police, court and other justice stories, as well as general news. She has been a Canon Media Awards regional/community reporter of the year.