Facebook partially documents its content recommendation system


Algorithmic recommendation systems on social media sites like YouTube, Facebook and Twitter have shouldered much of the blame for the spread of misinformation, propaganda, hate speech, conspiracy theories and other harmful content. Facebook, in particular, has come under fire in recent days for allowing QAnon conspiracy groups to thrive on its platform and for helping militia groups scale up their membership. Today, Facebook is attempting to combat claims that its recommendation systems are in any way at fault for how people are exposed to troubling, objectionable, dangerous, misleading and untruthful content.
The company has, for the first time, made public how its content recommendation guidelines work.
In new documentation available in Facebook’s Help Center and Instagram’s Help Center, the company details how Facebook and Instagram’s algorithms work to filter out content, accounts, Pages, Groups and Events from its recommendations.
Currently, Facebook’s Suggestions may appear as Pages You May Like, “Suggested For You” posts in News Feed, People You May Know, or Groups You Should Join. Instagram’s suggestions are found within Instagram Explore, Accounts You May Like and IGTV Discover.
The company says Facebook’s existing guidelines have been in place since 2016 under a strategy it references as “remove, reduce, and inform.” This strategy focuses on removing content that violates Facebook’s Community Standards, reducing the spread of problematic content that does not violate its standards, and informing people with additional information so they can choose what to click, read or share, Facebook explains.
The Recommendation Guidelines typically fall under Facebook’s efforts in the “reduce” area, and are designed to maintain a higher standard than Facebook’s Community Standards, because they push users to follow new accounts, groups, Pages and the like.
Facebook, in the new documentation, details five key categories that are not eligible for recommendations. Instagram’s guidelines are similar. However, the documentation offers no deep insight into how Facebook actually chooses what to recommend to a given user. That’s a key piece to understanding recommendation technology, and one Facebook intentionally left out.
One obvious category of content that may not be eligible for recommendation includes content that would impede Facebook’s “ability to foster a safe community,” such as content focused on self-harm, suicide, eating disorders or violence; sexually explicit content; regulated goods like tobacco or drugs; and content shared by non-recommendable accounts or entities.
Facebook also claims not to recommend sensitive or low-quality content, content users frequently say they dislike and content associated with low-quality publishing. These further categories include things like clickbait, deceptive business models, payday loans, products making exaggerated health claims or offering “miracle cures,” content promoting cosmetic procedures, contests, giveaways, engagement bait, unoriginal content stolen from another source, content from websites that get a disproportionate number of clicks from Facebook versus other places on the web, and news that doesn’t include transparent information about its authorship or staff.
In addition, Facebook says it won’t recommend fake or misleading content, such as claims found false by independent fact-checkers, vaccine-related misinformation and content promoting the use of fraudulent documents.
It says it will also “try” not to recommend accounts or entities that recently violated Community Standards, shared content Facebook tries not to recommend, posted vaccine-related misinformation, engaged in purchasing “Likes,” have been banned from running ads, posted false information or are associated with movements tied to violence.
The latter claim, of course, follows recent news that a Kenosha militia Facebook Event remained on the platform despite being flagged 455 times after its creation, and had been cleared by four moderators as non-violating content. The associated Page had issued a “call to arms” and hosted comments from people asking what types of weapons to bring. Ultimately, two people were killed and a third was injured at protests in Kenosha, Wisconsin, when a 17-year-old armed with an AR-15-style rifle broke curfew, crossed state lines and shot at protesters.
Given Facebook’s track record, it’s worth asking how well the company can abide by its own stated guidelines. Plenty of people have found their way to what should be ineligible content, like conspiracy theories, dangerous health content and COVID-19 misinformation, by clicking through on suggestions at times when the guidelines failed. QAnon reportedly grew through Facebook recommendations.
It’s also worth noting that there are many gray areas guidelines like these fail to cover.
Militia groups and conspiracy theories are only a couple of examples. Amid the pandemic, U.S. users who disagreed with government guidelines on business closures could easily find themselves pointed toward various “reopen” groups where members don’t just discuss politics, but openly brag about not wearing masks in public, even when required to do so at their workplace. They offer tips on how to get away with not wearing masks, and celebrate their successes with selfies. These groups may not technically break rules by their description alone, but they encourage behavior that constitutes a threat to public health.
Meanwhile, even if Facebook doesn’t directly recommend a group, a quick search for a topic will direct you to what would otherwise be ineligible content within Facebook’s recommendation system.
For instance, a quick search for the word “vaccines” currently suggests a number of groups focused on vaccine injuries, alternative cures and general anti-vax content. These even outnumber pro-vaccine groups. At a time when the world’s scientists are trying to develop protection against the novel coronavirus in the form of a vaccine, allowing anti-vaxxers a massive public forum to spread their ideas is just one example of how Facebook is enabling the spread of ideas that may ultimately become a global public health threat.
The more complicated question, however, is where does Facebook draw the line in terms of policing users having these discussions versus favoring an environment that supports free speech? With few government regulations in place, Facebook ultimately gets to make this decision for itself.
Recommendations are only a part of Facebook’s overall engagement system, and one that’s often blamed for directing users to harmful content. But much of the harmful content users encounter comes from the groups and Pages that show up at the top of Facebook search results when users turn to Facebook for general information on a topic. Facebook’s search engine favors engagement and activity — like how many members a group has or how often users post — not how closely its content aligns with accepted truths or medical guidelines.
Facebook’s search algorithms have not been documented in similar detail.
Lee Hsien Yang ordered to pay damages for defaming two Singapore ministers over Ridout Road rentals

SINGAPORE — The High Court in Singapore has directed Lee Hsien Yang to pay damages to ministers K. Shanmugam and Vivian Balakrishnan for defamatory statements made in Facebook comments regarding their rental of black-and-white bungalows on Ridout Road.
The court issued a default judgment favouring the two ministers after Lee – the youngest son of Singapore’s founding prime minister Lee Kuan Yew and brother of current Prime Minister Lee Hsien Loong – failed to address the defamation lawsuits brought against him. Lee had, among other claims, insinuated that the ministers engaged in corrupt practices and received preferential treatment from the Singapore Land Authority for their bungalow rentals.
The exact amount of damages will be evaluated in a subsequent hearing.
Restricted from spreading defamatory claims against ministers
Not only did Justice Goh Yi Han grant the default judgment on 2 November, but he also imposed an injunction to prohibit Lee from further circulating false and defamatory allegations.
In a released written judgment on Monday (27 November), the judge highlighted “strong reasons” to believe that Lee might persist in making defamatory statements again, noting his refusal to remove the contentious Facebook post on 23 July, despite receiving a letter of demand from the ministers on 27 July.
Among other things, Lee stated in the post that “two ministers have leased state-owned mansions from the agency that one of them controls, felling trees and getting state-sponsored renovations.”
A report released by the Corrupt Practices Investigation Bureau in June concluded that no wrongdoing or preferential treatment had occurred concerning the two ministers. However, Lee continued referencing this post and the ongoing lawsuits, drawing attention to his remarks under legal scrutiny.
Justice Goh emphasised that the ministers met the prerequisites for a default judgment against Lee. The suits, separately filed by Shanmugam, the Law and Home Affairs Minister, and Dr Balakrishnan, the Foreign Affairs Minister, were initiated in early August.


He failed to respond within 21 days
Lee and his wife, Lee Suet Fern, had left Singapore in July 2022, after declining to attend a police interview for potentially giving false evidence in judicial proceedings over the late Lee Kuan Yew’s will.
His absence from Singapore prompted the court to permit Shanmugam and Dr Balakrishnan to serve him legal documents via Facebook Messenger in mid-September. Although there was no requirement to prove that Lee saw these documents, his social media post on 16 September confirmed he was aware of the served legal papers.
Although Lee had the opportunity to respond within 21 days, he chose not to do so. Additionally, the judge noted the novelty of the ministers’ request for an injunction during this legal process, highlighting updated court rules allowing such measures since April 2022.
Justice Goh clarified that despite the claimants’ application for an injunction, the court needed independent validation for its appropriateness, considering its potentially severe impact on the defendant. He reiterated being satisfied with the circumstances and granted the injunction, given the continued accessibility of the contentious Facebook post.
Lee acknowledges court order and removes allegations from Facebook
Following the court’s decision, Lee acknowledged the court order on 10 November and removed the statements in question from his Facebook page.
In the judgment, Justice Goh noted that there were substantial grounds to anticipate Lee’s repetition of the “defamatory allegations by continuing to draw attention to them and/or publish further defamatory allegations against the claimants.”
The judge mentioned that if Lee had contested the ministers’ claims, there could have been grounds for a legally enforceable case under defamation law.
According to Justice Goh, a reasonable reader would interpret Lee’s Facebook post as insinuating that the People’s Action Party’s trust had been squandered due to the ministers’ alleged corrupt conduct, from which they gained personally.
While Shanmugam and Dr Balakrishnan were not explicitly named, the post made it evident that it referred to them, and these posts remained accessible to the public, as noted by the judge.
Justice Goh pointed out that by choosing not to respond to the lawsuits, Lee prevented the court from considering any opposing evidence related to the claims.
Tauranga judge orders Team Chopper Facebook pages taken down due to ‘threatening’ online communications

Helen Fraser’s son Ryan Tarawhiti-Brown with Chopper, the dog at the centre of an attack on Tauranga vet Dr Liza Schneider.
The son of the woman whose Rottweiler dog attacked and seriously injured a Tauranga vet has been ordered to disable two Facebook pages that contained threats towards the vet and her business.
Ryan Tarawhiti-Brown (AKA Ryan Brown) ran and promoted a Facebook page called Team Chopper in support of his mother Helen Fraser’s legal battle to save her dog Chopper.
Chopper was euthanised following a court order handed down on August 21 by Judge David Cameron after he convicted Fraser of being the owner of a dog that attacked and seriously injured Holistic Vets co-owner Dr Liza Schneider.
The attack happened in the carpark of her Fraser St practice on October 14, 2022.
Schneider was left with serious injuries after Chopper bit her arm, including a broken bone in her forearm, and deep tissue damage and nerve damage.
She required surgery and her arm took several months to heal.
Following Fraser’s conviction, Schneider sought a takedown order after she told the court she and her practice had been the subject of constant online harassment and threats since October 2021.
Schneider said comments posted on the Team Chopper Facebook page included threats, harassment and derogatory and abusive comments.
In an affidavit, Schneider said her Google account had also been bombarded with fake reviews which she alleged were incited by the Team Chopper page.
Court documents obtained by the Bay of Plenty Times confirm an interim judgment was made by Judge Lance Rowe on August 30 which ordered the page be taken down and any references to Schneider removed. She also asked for a written apology. This order was previously suppressed.
During a second court hearing on October 25, Tarawhiti-Brown’s lawyer Bev Edwards told Judge Cameron it was accepted her client had not complied with this order to take down the page.
Edwards said her client had instead changed the nature of the page to help promote the rights of cats and dogs, and no criticism or abuse of Schneider or Holistic Vets was made by her client in those posts.
Tarawhiti-Brown had filed an affidavit to similar effect, court documents show.
Schneider argued the change in tone had not prevented others from posting derogatory comments about her.
This included posts on September 23, which stated she should be “prosecuted for negligence”, “sucked” at her job and should lose her licence.
Edwards also submitted that Schneider was prepared to use social media to her own advantage when it suited her, and cited an online article published in June.
In Judge Cameron’s written judgment, dated November 13, Tarawhiti-Brown, who lives in Australia, was ordered to immediately disable or take down his two Facebook pages.
The judge ruled the digital communications on the Facebook pages had been “threatening” to Schneider and “amount to harassment of her”, and also caused her “ongoing psychological harm”.
Judge Cameron also ordered Tarawhiti-Brown to refrain from making any digital communications about Schneider or identifying her or her business directly or indirectly, and not to encourage any other person to do so.
The judge said it was accepted by Schneider that removal orders against Facebook/Meta were “fraught with difficulties”, including jurisdictional ones, and he discontinued the takedown application against those organisations.
The judge did not order Tarawhiti-Brown to apologise to Schneider and lifted the suppression orders by consent of both parties, who had to pay their own legal costs.
Schneider and the NZ Veterinary Association, which has been supporting her, declined to comment on these court orders.
Tarawhiti-Brown was also approached for comment.
Sandra Conchie is a senior journalist at the Bay of Plenty Times and Rotorua Daily Post who has been a journalist for 24 years. She mainly covers police, court and other justice stories, as well as general news. She has been a Canon Media Awards regional/community reporter of the year.
Facebook group helps creative director uncover mystery behind photo of father

A creative director who “didn’t really have any memories” of his father, who died when he was just two, was able to “build a persona” of him thanks to an old photo and a local Facebook group.
Lee Williamson’s father, Ian, died at the age of 23 in a car crash and, without many ways of finding out more about him due to personal reasons, he uploaded one of the few photos of the pair together, from 1983, to a High Wycombe Facebook group in the hope someone could help.
“I wasn’t quite sure where it was taken, but I just knew it was from High Wycombe somewhere, so one of my friends – who lives in High Wycombe – said to leave it with a couple of his friends but they weren’t sure,” the 42-year-old, who now lives in Lanesborough, Co Longford, Ireland, told the PA news agency.
“So, I posted it in a Facebook group and lots of people started interacting with it.
“I had no idea that was going to happen.”
Liz Parry, 62, who used to babysit Mr Williamson, happened across the post and from there, a phone call was set up between the pair on Friday.
Ms Parry, who now lives in Iver, Buckinghamshire, and is retired, told PA that as soon as she saw the picture, she recognised both Mr Williamson and his father as she lived next door to them on Hylton Road, High Wycombe, for about two to three years.
“I used to help the family out by babysitting them and we’d spend time together at games evenings or would have drinks together,” she said.
She said Mr Williamson’s late father was always on the lookout for a “good deal to give the family a good home”.
She added when Mr Williamson was a baby, he was “lovely”.
“He was always happy and laughing and smiling and wanting to play,” she added.
“He was a really happy little baby.”
She said the conversation with him on Friday was a “lovely” way to catch up after so many years.
“That little baby that I used to look after is now in his 40s and has his own children,” she said.
“It’s nice to see how well he is doing now as well because I wondered what happened to the family after they left (the area) after Ian’s passing.”
Mr Williamson said: “The chat on the phone was nice.
“I found out he was a bit of a Del Boy character, he was always looking for a way to make a business and was only 23 when he died.”
“I didn’t really know that he was my father until I was 10 – when I stumbled upon a bunch of documents in the attic while playing with a Scalextric car toy – and so I didn’t really have any memory of him to be able to build a persona of who my father used to be.
“I have two of my own kids now, so can talk to them about stories about their grandad – it just gives you a sense of closure.”
Mr Williamson said it was “nice” to see Facebook lead to something “positive”.
“Everyone who replied was very encouraging and it showed that High Wycombe is a very nice place to be and the people that lived there had fond memories.”