For Trump and Facebook, judgment day is around the corner

Facebook unceremoniously confiscated Trump’s biggest social media megaphone months ago, but the former president might be poised to snatch it back.

Facebook’s Oversight Board, an external, Supreme Court-like policy decision-making group, will either restore Trump’s Facebook privileges or banish him forever on Wednesday. Whatever happens, it’s a huge moment for Facebook’s nascent experiment in outsourcing hard content moderation calls to an elite group of global thinkers, academics and political figures, allowing them to set precedents that could shape the world’s biggest social networks for years to come.

Facebook CEO Mark Zuckerberg announced Trump’s suspension from Facebook in the immediate aftermath of the Capitol attack. It was initially a temporary suspension, but two weeks later Facebook said the decision would be sent to the Oversight Board. “We believe the risks of allowing the President to continue to use our service during this period are simply too great,” Zuckerberg wrote in January.

Facebook’s VP of Global Affairs Nick Clegg, a former British politician, expressed hope that the board would back the company’s own conclusions, calling Trump’s suspension an “unprecedented set of events which called for unprecedented action.”

Trump inflamed tensions and incited violence on January 6, but that incident wasn’t without precedent. In the aftermath of the murder of George Floyd, an unarmed Black man killed by Minneapolis police, President Trump ominously declared on social media “when the looting starts, the shooting starts,” a threat of imminent violence with racist roots that Facebook declined to take action against, prompting internal protests at the company.

The former president skirted or crossed the line with Facebook any number of times over his four years in office, but the platform stood steadfastly behind a maxim that all speech was good speech, even as other social networks grew more squeamish.

In a dramatic address in late 2019, Zuckerberg evoked Martin Luther King Jr. as he defended Facebook’s anything-goes approach. “In times of social turmoil, our impulse is often to pull back on free expression,” Zuckerberg said. “We want the progress that comes from free expression, but not the tension.” King’s daughter strenuously objected.

A little over a year later, with all of Facebook’s peers doing the same and Trump leaving office, Zuckerberg would shrink back from his grand free speech declarations.

In 2019 and well into 2020, Facebook was still a roiling hotbed of misinformation, conspiracies and extremism. The social network hosted thousands of armed militias organizing for violence and a sea of content amplifying QAnon, which moved from a fringe belief to a mainstream political phenomenon through Facebook.

Those same forces would converge at the U.S. Capitol on January 6 for a day of violence that Facebook executives characterized as spontaneous, even though it had been festering openly on the platform for months.

How the Oversight Board works

Facebook’s Oversight Board began reviewing its first cases last October. Facebook can refer cases to the board, as it did with Trump, but users can also appeal to the board to overturn policy decisions that affect them after they exhaust the normal Facebook or Instagram appeals process. A five-member panel drawn from its 20 total members evaluates whether content should be allowed to remain on the platform and reaches a decision, which the full board must approve by a majority vote. Initially, the Oversight Board was only empowered to reinstate content removed from Facebook and Instagram, but in mid-April it began accepting requests to review controversial content that stayed up.

Last month, the Oversight Board replaced departing member Pamela Karlan, a Stanford professor and voting rights scholar critical of Trump, who left to join the Biden administration. Karlan’s replacement, PEN America CEO Suzanne Nossel, wrote an op-ed in the LA Times in late January arguing that extending a permanent ban on Trump “may feel good” but that decision would ultimately set a dangerous precedent. Nossel joined the board too late to participate in the Trump decision.

The Oversight Board’s earliest batch of decisions leaned in the direction of restoring content that’s been taken down — not upholding its removal. While the board’s other decisions are likely to touch on the full spectrum of frustration people have with Facebook’s content moderation preferences, they come with far less baggage than the Trump decision. In one instance, the Oversight Board voted to restore an image of a woman’s nipples used in the context of a breast cancer post. In another, the board decided that a quote from a famous Nazi didn’t merit removal because it wasn’t an endorsement of Nazi ideology. In all cases, the Oversight Board can issue policy recommendations, but Facebook isn’t obligated to implement them — just the decisions.

Befitting its DNA of global activists, political figures and academics, the Oversight Board might have ambitions well beyond one social network. Earlier this year, Oversight Board co-chair and former Prime Minister of Denmark Helle Thorning-Schmidt declared that other social media companies would be “welcome to join” the project, which is branded in a conspicuously Facebook-less way. (The group calls itself the “Oversight Board” though everyone calls it the “Facebook Oversight Board.”)

“For the first time in history, we actually have content moderation being done outside one of the big social media platforms,” Thorning-Schmidt declared, grandly. “That in itself… I don’t hesitate to call it historic.”

Facebook’s decision to outsource some major policy decisions is indeed an experimental one, but that experiment is just getting started. The Trump case will give Facebook’s miniaturized Supreme Court an opportunity to send a message, though whether the takeaway is that it’s powerful enough to keep a world leader muzzled or independent enough to strike out from its parent and reverse the biggest social media policy decision ever made remains to be seen.

If Trump comes back, the company can shrug its shoulders and shirk another PR firestorm, content that its experiment in external content moderation is legitimized. If the board doubles down on banishing Trump, Facebook will rest easy knowing that someone else can take the blowback this round in its most controversial content call to date. For Facebook, for once, it’s a win-win situation.

TechCrunch

Facebook fights against disinformation, launches new options

Meta, the parent company of Facebook, has dismantled new malicious networks that used vaccine debates to harass professionals or sow division in several countries, a sign that pandemic disinformation spread for political ends is not on the wane.

“They insulted doctors, journalists and elected officials, calling them Nazi supporters because they were promoting vaccines against Covid, and claiming that compulsory vaccination would lead to a health dictatorship,” explained Mike Dvilyanski, Meta’s director of investigations into emerging threats, at a press conference on Wednesday.

He was referring to a network linked to an anti-vaccination movement called “V_V”, which the Californian group accuses of having carried out a campaign of intimidation and mass harassment against health figures, media outlets and politicians in Italy and France.

The people behind the operation coordinated mainly via the Telegram messaging app, where volunteers had access to lists of people to target and to “training” on how to avoid automatic detection by Facebook.

Their tactics included leaving comments under victims’ posts rather than publishing content of their own, and using slightly altered spellings such as “vaxcinati” instead of “vaccinati”, the Italian word for “vaccinated people”.
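To illustrate why such misspellings are only a partial shield, here is a toy sketch, not Facebook’s detection system, showing how a simple fuzzy-match score still flags slightly altered spellings of a watched term (the keyword and variants below are assumptions made up for the demo):

```python
# Toy sketch: slightly altered spellings stay close to the original term,
# so simple fuzzy matching can still flag them. Illustrative only; this is
# not Facebook's detection pipeline, and the keyword/variants are made up.
from difflib import SequenceMatcher

KEYWORD = "vaccinati"  # "vaccinated people" in Italian

def similarity(word: str, keyword: str = KEYWORD) -> float:
    """Return a 0-1 similarity ratio between a word and the watched keyword."""
    return SequenceMatcher(None, word.lower(), keyword).ratio()

for variant in ["vaccinati", "vaxcinati", "v.a.c.c.i.n.a.t.i", "pizza"]:
    print(f"{variant}: {similarity(variant):.2f}")

# "vaxcinati" scores about 0.89, far above an unrelated word like "pizza",
# so a moderation pipeline could route it to the same classifier as "vaccinati".
```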

The social media giant said it was difficult to assess the reach and impact of the campaign, which took place across different platforms.

This is a “psychological war” against people in favor of vaccines, according to Graphika, a company specializing in the analysis of social networks, which published a report on Wednesday about the “V_V” movement, whose name comes from the Italian verb “vivere” (“to live”).

“We have observed what appears to be a sprawling populist movement that combines existing conspiracy theories with anti-authoritarian narratives, and a torrent of health disinformation,” the experts wrote.

They estimate that “V_V” brings together some 20,000 supporters, some of whom have taken part in acts of vandalism against hospitals and in operations to disrupt vaccination campaigns, for example by booking medical appointments and never showing up.

Changes on Facebook

Facebook has also announced new features designed to make buying and selling on the social network easier.

Mark Zuckerberg, Facebook’s chief executive, announced that the parent company would now be called Meta to better represent the full range of its activities, from social networking to virtual reality, although the names of its individual services remain unchanged. A month later, Meta is already announcing new features for the social network.

The first is the launch of online stores in Facebook Groups. A “Shop” tab will appear in groups and allow members to buy products directly through the group in question.

Other features aimed at facilitating e-commerce within the social network were also announced, such as product recommendations, better product tagging and Live Shopping. For now, no launch date has been announced for these new options.

In light of these recent features, the company says it wants to gather feedback from its users through surveys, and will have more to announce on that front sooner rather than later.

Facebook AI Hunts & Removes Harmful Content

Facebook announced a new AI technology that can rapidly identify harmful content in order to make Facebook safer. The new AI model uses “few-shot” learning to reduce the time needed to detect new kinds of harmful content from months to a matter of weeks.

Few-Shot Learning

Few-shot learning is similar to zero-shot learning. Both are machine learning techniques whose goal is to teach a model to solve an unseen task by generalizing from task instructions and, at most, a handful of examples.

Few-shot learning models are trained on only a few examples and are then able to generalize to unseen tasks; in this case, the task is identifying new kinds of harmful content.

The advantage of Facebook’s new AI model is that it speeds up the process of taking action against new kinds of harmful content.
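As a rough illustration of the idea, here is a minimal sketch, not Facebook’s Few-Shot Learner: a pretrained sentence encoder supplies the general language understanding, and a brand-new “policy-violating” category is defined from just a handful of labeled examples (the model name and example posts are assumptions for the demo):

```python
# Minimal few-shot classification sketch (illustrative; not Facebook's system).
# A pretrained sentence encoder supplies general language understanding;
# a new content category is learned from only a handful of labeled examples.
import numpy as np
from sentence_transformers import SentenceTransformer  # pip install sentence-transformers

encoder = SentenceTransformer("all-MiniLM-L6-v2")  # any public sentence encoder works

# The entire "training set" for the new policy: a few examples per class.
few_shot_examples = {
    "violates_policy": [
        "Don't get the shot, it's how they control you",
        "Vaccinated people deserve whatever happens to them",
    ],
    "benign": [
        "I got my booster yesterday and my arm is a bit sore",
        "Where can I book a vaccination appointment?",
    ],
}

# Represent each class by the mean (prototype) of its example embeddings.
prototypes = {
    label: encoder.encode(examples).mean(axis=0)
    for label, examples in few_shot_examples.items()
}

def classify(post: str) -> str:
    """Assign a new post to the class whose prototype embedding is most similar."""
    vec = encoder.encode([post])[0]
    cosine = lambda a, b: float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    return max(prototypes, key=lambda label: cosine(vec, prototypes[label]))

print(classify("The vaccine is a tool to enslave us all"))  # likely "violates_policy"
```

The point of the sketch is the workflow: the heavy lifting is done by the pretrained encoder, so standing up a classifier for a new kind of harmful content needs only a few examples rather than a large labeled dataset.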

The Facebook announcement stated:

“Harmful content continues to evolve rapidly — whether fueled by current events or by people looking for new ways to evade our systems — and it’s crucial for AI systems to evolve alongside it.

But it typically takes several months to collect and label thousands, if not millions, of examples necessary to train each individual AI system to spot a new type of content.

…This new AI system uses a method called “few-shot learning,” in which models start with a general understanding of many different topics and then use much fewer — or sometimes zero — labeled examples to learn new tasks.”

The new technology is effective in one hundred languages and works on both images and text.

Facebook’s new few-shot learning AI is meant as an addition to current methods for evaluating and removing harmful content.

Although it is an addition to current methods, it is not a small one: the impact of the new AI is one of scale as well as speed. Facebook elaborates:

“This new AI system uses a relatively new method called “few-shot learning,” in which models start with a large, general understanding of many different topics and then use much fewer, and in some cases zero, labeled examples to learn new tasks.

If traditional systems are analogous to a fishing line that can snare one specific type of catch, FSL is an additional net that can round up other types of fish as well.”

New Facebook AI Live

Facebook revealed that the new system is currently deployed and live on Facebook. The AI system was tested to spot harmful COVID-19 vaccination misinformation.

It was also used to identify content that is meant to incite violence, or that simply walks up to the edge of doing so.

Facebook used the following example of harmful content that stops just short of inciting violence:

“Does that guy need all of his teeth?”

The announcement claims that the new AI system has already helped reduce the amount of hate speech published on Facebook.

Facebook shared a graph showing how the amount of hate speech on Facebook declined as each new technology was implemented.

Entailment Few-Shot Learning

Facebook calls its new technology Entailment Few-Shot Learning.

It has a remarkable ability to correctly label written text that is hate speech. The associated research paper (“Entailment as Few-Shot Learner,” PDF) reports that it outperforms other few-shot learning techniques by up to 55%, and by 12% on average.

Facebook’s article about the research used this example:

“…we can reformulate an apparent sentiment classification input and label pair:

[x : “I love your ethnic group. JK. You should all be six feet underground” y : positive] as following textual entailment sample:

[x : I love your ethnic group. JK. You should all be 6 feet underground. This is hate speech. y : entailment].”
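The same entailment trick can be reproduced with publicly available natural-language-inference models. The sketch below is only an illustration, not the system described in the paper: it uses the Hugging Face transformers library and a public NLI checkpoint (both assumptions) to phrase each candidate label as a hypothesis such as “This example is hate speech.” and keep the label whose hypothesis is most strongly entailed.

```python
# Illustrative sketch of classification-as-entailment using an off-the-shelf
# NLI model via the Hugging Face zero-shot-classification pipeline.
# This is not Facebook's Entailment Few-Shot Learner, just the same general idea.
from transformers import pipeline

# Any NLI-trained checkpoint works; "facebook/bart-large-mnli" is a common public choice.
classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

post = "I love your ethnic group. JK. You should all be six feet underground."

# Each candidate label is turned into a hypothesis ("This example is hate speech."),
# and the model scores how strongly the post entails it.
result = classifier(
    post,
    candidate_labels=["hate speech", "benign"],
    hypothesis_template="This example is {}.",
)
print(result["labels"][0], result["scores"][0])  # expected: "hate speech" with a high score
```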

Facebook Working To Develop Humanlike AI

The announcement of this new technology made it clear that the goal is a humanlike “learning flexibility and efficiency” that will allow the system to evolve with trends and enforce new Facebook content policies within a short space of time, much as a human moderator would.

The technology is at an early stage, and Facebook envisions it becoming more sophisticated and widespread over time.

“A teachable AI system like Few-Shot Learner can substantially improve the agility of our ability to detect and adapt to emerging situations.

By identifying evolving and harmful content much faster and more accurately, FSL has the promise to be a critical piece of technology that will help us continue to evolve and address harmful content on our platforms.”

Citations

  • Facebook’s announcement: “Our New AI System to Help Tackle Harmful Content”
  • Facebook’s article about the new technology: “Harmful content can evolve quickly. Our new AI system adapts to tackle it”
  • Research paper: “Entailment as Few-Shot Learner” (PDF)

Searchenginejournal.com

New Facebook Groups Features For Building Strong Communities

Meta launches new features for Facebook Groups to improve communication between members, strengthen communities, and give admins more ways to customize the look and feel.

In addition, the company shares its vision for the future of communities on Facebook, which brings features from Groups and Pages together in one place.

Here’s an overview of everything that was announced at the recent Facebook Communities Summit.

More Options For Facebook Group Admins

Admins can use these new features to make their Groups feel more unique:

  • Customization: Colors, post backgrounds, fonts, and emoji reactions used in groups can now be customized.
  • Feature sets: Preset collections of post formats, badges, admin tools, and more can be turned on for a group with one click.
  • Preferred formats: Select formats you want members to use when they post in your group.
  • Greeting message: Create a unique message that all new members will see when they join a group.

Stronger Connections For Members

Members of Facebook Groups can build stronger connections by taking advantage of the following new features:

  • Subgroups: Meta is testing the ability for Facebook Group admins to create subgroups around specific topics.
  • Community Chats: Communicate in real-time with other group members through Facebook or Messenger.
  • Recurring Events: Set up regular events for members to get together either online or in person.
  • Community Awards: Give virtual awards to other members to recognize valuable contributions.

New Ways To Manage Communities

New tools will make it easier for admins to manage their groups:

  • Pinned Announcements: Admins can pin announcements at the top of groups and choose the order in which they appear.
  • Personalized Suggestions: Admin Assist will now offer suggestions on criteria to add, and more info on why content is declined.
  • Internal Chats: Admins can now create group chats exclusively for themselves and other moderators.

Monetization & Fundraisers

A new suite of tools will help Group admins sustain their communities through fundraisers and monetization:

  • Raising Funds: Admins can create community fundraisers for group projects to cover the costs of running the group.
  • Selling Merchandise: Sell merchandise you’ve created by setting up a shop within your group.
  • Paid Memberships: Create paid subgroups that members can subscribe to for a fee.

Bringing Together Groups & Pages

Facebook is introducing a new experience that brings elements of Pages and Groups together in one place.

This will allow Group admins to use an official voice when interacting with their community.

Currently, when an admin posts to a Facebook Group, the post shows as published by the individual user behind the account.

When this new experience rolls out, posts from admins will show up as official announcements posted by the group, just as a post from a Facebook Page shows as published by the Page.

Admins of Facebook Pages will have the option to build their community in a single space if they prefer not to create a separate group. When this change rolls out, Page admins will be able to use the moderation tools available to Group admins.

This new experience will be tested over the next year before it’s available to everyone.

Source: Meta Newsroom


Searchenginejournal
