Facebook Targeting Individuals Who Spread False Information

Facebook announced a number of measures, including penalties, aimed at individuals who spread false information. This is an escalation on the part of Facebook in its effort to reduce the amount of toxic content shared on the platform.

Facebook Targeting Individuals to Stop Spread of False Information

Past actions have focused on Facebook Pages and organizations that spread misinformation. This time Facebook is targeting individual members in an effort to educate them; if they fail to improve their behavior, Facebook will limit the reach of their posts.

The announcement described the new actions against individual members:

“Today, we’re launching new ways to inform people if they’re interacting with content that’s been rated by a fact-checker as well as taking stronger action against people who repeatedly share misinformation on Facebook.

Whether it’s false or misleading content about COVID-19 and vaccines, climate change, elections or other topics, we’re making sure fewer people see misinformation on our apps.”

The effort to limit misleading content is structured into three actions:

  1. Inform Members Before Interacting with Misleading Pages
  2. Penalize Members Who Share False Information
  3. Educate Individuals Who Have Shared Toxic Content

Inform Members Before Sharing

The process for stopping individuals from sharing toxic content begins before they like a page that is known to share misleading content.

This will help slow down the likes that a page acquires, but more importantly it will help educate the individual about the toxic nature of the page they are considering liking.

Facebook will display a popup warning about the page, along with a link to more information about its fact-checking program.

Screenshot of Facebook's false information popup warning

Penalize Members Who Share False Information

The next step in the process for limiting the spread of false information is to penalize individuals who repeatedly share content that has been rated false by Facebook’s fact-checking partners.

The penalty consists of limiting the reach of these members’ Facebook posts so that fewer people see them.

According to Facebook’s announcement:

“Starting today, we will reduce the distribution of all posts in News Feed from an individual’s Facebook account if they repeatedly share content that has been rated by one of our fact-checking partners.”

Educate Users About Past False Information They’ve Shared

The last part of this effort to limit the spread of false information is to educate members who have shared content that was subsequently rated as being false information.

The current practice is that when content is rated, a notice is sent to the member alerting them to the rating of the content they shared.

Facebook has improved this notice by redesigning it to be clearer, so that the member better understands why the content was rated false and is warned that sharing more false information will result in penalties.

Facebook’s announcement explained it like this:

“We currently notify people when they share content that a fact-checker later rates, and now we’ve redesigned these notifications to make it easier to understand when this happens.

The notification includes the fact-checker’s article debunking the claim as well as a prompt to share the article with their followers.

It also includes a notice that people who repeatedly share false information may have their posts moved lower in News Feed so other people are less likely to see them.”

Reducing the Spread of Misleading Content

Facebook is well practiced at removing spam, graphically violent content, and adult content, much of which is reportedly caught by its automated systems.

Removing content at the level of the individual is a new front in an ongoing war against false information.

Citation: “Taking Action Against People Who Repeatedly Share Misinformation,” Searchenginejournal.com

Facebook Fighting Disinformation: New Options Launched

Meta, the parent company of Facebook, has dismantled new malicious networks that used vaccine debates to harass professionals or sow division in some countries, a sign that disinformation about the pandemic, spread for political ends, is not on the wane.

“They insulted doctors, journalists and elected officials, calling them supporters of the Nazis because they were promoting vaccines against Covid, and claiming that compulsory vaccination would lead to a health dictatorship,” explained Mike Dvilyanski, director of investigations into emerging threats, at a press conference on Wednesday.

He was referring to a network linked to an anti-vaccination movement called “V_V”, which the Californian group accuses of having carried out a campaign of intimidation and mass harassment in Italy and France against health figures, journalists and politicians.

The authors of this operation coordinated in particular via the Telegram messaging app, where volunteers had access to lists of people to target and to “training” on how to avoid automatic detection by Facebook.

Their tactics included leaving comments under victims’ messages rather than posting content, and using slightly changed spellings like “vaxcinati” instead of “vaccinati”, meaning “people vaccinated” in Italian.
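
To illustrate why a slightly altered spelling such as “vaxcinati” can slip past an exact keyword filter while simple fuzzy matching still flags it, here is a minimal Python sketch. It is purely illustrative: the watchlist, threshold, and function names are hypothetical and do not represent Facebook’s actual detection systems.

# Illustrative sketch only: why a misspelling like "vaxcinati" evades an exact
# keyword filter but is still caught by a simple edit-distance check.
# The watchlist and threshold below are hypothetical, not Facebook's systems.

def edit_distance(a: str, b: str) -> int:
    """Levenshtein distance computed with dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        curr = [i]
        for j, cb in enumerate(b, start=1):
            cost = 0 if ca == cb else 1
            curr.append(min(prev[j] + 1,          # deletion
                            curr[j - 1] + 1,      # insertion
                            prev[j - 1] + cost))  # substitution
        prev = curr
    return prev[-1]

WATCHLIST = {"vaccinati"}  # hypothetical flagged term ("vaccinated" in Italian)

def exact_match(comment: str) -> bool:
    """A naive filter that only catches exact spellings."""
    return any(word in WATCHLIST for word in comment.lower().split())

def fuzzy_match(comment: str, max_distance: int = 2) -> bool:
    """Flags words within a small edit distance of a watchlist term."""
    return any(edit_distance(word, term) <= max_distance
               for word in comment.lower().split()
               for term in WATCHLIST)

comment = "i vaxcinati sono complici"   # altered spelling used to dodge filters
print(exact_match(comment))             # False: the exact filter misses it
print(fuzzy_match(comment))             # True: "vaxcinati" is 1 edit away

Real moderation systems are far more sophisticated than this, but the sketch shows the basic gap that the “V_V” spelling tricks were reportedly designed to exploit.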

The social media giant said it was difficult to assess the reach and impact of the campaign, which took place across different platforms.

This is a “psychological war” against people in favor of vaccines, according to Graphika, a company specializing in the analysis of social networks, which on Wednesday published a report on the “V_V” movement, whose name comes from the Italian verb “vivere” (“to live”).

“We have observed what appears to be a sprawling populist movement that combines existing conspiracy theories with anti-authoritarian narratives and a torrent of health disinformation,” the experts detailed.

They estimate that “V_V” brings together some 20,000 supporters, some of whom have taken part in acts of vandalism against hospitals and in operations to interfere with vaccinations, for example by booking medical appointments and not honoring them.

Changes on Facebook

Facebook has announced new features that will facilitate sales and purchases on the social network.

Mark Zuckerberg, Facebook’s CEO, announced that the parent company would now be called Meta to better represent all of its activities, from social networking to virtual reality, though the names of its various services would remain unchanged. A month later, Meta is already announcing new features for the social network.

The first is the launch of online stores in Facebook groups. A “Shop” tab will appear and will allow members to buy products directly through the group in question.

Other features aimed at facilitating e-commerce within the social network have also been announced, such as the display of recommendations, better product mentions, and Live Shopping. At this time, no launch date has been announced for these new options.

In light of these recent features, the company wants to gather feedback from its users through a survey, much as Tesco does with its customers via the Tesco Views Survey. The company is expected to announce more about this feedback sooner rather than later.
