

Social media CEOs hedge on whether they’d boot the 12 anti-vax ‘super-spreaders’ cited by states’ attorneys general



On Wednesday, a coalition of a dozen state attorneys general called on Facebook and Twitter to step up their enforcement of their community guidelines to curtail the spread of COVID-19 vaccine misinformation on their platforms. Their letter specifically identified 12 “anti-vaxxer” accounts that were responsible for a sizable 65% of public anti-vaccine content on Facebook, Instagram and Twitter. In today’s House hearing on disinformation and extremism, Twitter and Facebook’s CEOs, along with Google CEO Sundar Pichai, were directly asked if they would be willing to take down these 12 accounts.

Their answers were a mixed bag and a demonstration of social media execs’ unwillingness to take a simple action — taking down a handful of disinformation sources — that could have a significant impact on Americans’ willingness to get vaccinated to end the pandemic.

Over the course of the hearing, Congressman Mike Doyle (D-PA) pointed out that nearly 550,000 Americans had lost their lives to COVID-19, and an independent study found that Facebook users in five countries, including the U.S., had been exposed to COVID-19 disinformation 3.8 billion times. Now that the U.S. is rushing to get shots into people’s arms to reduce the spread of the deadly virus, it’s still having to deal with social media sites continuing to promote and recommend content leading to vaccine hesitancy.

“My staff found content on YouTube telling people not to get vaccines, and were recommended similar videos. The same was true on Instagram, where it was not only easy to find vaccine disinformation, but the platform recommended similar posts,” said Doyle. “The same thing happened on Facebook, except they also had anti-vax groups to suggest, as well. And Twitter was no different.”


“You can take this content down,” Doyle said. “You can reduce the visibility. You can fix this, but you choose not to,” he told the CEOs.

He later directly asked the CEOs if they would be willing to take down the 12 accounts the attorneys general had identified in their letter as the so-called “super-spreaders” of misinformation.

The coalition had written that both Facebook and Twitter had yet to remove the accounts of 12 prominent anti-vaxxers who repeatedly violated the companies’ terms of service. These users’ accounts, along with their associated organizations, groups and websites, were responsible for 65% of public anti-vaccine content across Facebook, Twitter and Instagram as of March 10, the letter noted.

In response to the question of taking down these dozen accounts, Zuckerberg hedged. He said that Facebook’s team would have to first look at the exact examples being referenced, leading to Doyle cutting him off.

Pichai tried to start his answer by noting that YouTube had removed more than 850,000 videos with misleading coronavirus information, but was also cut off as Doyle re-asked the question as to whether or not YouTube would take down the accounts of the 12 super-spreaders.


“We have policies to take down content,” Pichai said, but added that “some of the content is allowed, if it’s people’s personal experiences.”

When Twitter CEO Jack Dorsey was posed the same question, he said, “yes, we remove everything against our policy” — a better answer, but also one that’s not necessarily a confirmation that Twitter would, indeed, remove those specific 12 accounts.


Dorsey, earlier in the hearing, had also spoken broadly about “Bluesky,” Twitter’s long-term vision for a decentralized approach to misinformation. He explained how Bluesky would leverage a shared, open-source base protocol, allowing for “increased innovation around business models, recommendation algorithms, and moderation controls which are placed in the hands of individuals, rather than private companies,” Dorsey said. The answer indicated Twitter’s vision for moderation was ultimately about handing off the responsibility to others — something Facebook has also done in recent months with its Oversight Board, an external body that weighs in on the hardest moderation decisions.

These moves indicate that social networks have decided for themselves that they’re not capable of handling the responsibilities of content moderation on their own. But whether the U.S. government will actually step in to regulate them as a result remains to be seen.



Facebook fights disinformation, launches new options



Meta, the parent company of Facebook, has dismantled new malicious networks that used vaccine debates to harass professionals or sow division in some countries — a sign that disinformation about the pandemic, spread for political ends, is not waning.

“They insulted doctors, journalists and elected officials, calling them supporters of the Nazis because they were promoting vaccines against Covid, claiming that compulsory vaccination would lead to a health dictatorship,” explained Mike Dvilyanski, director of investigations into emerging threats, at a press conference on Wednesday.

He was referring to a network linked to an anti-vaccination movement called “V_V,” which the Californian group accuses of having carried out a campaign of intimidation and mass harassment in Italy and France against health figures, media outlets and politicians.

The authors of this operation coordinated in particular via the Telegram messaging service, where volunteers had access to lists of people to target and to “training” on how to avoid automatic detection by Facebook.

Their tactics included leaving comments under victims’ posts rather than publishing content of their own, and using slightly altered spellings such as “vaxcinati” instead of “vaccinati” — Italian for “vaccinated people” — to evade detection.

The social media giant said it was difficult to assess the reach and impact of the campaign, which took place across different platforms.

This is a “psychological war” against people in favor of vaccines, according to Graphika, a company specializing in the analysis of social networks, which published a report on Wednesday about the “V_V” movement, whose name comes from the Italian verb “vivere” (“to live”).

“We have observed what appears to be a sprawling populist movement that combines existing conspiracy theories with anti-authoritarian narratives, and a torrent of health disinformation,” the experts wrote.


They estimate that “V_V” brings together some 20,000 supporters, some of whom have taken part in acts of vandalism against hospitals and in operations to disrupt vaccination campaigns — for example, by booking medical appointments and not showing up.


Change on Facebook

Facebook has announced new features intended to facilitate buying and selling on the social network.

Mark Zuckerberg, the boss of Facebook, announced that the parent company would now be called Meta, to better represent all of its activities, from social networks to virtual reality, while the names of its various services remain unchanged. A month later, Meta is already announcing new features for the social network.

The first is the launch of online stores in Facebook groups. A “Shop” tab will appear, allowing members to buy products directly through the group in question.

Other features have also been announced with the aim of facilitating e-commerce within the social network, such as product recommendations, better product tagging and Live Shopping. At this time, no launch date has been announced for these new options.

In light of these recent features, the company says it wants to gather feedback from its users, and is expected to share more on that front sooner rather than later.
