Internal Research from Facebook Shows that Re-Shares Can Significantly Amplify Misinformation

What if Facebook removed post shares entirely, as a means to limit the spread of misinformation in its apps? What impact would that have on Facebook engagement and interaction?

The question follows new insights from Facebook’s internal research, released as part of the broader ‘Facebook Files’ leak, which show that the company’s own reporting found post shares play a key role in amplifying misinformation and spreading harm among the Facebook community.

As reported by Alex Kantrowitz in his newsletter Big Technology:

“The report noted that people are four times more likely to see misinformation when they encounter a post via a share of a share – kind of like a retweet of a retweet – compared to a typical photo or link on Facebook. Add a few more shares to the chain, and people are five to ten times more likely to see misinformation. It gets worse in certain countries. In India, people who encounter “deep reshares,” as the researchers call them, are twenty times more likely to see misinformation.”

So it’s not direct shares, as such, but re-amplified shares that are more likely to be the kinds of controversial, divisive, shocking or surprising reports that gain viral traction in the app. Content that generates an emotional response sees more share activity, so it makes sense that the more radical the claim, the more re-shares it’s likely to see, particularly as users look to either refute or reinforce their personal stance on issues via third-party reports.

And there’s more:

“The study found that 38% of all [views] of link posts with misinformation take place after two reshares. For photos, the numbers increase – 65% of views of photo misinformation take place after two reshares. Facebook Pages, meanwhile, don’t rely on deep reshares for distribution. About 20% of page content is viewed at a reshare depth of two or higher.”

So again, the data shows that these more spicy, controversial claims and posts see significant viral traction through continued sharing, as users amplify and re-amplify them throughout Facebook’s network, often without adding their own thoughts or opinions.

So what if Facebook eliminated shares entirely, forcing people either to create their own posts to share content, or to comment on the original post? That would slow the rapid amplification that currently comes from simply tapping a button.

Interestingly, Facebook has made changes on this front, potentially linked to this research. Last year, Facebook-owned (now Meta-owned) WhatsApp implemented new limits on message forwarding to stop the spread of misinformation through message chains, with forwarding restricted to five chats at a time.

Which, WhatsApp says, has been effective:

“Since putting into place the new limit, globally, there has been a 70% reduction in the number of highly forwarded messages sent on WhatsApp. This change is helping keep WhatsApp a place for personal and private conversations.”  

Which is a positive outcome, and shows that there is likely value to such limits. But the newly revealed research looked at Facebook specifically, and thus far, Facebook hasn’t done anything to change the sharing process within its main app, the core focus of concern in this report.

The company’s lack of action on this front now forms part of Facebook whistleblower Frances Haugen’s legal push against the company, with Haugen’s lawyer calling for Facebook to be removed from the App Store if it fails to implement limits on re-shares.

Facebook hasn’t responded to these new claims as yet, but it is interesting to note this research in the context of other Facebook experiments, which seemingly both support and contradict the core focus of the claims.

In August 2018, Facebook actually did experiment with removing the Share button from posts, replacing it with a ‘Message’ prompt instead.

[Image: Facebook Share button]

That experiment seemed to be inspired by the increased discussion of content within messaging streams, as opposed to in the main Facebook app – but given its timing, in relation to the study, it now seems that Facebook was looking to see what impact the removal of sharing could have on in-app engagement.

On another front, however, Facebook’s actually tested expanded sharing, with a new option spotted in testing that enables users to share a post into multiple Facebook groups at once.

[Image: Facebook share to groups prompt]

That’s seemingly focused on direct post sharing, as opposed to re-shares, which were the focus of the 2019 study. But even so, providing more ways to amplify content, including potentially dangerous or harmful posts, seems to run counter to the findings outlined in the report.

Again, we don’t have full insight, because Facebook hasn’t commented on the reports, but it does seem like there could be benefit in removing post shares entirely, as a means to limit the rapid re-circulation of harmful claims.

But then again, maybe that just hurts Facebook engagement too much – maybe, through these various experiments, Facebook found that people engaged less, and spent less time in the app, which is why it abandoned the idea.

This is the core question that Haugen raises in her criticism of the platform: that Facebook, at least from the outside, appears hesitant to take action on elements that potentially cause harm if doing so could also hurt its business interests.

Which, at Facebook’s scale and influence, is an important consideration, and one which we need more transparency on.

Facebook claims that it conducts such research with the distinct intent of improving its systems, as CEO Mark Zuckerberg explains:

“If we wanted to ignore research, why would we create an industry-leading research program to understand these important issues in the first place? If we didn’t care about fighting harmful content, then why would we employ so many more people dedicated to this than any other company in our space – even ones larger than us? If we wanted to hide our results, why would we have established an industry-leading standard for transparency and reporting on what we’re doing?”

Which makes sense, but it doesn’t explain whether business considerations factor into the decisions that follow when that research detects a level of potential harm.

That’s the crux of the issue. Facebook’s influence is clear, and its significance as a connection and information distribution channel is evident. But what plays into its decisions about what to take action on, and what to leave, as it assesses such concerns?

There’s evidence to suggest that Facebook has avoided pushing too hard on these issues, even when its own data highlights problems, as seemingly shown in this case. And while Facebook should have a right to reply, and its day in court to respond to Haugen’s accusations, this is what we really need answers on, particularly as the company looks to build even more immersive, more all-encompassing connection tools for the future.

YouTube Adds Chat Emotes, New Shorts Editing Tools and Automated Audio Dubbing in Other Languages

YouTube has announced a range of new tweaks and updates, which are fairly significant in different ways, particularly if you’re looking to make Shorts a focus heading into the new year.

First off, YouTube’s giving Shorts creators the capacity to choose a frame from their clip as their thumbnail within the Shorts creation process, starting with Android users.

To be clear, creators can already choose a thumbnail for their Shorts within YouTube Studio, but this new process will make it easier to do so within the original upload flow, which could help to streamline the process.

To select a thumbnail frame for your Shorts (on Android):

  • Record or import a video with the Shorts camera, then navigate to the final upload screen
  • Tap the pencil icon overlaid on the thumbnail of your video
  • Scrub along your video’s timeline to pick a thumbnail, then hit ‘Done’
  • Upload your Short

YouTube says that it’s currently not possible to change the thumbnail after your Short has been uploaded, but it is looking to add this functionality in future.

This update is rolling out to all creators on Android from today.

And if you’re looking to make Shorts a bigger focus, this could also help – YouTube has launched a new series of Shorts mythbusting clips on the YouTube Creators channel, which covers various aspects of the Shorts process, including questions about the algorithm, common tips, best practices and more.

Worth a look.

On another front, YouTube has publicly launched its new automated system for overdubbing your YouTube content into another language.

Called ‘Aloud’, the new process, developed by Google’s ‘Area 120’ experimental projects team, can take a video in English and translate it into several other languages, which YouTube says could be a great way to expand your audience reach.

As per YouTube:

“You can dub a video with Aloud in a couple of hours and it comes at no cost. This tool might be one of the easiest ways to expand your audience, because 80% of the world doesn’t speak English.”

Of course, you then have the speakers’ lips not matching up with the audio – like those foreign language films that you accidentally start watching on Netflix – but depending on your content, that might not be a big deal.

You can sign up for the waitlist on the Aloud website to join the beta test pool for the option.

YouTube’s also launching a new chat stream engagement option called ‘YouTube Emotes’, which will enable viewers to share little graphics within their comments on clips.

Much like Twitch emotes, the additions provide another engagement option, to facilitate more expression within chat streams.

As explained by YouTube:

“We’re starting with emotes created for Gaming but are working on bringing even more themes of emotes in the future, so stay tuned for emotes for even more communities.”

They’ll also, eventually, provide another subscription incentive, with YouTube noting that ‘channel membership custom emojis’ will soon be another option to choose from within the emotes set. On Twitch, exclusive channel emotes are only available to paying subscribers.

To use YouTube Emotes, you can click/tap on the smiley face icon in live chat or comments, which will then bring up a listing of all of the emotes and emojis available to you in that stream/thread.

On a related note, YouTube’s also launching a broader range of priced packages for Super Thanks (coming soon), in order to drive more revenue opportunities for creators, and to provide another way for viewers to engage.

Finally, YouTube says that it’s expanding its comment warnings and user time-outs for repeated violations of comment rules, which it first launched in testing earlier this year.

Quite a few new updates from the ol’ YT, and some handy little additions that could play a significant role in your process over the holidays.
