Facebook Plans for Post-Election Concerns in the US, Explores New Option for Combating Viral Misinformation

2020 has been a year like no other, and we still have a fair way to go in what could become an even more unpredictable roller coaster ride of emotions, issues and challenges.

A key concern in this respect is the upcoming US Election. This week, Joe Biden officially accepted the Democratic nomination, pitting him against current US President Donald Trump in what’s likely to be one of the most divisive and provocative campaigns in history.

Social media is likely to play a key role in this. After the Trump campaign essentially weaponized social platforms in the 2016 campaign, and foreign actors sought to interfere with the result through mass-manipulation efforts, much focus has been put on how Facebook and Twitter, in particular, will respond to such efforts this time around. The results have been mixed – and now, as the election draws ever nearer, we’re seeing even more initiatives proposed that could have an impact on the eventual election result, or, indeed, its aftermath.

Here’s a look at the latest social platform updates related to the US Presidential Election.

Facebook Considers New Virality Signal in Slowing the Spread of Misinformation

First off, Facebook has this week said that it’s exploring a new way to detect viral misinformation before it gains significant traction, by prioritizing reviews of content as it starts to gain momentum.

As reported by Casey Newton in The Interface:

“Facebook already shares information about the virality of new articles on the platform with its fact-checking partners, who use that data to help determine which articles to check first. [A new report] asks whether the company itself might want to set thresholds at which its own teams evaluate the content for community standards. If a video on Facebook gets 5 million views in an hour, shouldn’t someone at Facebook take a look at it?”

Essentially, the proposal, which Newton says Facebook is investigating, would see Facebook using share velocity as a key indicator in determining which posts should be reviewed for potential rule violations, which could stop viral misinformation from gaining mass traction.

As an example of the problem, a recent video posted by Breitbart, which featured a range of ‘health professionals’ criticizing official health advice around COVID-19, was viewed by millions of users on Facebook before the platform took it down. Facebook has been working hard to tackle COVID-19 misinformation, so it was somewhat surprising that the video was able to gain such traction before Facebook moved to address it.

This new proposal would theoretically help, in that it would mean that Facebook, by using share momentum as an indicator, would have reviewed and assessed the content faster, stopping its viral spread.

Which sounds good – but then again, why doesn’t Facebook already do this?

It seems crazy that Facebook, with all of its data and algorithms, and all of the technical tools at its disposal, has never once thought: “hey, maybe we should ensure we’re reviewing highly shared stuff before anything else”.

Of course, in reality, it probably has – which then raises the question of why it wouldn’t have acted on it. Could it be that Facebook would actually prefer not to slow the spread of viral content, which brings more people to the platform, sparks more sharing, boosts engagement, etc.?

Such content is damaging, and it’s good to see Facebook considering more ways to address it. But the proposal itself raises questions about Facebook’s capacity – or willingness – to handle such content within its review process.

Facebook Explores Measures in Case Trump Disputes Election Result

Another interesting development, as reported by The New York Times, is that Facebook is exploring measures it might take in case President Trump decides not to accept the results of the 2020 election.

Trump, who has repeatedly criticized the integrity of the voting process, has thus far avoided questions about whether he will accept the final result, given his concerns about the poll’s accuracy. Of course, if Trump wins, you can bet that he’ll be more than happy to hug the American flag and accept the praise. But if Trump loses, what then?

Facebook, and other social platforms, are now concerned that they could be used by Trump to spread misinformation about the result, which could lead to civil unrest. 

As per NYT:

“Facebook is preparing steps to take should Mr. Trump wrongly claim on the site that he won another four-year term, said the people, who spoke on the condition of anonymity. Facebook is also working through how it might act if Mr. Trump tries to invalidate the results by declaring that the Postal Service lost mail-in ballots or that other groups meddled with the vote, the people said.”

One of Facebook’s ideas involves a “kill switch”, which would shut off political advertising after Election Day, thereby reducing the possibility of ads being used at scale to share misleading information about the result.   

Which is interesting, considering that Facebook’s stance right now is that politicians are essentially allowed to spread lies through Facebook ads, as they’re not subject to fact-checking.

So, lies in political ads are not damaging now, but they could be such a concern after the fact that they need to be halted completely?

It seems odd, but it does make some sense. Facebook won’t ban political ads right now, because it believes that doing so would only benefit the big players, while smaller politicians, who can use Facebook’s targeting options to reach more specific audiences at low cost, would lose that opportunity. The same principle applies after the result – the only people who stand to benefit from continuing to advertise after the campaign are the big players who can afford to do so, while smaller politicians will likely have stopped their spend by then.

As such, the move would be in line with Facebook’s broader stance on political advertising – but it does raise some significant concerns about the potential for dispute, and conflict, following the election result.

And while social platforms might be able to limit the reach of such messaging, the fact that this is now a real concern that they are considering is, in itself, a major problem.

As noted, 2020 could get a lot more unpredictable yet.

Trump Bans WeChat

And then there’s the ongoing discussion around the banning of Chinese-originated apps, which has been dominated by the proposal to ban TikTok, or see it sold off to a US company.

But another social media-related impact within this is the potential ban of Chinese messaging app WeChat, which is hugely popular among Chinese expats, and would also be subject to a restriction under the Trump administration’s proposals.

WeChat, which has more than a billion active users, plays a key role in connecting many Chinese citizens to various functionalities, including paying bills, buying train tickets, shopping, etc. The app is not as widely used outside of China, though businesses in many regions do facilitate its use for payments – and again, many Chinese expats, in particular, stay connected with family through the app.

Yet, even though usage outside China is comparatively low, a ban on the app could have major implications.

A key question right now is whether a ban in the US would mean that Apple and Google would then be disallowed from carrying the app in their global app stores, outside of the US. If they can’t, that could make it much harder for WeChat to grow, which would set up a new, and potentially significant, dispute between the US and China.

The Trump administration appears to now be suggesting that any ban would only relate to the US, and that even US-owned companies would be able to continue using WeChat outside of America.

That makes sense, but the expanded impacts of the proposal once again underline the complexity of such moves, and the broad-reaching effects that such decisions can have on how web-based properties work. 

For years we’ve been accelerating towards a more globally connected society, but now it seems we’re pulling back. That could be very difficult to enact, and could, as noted, lead to more global tension.

It’s amazing to consider the significance of the role that social media now plays in our everyday lives, which is represented in updates like these, where social platforms are now central to global political shifts. No one would have predicted such back when we were updating our MySpace ‘Top 8’, but this is where we are, and it’ll be interesting to see how things play out, and what role social platforms play in the next stage.     

Socialmediatoday.com