

Can Effective Regulation Reduce the Impact of Divisive Content on Social Networks?



Amid a new storm of controversy sparked by The Facebook Files, an exposé of various internal research projects which, in some ways, suggest that Facebook isn’t doing enough to protect users from harm, the core question that needs to be addressed is often being distorted by inherent bias, and by the specific targeting of Facebook, the company, as opposed to social media, and algorithmic content amplification, as a broader concept.

That is, what do we do to fix it? What can be done, realistically, that will actually make a difference; what changes to regulation or policy could feasibly be implemented to reduce the amplification of harmful, divisive posts that are fueling more angst within society as a result of the increasing influence of social media apps?

It’s important to consider social media more broadly here, because every social platform uses algorithms to define content distribution and reach. Facebook is by far the biggest, and has more influence on key elements, like news content – and of course, the research insights themselves, in this case, came from Facebook.

The focus on Facebook, specifically, makes sense, but Twitter also amplifies content that sparks more engagement, LinkedIn sorts its feed based on what it determines will be most engaging, and TikTok’s algorithm is highly attuned to your interests.

The problem, as highlighted by Facebook whistleblower Frances Haugen, is algorithmic distribution, not Facebook itself – so what ideas do we have that could realistically improve that element?

And the further question then is, will social platforms be willing to make such changes, especially if they present a risk to their engagement and user activity levels?

Haugen, who’s an expert in algorithmic content matching, has proposed that social networks should be forced to stop using engagement-based algorithms altogether, via reforms to Section 230 laws, which currently protect social media companies from legal liability for what users share in their apps.

As explained by Haugen:

“If we had appropriate oversight, or if we reformed [Section] 230 to make Facebook responsible for the consequences of their intentional ranking decisions, I think they would get rid of engagement-based ranking.”

The concept here is that Facebook – and by extension, all social platforms – would be held accountable for the ways in which they amplify certain content. So if more people end up seeing, say, COVID misinformation because of algorithmic intervention, Facebook could be held legally liable for any impacts.

That would add significant risk to any decision-making around the construction of such algorithms, and as Haugen notes, that may then see the platforms forced to take a step back from measures which boost the reach of posts based on how users interact with such content.

Essentially, that would likely see social platforms forced to return to pre-algorithm days, when Facebook and other apps would simply show you a listing of the content from the pages and people you follow in chronological order, based on post time. That, in turn, would then reduce the motivation for people and brands to share more controversial, engagement-baiting content in order to play into the algorithm’s whims.


The idea has some merit. As various studies have shown, sparking emotional response with your social posts is key to maximizing engagement, and thus reach via algorithmic amplification, and the most effective emotions in this respect are humor and anger. Jokes and funny videos still do well on all platforms, fueled by algorithmic reach, but so too do anger-inducing hot takes, which partisan news outlets and personalities have run with – and that could well be a key source of the division and angst we now see online.

To be clear, Facebook cannot solely be held responsible for this. Partisan publishers and controversial figures have long played a role in broader discourse, and they were sparking attention and engagement with their provocative opinions long before Facebook arrived. The difference now is that social networks facilitate much broader reach, while they also, through Likes and other forms of engagement, provide direct incentive for it – individual users get a dopamine hit from triggering responses, while publishers drive more referral traffic and gain more exposure through provocation.

Really, a key issue to consider here is that everyone now has a voice, and when everyone has a platform to share their thoughts and opinions, we’re all far more exposed to them, and far more aware. In the past, you likely had no idea about your uncle’s political persuasions; now you know, because social media reminds you every day, and that type of peer sharing is also playing a role in broader division.

Haugen’s argument, however, is that Facebook incentivizes this – for example, one of the reports Haugen leaked to the Wall Street Journal outlines how Facebook updated its News Feed algorithm in 2018 to put more emphasis on engagement between users, and reduce political discussion, which had become an increasingly divisive element in the app. Facebook did this by changing its weighting for different types of engagement with posts.

[Image: Facebook algorithm diagram]

The idea was that this would incentivize more discussion by weighting replies more heavily – but as you can imagine, putting more value on comments in order to drive more reach also prompted more publishers and Pages to share increasingly divisive, emotionally charged posts, in order to incite more reactions and gain higher share scores as a result. With this update, Likes were no longer the key driver of reach, as they had been, with Facebook making comments and Reactions (including ‘Angry’) increasingly important. As such, divisive political discussion actually became more prominent, exposing more users to such content in their feeds.
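To illustrate the mechanics – not Facebook’s actual formula; the signal names and weights below are entirely hypothetical – an engagement-weighted ranking of this kind might look like:

```python
# Hypothetical engagement-weighted ranking sketch. The signals and weights
# are illustrative assumptions, not Facebook's real MSI values.

def engagement_score(post, weights=None):
    """Combine interaction counts into a single ranking score."""
    if weights is None:
        # After a 2018-style update, comments and reactions might be
        # weighted far more heavily than passive likes.
        weights = {"likes": 1, "reactions": 5, "comments": 15, "shares": 30}
    return sum(weights[signal] * post.get(signal, 0) for signal in weights)

posts = [
    {"id": "cat-video", "likes": 900, "reactions": 50, "comments": 20, "shares": 10},
    {"id": "hot-take", "likes": 100, "reactions": 400, "comments": 300, "shares": 80},
]

# Sort the feed by score, highest first.
ranked = sorted(posts, key=engagement_score, reverse=True)
```

Even in this toy version, the comment-heavy “hot take” outranks the widely liked cat video – which is exactly the incentive shift the leaked research describes.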

The suggestion, based on this internal data, is that Facebook knew this – it knew that the change had ramped up divisive content – but it opted not to revert it, or implement another update, because engagement, a key measure of its business success, had indeed increased as a result.


In this sense, removing the algorithm motivation would make sense – or maybe, you could look to remove algorithm incentives for certain post types, like political discussion, while still maximizing the reach of more engaging posts from friends, catering to both engagement goals and divisive concerns.

That’s what Facebook’s Dave Gillis, who works on the platform’s product safety team, has pointed to in a tweet thread in response to the revelations.

As per Gillis:

At the end of the WSJ piece about algorithmic feed ranking, it’s mentioned – almost in passing – that we switched away from engagement-based ranking for civic and health content in News Feed. But hang-on – that’s kind of a big deal, no? It’s probably reasonable to rank, say, cat videos and baby photos by likes etc. but handle other kinds of content with greater care. And that is, in fact, what our teams advocated to do: use different ranking signals for health and civic content, prioritizing quality + trustworthiness over engagement. We worked hard to understand the impact, get leadership on board – yep, Mark too – and it’s an important change.

This could be a way forward, using different ranking signals for different types of content, which may work to enable optimal amplification of content, boosting beneficial user engagement, while also lessening the motivation for certain actors to post divisive material in order to feed into algorithmic reach.
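A minimal sketch of that approach – with entirely hypothetical categories, signals and weightings, not Meta’s implementation – would switch scoring functions based on content type:

```python
# Sketch of per-category ranking signals. Categories, signal names and
# weights are assumptions for illustration only.

SENSITIVE_CATEGORIES = {"civic", "health"}

def rank_score(post):
    if post["category"] in SENSITIVE_CATEGORIES:
        # For civic/health content, prioritize quality and
        # trustworthiness over raw engagement.
        return 0.7 * post["source_trust"] + 0.3 * post["content_quality"]
    # Entertainment and personal content can stay engagement-ranked.
    return post["engagement"]

feed = [
    {"id": "vaccine-rumor", "category": "health", "source_trust": 0.1,
     "content_quality": 0.2, "engagement": 5000},
    {"id": "health-advisory", "category": "health", "source_trust": 0.9,
     "content_quality": 0.8, "engagement": 300},
]

feed.sort(key=rank_score, reverse=True)
```

Under this split, a high-engagement health rumor no longer outranks a trustworthy advisory, while cat videos elsewhere in the feed are untouched.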

Would that work? Again, it’s hard to say, because people would still be able to share posts, comment, and re-distribute material online – there are many ways that amplification can happen outside of the algorithm itself.

In essence, there are merits to both suggestions: that social platforms could treat different types of content differently, or that engagement-based algorithms could be eliminated to reduce the amplification of such material.

And as Haugen notes, focusing on the systems themselves is important, because content-based solutions open up various complexities when the material is posted in other languages and regions.

“In the case of Ethiopia, there are 100 million people and six languages. Facebook only supports two of those languages for integrity systems. This strategy of focusing on language-specific, content-specific systems for AI to save us is doomed to fail.”

Maybe, then, removing algorithms, or at least changing the regulations around how algorithms operate, would be an optimal solution, which could help to reduce the impacts of negative, rage-inducing content across the social media sphere.

But then we’re back to the original problem that Facebook’s algorithm was designed to solve – back in 2015 Facebook explained that it needed the News Feed algorithm not only to maximize user engagement, but also to help ensure that people saw all the updates of most relevance to them.


As it explained, the average Facebook user, at that time, had around 1,500 posts eligible to appear in their News Feed on any given day, based on the Pages they’d liked and their personal connections – while for some more active users, that number was more like 15,000. It’s simply not possible for people to read every one of these updates every day, so Facebook’s key focus with the initial algorithm was to create a system that surfaced the best, most relevant content for each individual, in order to provide users with the most engaging experience, and subsequently keep them coming back.
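The underlying selection problem is simple to sketch – score every eligible post for a given user and keep the top handful. The relevance function below is a placeholder stand-in for the genuinely hard part:

```python
import heapq

def build_feed(eligible_posts, relevance, k=10):
    """Pick the k most relevant posts out of potentially thousands.

    `relevance` is any per-user scoring function; defining it well is
    the real challenge, not the selection itself.
    """
    return heapq.nlargest(k, eligible_posts, key=relevance)

# ~1,500 eligible posts, matching Facebook's 2015 figure; "affinity" is
# a made-up per-post relevance signal for demonstration.
eligible = [{"id": i, "affinity": (i * 37) % 100} for i in range(1500)]
top10 = build_feed(eligible, relevance=lambda p: p["affinity"], k=10)
```

A chronological feed, by contrast, would simply sort those same 1,500 posts by timestamp – which is exactly why relevant updates get buried without ranking.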

As Facebook’s chief product officer Chris Cox explained to Time Magazine:

“If you could rate everything that happened on Earth today that was published anywhere by any of your friends, any of your family, any news source, and then pick the 10 that were the most meaningful to know today, that would be a really cool service for us to build. That is really what we aspire to have News Feed become.”

The News Feed approach has evolved a lot since then, but the fundamental challenge that it was designed to solve remains. People have too many connections, they follow too many Pages, they’re members of too many groups to get all of their updates, every day. Without the feed algorithm, they will miss relevant posts, relevant updates like family announcements and birthdays, and they simply won’t be as engaged in the Facebook experience.

Without the algorithm, Facebook would lose out by failing to optimize for audience desires – and as highlighted in another of the reports shared as part of The Facebook Files, it’s actually already seeing engagement declines in some demographic subsets.

[Image: Facebook engagement over time]

You can imagine that if Facebook were to eliminate the algorithm, or be forced to change direction on this, that graph would only get worse over time.

Zuck and Co. are therefore not likely to be keen on that solution, so a compromise, like the one proposed by Gillis, may be the best that can be expected. But that comes with its own flaws and risks.   

Either way, it’s worth noting that the focus of the debate needs to shift to algorithms more broadly, not just to Facebook alone, and to whether there’s actually a viable, workable way to change the incentives around algorithm-based systems to limit the distribution of more divisive elements.

Because that is a problem, no matter how Facebook or anyone else tries to spin it, which is why Haugen’s stance is important, as it may well be the spark that leads us to a new, more nuanced debate around this key element.



Germany weighs ban on Telegram, tool of conspiracy theorists



Germany has seen regular, sometimes violent, protests against Covid-related government restrictions – Copyright AFP Wakil KOHSAR


The German government is considering a ban on encrypted messaging app Telegram after it was repeatedly used as a channel for spreading anti-vaccine conspiracy theories and even death threats.

The app has also played a key role in mobilising turnout at some of the most violent protests in opposition to the German government’s Covid-19 policies since the start of the pandemic.

With parliament due to begin debating compulsory vaccination on Wednesday, authorities fear that the controversial issue could risk firing up another wave of rage.

With this in mind, politicians have set their sights on tighter controls on Telegram.

Interior Minister Nancy Faeser will unveil plans by Easter to require the app to delete messages that contain death threats or hate speech and identify their authors.

If Telegram fails to comply, the government could even ban the service completely.

“We will ensure that those spreading hate are identified and held accountable,” Faeser told the Bundestag lower house of parliament in mid-January.

She also told Die Zeit newspaper that Telegram could be deactivated in Germany if it failed to comply with local laws and “all other options have failed”.

Telegram chat groups, which can include up to 200,000 members, have been used by some anti-vaccine protesters to share false information and to encourage violence against politicians.

In December, German police seized weapons during raids in the eastern city of Dresden after a Telegram group was used to share death threats against a regional leader.


The same month, Telegram was used to mobilise a group of coronavirus-sceptics to mass outside the house of Petra Koepping, the health minister of Saxony state, armed with flaming torches.

A message viewed by 25,000 people had called for people opposing Covid restrictions to share private addresses of German “local MPs, politicians and other personalities” who they believed were “seeking to destroy” them through pandemic curbs.

– New avenues –

At the height of a refugee crisis that erupted in 2015, online social networking tools Facebook and Twitter fell foul of the authorities as they were seized by the far-right to spread virulent anti-immigrant content.

In 2017, Germany passed a controversial law that requires the social network giants to remove illegal content and report it to the police.

Facebook said in September it had deleted accounts, pages and groups linked to the “Querdenker” (Lateral Thinkers), a movement that has emerged as the loudest voice against the German government’s coronavirus curbs.

But that pushed opposing voices to other platforms, with Telegram emerging as the app of choice.

“Since the big platforms like Facebook no longer allow racist, anti-Semitic hate and far-right content like Holocaust denial, people who want to spread this are looking for new avenues,” Simone Rafael, digital manager for the Amadeu Antonio anti-racism foundation, told AFP.

“Currently, the most popular one in Germany is Telegram,” Rafael said.

While Facebook has an interest in maintaining a presence in Germany and has gradually submitted to national legislation, this is not the case with Telegram, the expert said.


“Telegram is not cooperating with the judicial or security authorities, even on indisputably punishable and reprehensible matters such as child pornography,” a behaviour that “deprives the state of any capacity for action”, Rafael said.

With Telegram not budging, German federal police are even planning to start flooding the company with requests for content deletion to push it into action, reported Die Welt daily.

– ‘Very bad signal’ –

One option for the government could be to require Google or Apple to remove Telegram from their app stores. However, this would not affect users who have already downloaded the app.

For Rafael, the only solution is to ban the app completely.

That would make Germany the first Western country to outlaw Telegram, created in 2013 by Russian brothers Nikolai and Pavel Durov, two opponents of Russian President Vladimir Putin who sought to avoid surveillance by their country’s secret services.

The company is currently headquartered in Dubai, with its parent group registered in the British Virgin Islands.

Telegram is already banned or heavily regulated in China, India and Russia.

But a move against the app could also spark further dissent in Germany.

Such a drastic step would “send a very bad signal”, according to digital journalist Markus Reuter.

“On the one hand we are celebrating Telegram’s lack of censorship and its importance for democratic movements in Belarus and Iran, and on the other, we are then disabling the service here” in Germany, he said.




Meta Looks to Sell Off its Diem Cryptocurrency Project Due to Ongoing Challenges and Restrictions



After three years of development, and a raft of changes to its name, scope, leadership and purpose, it seems that Meta’s troubled cryptocurrency project may soon come to an end.

According to reports, Meta’s looking to pull the pin on its Diem/Novi crypto project, with company representatives seeking expressions of interest on its sale.

As reported by Bloomberg:

“The Diem Association, a cryptocurrency initiative once known as Libra backed by Meta Platforms Inc., is weighing a sale of its assets as a way to return capital to its investor members, according to people familiar with the matter. Diem is in discussions with investment bankers about how best to sell its intellectual property and find a new home for the engineers who developed the technology, cashing out whatever value remains in its once-ambitious Diem coin venture, said the people, asking not to be identified because the discussions aren’t public.”

That would be a significant step back for Meta, which launched its original Libra crypto project to much fanfare in 2019, with former PayPal chief David Marcus at the helm of the initiative.

But the project saw strong resistance from the start.

Maybe it was because it was coming from Meta (then Facebook), or maybe it was due to widespread distrust of crypto projects, but many regions came straight out and said that they would not support the company creating its own currency. The public backlash saw many of the initial big-name backers pull out, including Visa, Mastercard and PayPal – all key names which had lent credibility to the initial concept.

That, already, put the entire experiment in limbo, because without the support of major financial institutions, Meta’s options for the project were limited. It seemed, then, that the project would likely fade away, but then in May 2020, Meta announced that it was changing the name of its crypto wallet from Calibra to Novi.


In October last year, after a long period of no news, then Novi chief David Marcus announced the next major step forward for the project, with the launch of a pilot of its Novi digital wallet in the US and Guatemala, enabling users to send and receive money between the two regions.


That was the first concrete step we’d seen toward making the project a workable reality. But many regions remain deeply skeptical of cryptocurrency, and with India, in particular, moving to ban cryptocurrencies outright, the value of the project was further lessened – potentially to the point where it’s no longer viable in broader terms.

India, it’s worth noting, is where Meta sees the most potential for money transfers and eCommerce, as it looks to cement its apps as key connective tools in the emerging market.

Shortly after the launch of the Novi payments pilot, David Marcus left the project, and maybe that was the final flag, the signal that it was just never going to make it.

Which now leads to these new reports, that Meta’s looking to sell it off – though it is worth noting that the reports suggest that Meta is looking to get out of the Diem Association, the parent group overseeing the project, with no mention of the Novi payment project specifically.

I would assume that they are intrinsically tied together, but maybe there’s a way for Meta to continue to support and develop its Novi payments option independently, though that does seem like a stretch.

So what would a sale of Diem, and the failure of Meta’s crypto push, mean for crypto more broadly?


I mean, resistance to cryptocurrencies in general is steadily growing, with more regions moving to cut them off completely, including Russia, China and Indonesia in recent months. A report published by the Law Library of Congress late last year showed that the number of countries and jurisdictions that have either banned or restricted cryptocurrencies has more than doubled since 2018, due to concerns around scam activity, market price fluctuations, and the environmental impacts of crypto ‘mining’.

Yet, at the same time, many western nations are seeing a boom in sales of crypto-aligned assets like NFTs, and with Web3 advocates essentially tying that growing tech movement to crypto development, it’s actually gaining momentum in some circles, despite the concurrently rising concerns.

But as noted, western markets are not where Meta saw the most potential value in its crypto project. Meta’s real aim has been to build native, in-stream payments into its apps, in order to further embed their use into developing markets, like India and Indonesia. Both of these regions see high remittance activity, with people transferring money back to family, and Meta originally saw Diem as a vehicle for removing fees from such exchanges, which would then get more people moving their money through Facebook and WhatsApp.

And once they’re already shifting their money around in its apps, that would make it much easier for Meta to parlay that behavior into eCommerce, expanding utility, and importance, for the millions of people in these markets.

But it does seem like that’s not to be – and given that, it makes sense for Meta to move on.


Though it’s not a great endorsement for the potential of crypto, in a broader sense. If Meta, with all of its resources and influence, can’t find a real use case for crypto, does that suggest that its potential value is not as high as some advocates think?

That might be a stretch, but as more regions move to ban crypto projects, and more big players step away from the sector, it does seem like the challenges are rising, which could, eventually, put the brakes on the entire crypto movement.    




China restricts activists’ social media ahead of Olympics



Multiple Chinese activists have seen their WeChat accounts restricted or disabled entirely in the lead-up to the Winter Olympics in Beijing – Copyright AFP Kirill KUDRYAVTSEV

Laurie CHEN

Human rights activists and some academics in China have had their WeChat messaging app accounts restricted in recent weeks, multiple people affected have told AFP, as Beijing cracks down on dissent before the Winter Olympics.

China hopes to make next week’s Games a soft power triumph, although the lead-up has seen some Western powers launch a diplomatic boycott over Beijing’s rights record and cybersecurity firms warn athletes of digital surveillance risks.

For China’s ever-dwindling community of activists, the imminent arrival of the world’s best athletes has triggered a familiar clampdown.

Eight individuals told AFP that their WeChat accounts had been restricted in some form since early December, with some unable to use their accounts entirely and forced to re-register.

The restrictions came as authorities detained two prominent human rights activists, lawyer Xie Feng and writer Yang Maodong, while a third rights lawyer missing since early December is believed by relatives to be in secret detention.

“This storm of shuttering WeChat accounts is too strong and unprecedented,” said veteran journalist Gao Yu, whose account had features like group chat messaging permanently disabled for the first time on December 20.

China routinely suppresses the social media accounts and physical movements of dissidents during politically sensitive periods such as Communist Party gatherings in Beijing or key anniversaries like the 1989 Tiananmen crackdown.

A major Party Congress will take place towards the end of this year when President Xi Jinping, China’s most authoritarian leader in a generation, is expected to further cement his rule with a third term.


The arrival of the Winter Olympics has presaged a clampdown similar to those surrounding other major events.

“The government now wants to make sure that people don’t cross the line online to poke the facade of a perfect Winter Olympic Games,” said Yaqiu Wang, senior China researcher at Human Rights Watch.

– Ubiquitous app –

Tencent’s app WeChat is a mainstay of daily life in China, with users relying on it for a range of services including payments and scanning health codes that permit entry to public venues.

“I know many people who’ve been banned from posting in group chats or posting WeChat Moments lately,” a Beijing lawyer whose account was restricted last month said on condition of anonymity.

Beijing-based writer Zhang Yihe said her WeChat group chat and Moments functions — similar to Facebook’s Wall or Instagram Stories — were restricted on January 8.

Tsinghua University sociology professor Guo Yuhua confirmed her account was permanently blocked the same day, while prominent legal scholar He Weifang said he encountered the same on January 9.

“Isn’t this equal to removing an individual from a public space?” said Zhang, adding she can now only send WeChat messages to individual users.

“Before and during the Olympics is a major sensitive period,” added a Beijing-based activist whose account was restricted twice in the past two months.

Tencent, the owner of WeChat, did not respond to a request for comment.

– Offline crackdown –

In recent weeks, Chinese police have detained two prominent rights activists on suspicion of “inciting state subversion”, according to official notices shared with AFP.


One of them, Yang Maodong, was unable to reunite with his wife in the United States before her death in early January.

Relatives of Tang Jitian, a human rights lawyer who vanished last month en route to an EU Human Rights Day event in Beijing, told AFP they believe he is being held under a form of secret detention commonly used against dissidents, possibly in his home province of Jilin.

“We don’t know where he is. I’ve reported him missing to the police but with no result,” said a relative who did not wish to be identified for fear of reprisal.

“They said it doesn’t meet the requirements for filing a (missing persons) case and that he had scanned the Jilin province health code.”

People arrested for national security offences in China can disappear for months at a time into incommunicado detention before authorities charge them or reveal their fate.

Neither Jilin’s nor Beijing’s public security bureau responded to requests for comment.

The International Olympic Committee said in an emailed response that it “has neither the mandate nor the capability to change the laws or the political system of a sovereign country”, adding that it “must remain neutral on all global political issues”.

Beijing Games organisers told AFP they “oppose the politicisation of sports” and were “not aware of these matters”.

Meanwhile, those still free lament mounting restrictions on speech under the current political climate.

“The space for public discourse is getting smaller and smaller,” said He.


