Facebook Announces $2 Million in Funding for Research Into Misinformation and Polarization Online
With Facebook’s enhanced election security rules already being tested just weeks into the official US Presidential campaign, The Social Network has this week announced a new funding initiative for research into a key area of concern for its election integrity efforts.
Via a new “2020 Foundational Integrity Research” project, Facebook is offering $2 million “in unrestricted gifts” to support independent research into misinformation and polarization, and how they spread through social communication technologies.
As explained by Facebook:
“We will provide a total of $2,000,000 in funding for research proposals that aim to enrich our understanding of challenges related to misinformation, polarization, information quality, and conflict on social media and social technology platforms. Our goal for these awards is to support the growth of the scientific community in these spaces and to contribute to a shared understanding across the broader industry on how social technology companies can better address social issues on their platforms.”
Facebook’s program will aim to fund studies that highlight more effective ways to identify misinformation campaigns and their impacts, while the company is also looking to gain a better understanding of how such campaigns play out in different regions.
In particular, Facebook notes that it will focus on projects which emphasize:
“Comparative research and inclusion of non-Western regions that have experienced a growth in social media platform use, including South and Central America, Sub-Saharan and North Africa, the Middle East, and Central, South, and Southeast Asia. We encourage proposals from researchers, or collaborations with researchers, based in the country/countries being researched.”
This may help to provide more insight into the varying tactics used in each market, and it could also help Facebook stay ahead of emerging threats in developing markets, where misinformation has not yet become as significant a concern as it is in more established regions.
Facebook says that it will not provide data for this research, and that any data collected by research teams must comply with Facebook’s terms and policies. Facebook also specifies that the research need not focus on Facebook’s own apps and technologies.
“Award amounts will range from $50K to $150K. Most projects will be between $50K to $100K, with up to five awards of $150K.”
It’s difficult to know exactly how significant the impact of online misinformation and polarization is, and has been, with respect to various elections and political shifts. Certainly, political division has increased in the age of social media, but cause and effect are challenging to diagnose because there aren’t always direct links between what a person sees online and how they then respond.
Some within Facebook have played down the impact of misinformation. Just recently, Facebook’s former mobile ads chief Andrew Bosworth noted that, in his opinion, most of the misinformation efforts during the 2016 US Presidential Election campaign came from people “with no political interest whatsoever” who were simply creating fake headlines to drive traffic to “ad-laden websites” and make money. Misinformation from the candidates themselves, Bosworth says, was not a significant factor, based on Facebook’s internal assessment.
And while we do know that Russian-based groups have attempted to influence public opinion in several nations, with a view to swaying their respective elections, Facebook has also played down the broader impact of those efforts. But again, it’s impossible to know, definitively, what factors lead people to vote one way or another.
What we do know is that people are now more overt in their political leanings, likely because they’re able to share more of their personal thoughts online, and that an increasing number of people now use social media – specifically Facebook – to get news content.
In this respect, it’s important that we have some understanding of whether and how fake news is having an impact in this chain, and how it can be addressed.
This new project should go some way towards providing more insight.
You can read more about Facebook’s new research grants here.