
Polarization and fake news in Meta

  • Created on: 10 August 2023

To understand the effect of social media on democratic electoral processes, Meta undertook a research project in collaboration with external partners in 2020. Based on this study, conducted with 23,000 users across different platforms, several articles were published in two of the most renowned international scientific journals, Science and Nature. What were their conclusions?

Facebook and Fake News

According to Iproup, the first paper characterizes Facebook as "a social and informational environment that shows significant ideological segregation." It reveals a higher prevalence of unreliable content in right-leaning media than in left-leaning ones. As a result, individuals with more conservative political inclinations are more exposed to false information than those with more progressive opinions.

How do algorithms influence us?

The research also analyzed the differences between content displayed in chronological order and content driven by algorithms, both on Instagram and Facebook.

On both platforms, they found that content from unreliable sources is more prominent in the chronological feed than in the algorithm-driven feed (by more than two-thirds in the case of Facebook, and slightly less for Instagram). Furthermore, the chronological feed significantly reduced the time users spent on both platforms and their level of interaction with them.

Dissemination and Viralization

Finally, the research concluded that the function of resharing content already posted by others contributes to the dissemination of certain publications, although resharing does not always make a post go viral. When users see content from like-minded sources in their feed, their level of engagement is higher.

Democracy at stake

Since the 2016 elections, Facebook (now Meta) has been accused of promoting misinformation and political polarization. The Spanish newspaper El País recalls four key moments:

  • After Donald Trump's victory, it became evident that the platform could be used to share content without restrictions and to organize groups of people around positions, ideologies, or ideas that were not necessarily democratic or respectful.
  • The "Mueller Report" ruled out that Trump had collaborated with Russia to win the presidential elections, but it concluded that Moscow had indeed interfered in the elections and Facebook had been one of its channels.
  • Starting in 2014, the data of more than 50 million Facebook users, including age, gender, preferences, and habits, were collected without their consent and illegally commercialized with third parties; this became known as the Cambridge Analytica case.
  • Finally, an investigation by the U.S. justice system determined that the four tech giants - Facebook, Google, Amazon, and Apple - had "exploited their market power in an anticompetitive way," but no measures have been taken yet.

It is within this context that the published reports become more relevant. In response, the President of Global Affairs at Meta stated that "there is little evidence that the fundamental features of Meta's platforms, by themselves, cause harmful affective polarization or have significant effects on important political attitudes, beliefs, or behaviors."

According to Meta, when participants see less content that reinforces their views, they tend to interact more with the like-minded content that is presented to them. One thing, however, is clear: while both progressives and conservatives are present on social media, they are not symmetrical groups, and audiences consuming political news on Facebook generally lean right.