Facebook's algorithm does not change users' political beliefs, studies find

2023-07-27 23:12:00

Do the information bubbles that social media algorithms keep us in reinforce political polarization or simply reflect already existing divisions?

Meta, the company behind Facebook. Photo: © Inkdropcreative1 | Dreamstime.com

A major research project conducted in the run-up to the 2020 US election has concluded that, contrary to what is often claimed, Facebook’s algorithm does not shape the political beliefs of its users, according to AFP.

This series of studies is the result of a collaboration between Meta – the parent company of Facebook and Instagram – and a group of researchers from American universities. The researchers had access to the company’s internal data and were able to conduct tests by altering users’ online experience.

In total, four studies were published in the scientific journals Science and Nature.

The algorithm “has a very important influence on people’s experience on the platform,” in other words what they see and how long they use it, said project leaders Talia Stroud of the University of Texas at Austin and Joshua Tucker of New York University.

But “changing the algorithm, even for a few months, is unlikely to change people’s political beliefs,” they added.

These beliefs were measured through questionnaires that users filled out after participating in experiments that changed the content displayed on their home page.

The researchers acknowledged that the three-month period observed may not have been long enough to detect an effect, given that political polarization in the United States has been on the rise for decades.

Despite this, "these results challenge the common narrative that blames social media information bubbles for contemporary problems in American democracy," wrote the authors of one of the studies.
