How polarized groups form when we care more about who is speaking than what is being said
Why do people who hold a particular view on climate change also tend to hold similar views on taxes or vaccinations, even though the topics are unrelated? A new study by researchers Fredrik Jansson and Anandi Hattiangadi presents a mathematical model showing how the mechanism of "source filtering" quickly divides a population into two opposing camps that disagree on almost everything.
Polarization is often explained in terms of group identity and distrust of the opposing side. In the new article "The emergence of polarised groups through source filtering", published in Humanities & Social Sciences Communications, the researchers instead show that polarization can arise from how we acquire and evaluate information. Using a mathematical model, they simulate how opinions come to cluster together.
Assessing credibility through filters
At the core of the research is the concept of source filtering. One way people assess the credibility of new information is by its content: we evaluate whether it fits our worldview, and if it contradicts what we already believe, we filter it out. This is known as content filtering. Source filtering works differently: here, we judge the credibility of information based on who is presenting it.
"In many situations, it is difficult to determine for yourself whether a statement is true. In those cases, it becomes natural to instead ask whether you trust the person saying it," says Fredrik Jansson, researcher at the Institute for Futures Studies and author of the study.
Opinions create identities
In the researchers' mathematical model, people are represented as agents who meet and exchange ideas. If the agents filter information based on the source—that is, who is saying something—two large groups with opposing sets of opinions often emerge. This also causes opinions to become linked, creating identities. If someone holds a certain view, we can guess what other opinions they likely hold.
An important insight is that this happens even when the issues are initially completely independent of one another.
"The model shows that no logical connection between different issues is required for them to become linked. It is enough that people largely trust information from those who already think more or less like themselves," says Fredrik Jansson.
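The mechanism can be illustrated with a minimal agent-based sketch. This is not the authors' actual model; the population size, number of issues, and trust rule below are illustrative assumptions. Agents hold opinions on several initially independent binary issues, and a listener only adopts a speaker's view if the two already agree on most other issues (source filtering):

```python
import random

random.seed(1)

N, K, STEPS = 50, 5, 30000

# Each agent holds +1/-1 opinions on K initially independent issues.
agents = [[random.choice([-1, 1]) for _ in range(K)] for _ in range(N)]

def trust(listener, speaker):
    # Source filtering: trust equals the fraction of issues
    # on which the two agents already agree.
    return sum(a == b for a, b in zip(listener, speaker)) / K

for _ in range(STEPS):
    i, j = random.sample(range(N), 2)
    issue = random.randrange(K)
    # The listener adopts the speaker's view on one issue with
    # probability equal to how much the listener trusts the speaker.
    if random.random() < trust(agents[i], agents[j]):
        agents[i][issue] = agents[j][issue]

def corr(x, y):
    # Pearson correlation across the population; 0.0 if an issue
    # has reached full consensus (zero variance).
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    vx = sum((a - mx) ** 2 for a in x) / n
    vy = sum((b - my) ** 2 for b in y) / n
    return cov / (vx * vy) ** 0.5 if vx and vy else 0.0

# How strongly have two unrelated issues become linked?
c01 = corr([a[0] for a in agents], [a[1] for a in agents])
print(round(c01, 2))
```

Because influence flows mainly between already-similar agents, opinions on unrelated issues tend to end up correlated across the population, even though no issue says anything about any other.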
Similar to social media
The researchers suggest that this mechanism resembles how many digital platforms operate today. Early recommendation algorithms were based on actual content, but they were quickly overtaken by so-called collaborative filtering: content is suggested not on the basis of what it is about, but on what other users with similar behavior like. This is, in essence, a type of source filtering.
"It is often said that social media and algorithms reinforce polarization. Here, we present a possible explanation for how that process works," says Fredrik Jansson.
The study: "The emergence of polarised groups through source filtering" by Fredrik Jansson & Anandi Hattiangadi is published in Humanities & Social Sciences Communications (2026): https://www.nature.com/articles/s41599-025-06419-x
Contact
Fredrik Jansson, researcher at the Institute for Futures Studies, associate professor of mathematics at Mälardalen University, and researcher at the Centre for Cultural Evolution, Stockholm University. [email protected]