New Algorithms Could Reduce Polarization Driven By Information Overload

Computer scientists propose systemic changes in automatic content curation

July 30, 2020


As the volume of available information expands, the fraction any one person can absorb shrinks. Readers end up retreating into a narrow slice of thought, becoming more vulnerable to misinformation, and polarizing into isolated enclaves of competing opinions. To break this cycle, computer scientists say we need new algorithms that prioritize a broader view over fulfilling consumer biases.

“This is a call to arms,” said Boleslaw Szymanski, a professor of computer science at Rensselaer Polytechnic Institute. “Informed citizens are the foundation of democracy, but the interest of big companies, who supply that information, is to sell us a product. The way they do that on the internet is to repeat what we showed interest in. They’re not interested in a reader’s growth; they’re interested in the reader’s continued attention.”

Szymanski and colleagues at the University of Illinois at Urbana-Champaign, the University of California, Los Angeles, and the University of California, San Diego, explore this troubling “paradox of information access” in a paper published on arXiv.org.

“You would think that enabling everybody to be an author would be a blessing,” said Szymanski, an expert in social and cognitive networks, with previous work that includes findings on the power of a committed minority to sway outcomes. “But the attention span of human beings is not prepared for hundreds of millions of authors. We don’t know what to read, and since we cannot select everything, we simply go back to the familiar, to works that represent our own beliefs.”

Nor is the effect entirely unprecedented, said Tarek Abdelzaher, a professor and the University of Illinois at Urbana-Champaign lead on the project.

“It’s not the first time that affordances of connectivity and increased access have led to polarization,” said Abdelzaher. “When the U.S. interstate freeway system was built, urban socioeconomic polarization increased. Connectivity allowed people to self-segregate into more homogenous sprawling neighborhoods. The big question this project answers is: how to undo the polarizing effects of creating the information super-highway?”

The effect is exacerbated when our own human limitations are combined with information curation systems that maximize “clicks.”

To disrupt this cycle, the authors contend that the algorithms that provide a daily individualized menu of information must be changed from systems that merely “give consumers more of what these consumers express interest in.”

The authors propose adapting a technique long used in conveying history, which is to provide a tighter summation for events further back from the present day. They call this model for content curation “a scalable presentation of knowledge.” Algorithms would shift from “extractive summarization,” which gives us more of what we consumed in the past, to “abstractive summarization,” which increases the proportion of available thought we can digest.
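To make that idea concrete, here is a minimal Python sketch of one way a curation system could budget attention on a sliding scale. The geometric decay, half-life, and word budget are illustrative assumptions, not details from the paper: recent items get the fullest treatment, while older material is compressed ever more tightly, much as a history text compresses earlier centuries.

```python
# Illustrative sketch of a "scalable presentation of knowledge":
# split a fixed attention budget across items so that older material
# gets a tighter summary. The decay schedule and numbers are
# assumptions made for this example, not values from the paper.

def allocate_budget(ages_in_days, total_words=1000, half_life=30.0):
    """Divide total_words across items, halving an item's share of
    the budget for every half_life days of age."""
    weights = [0.5 ** (age / half_life) for age in ages_in_days]
    scale = total_words / sum(weights)
    return [round(w * scale) for w in weights]

# Four stories: today, last week, last month, last quarter.
print(allocate_budget([0, 7, 30, 90]))
# -> roughly [404, 344, 202, 50]: recent items get most of the words,
# but distant material still keeps a small, balanced foothold.
```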

“As long as you balance content, you can cover more distant knowledge in much less space,” said Szymanski, who is also the director of the Network Science and Technology Center at Rensselaer. “Although readers have a finite attention span, they still gain slight knowledge of new areas, and can then choose to shift their attention in a new direction or stay the course.”

Few analytical models exist to measure the trend toward what the authors call “ideological fragmentation in an age of democratized global access.” But one, which the authors considered, treats individuals as “particles in a belief space” — almost like a fluid — and measures their changing positions based on the change in content they share over time. The model “confirms the emergence of polarization with increased information overload.”
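The paper’s model is more elaborate, but a rough analogue conveys the dynamic. In the bounded-confidence sketch below (an illustrative stand-in, not the authors’ model), each “particle” only engages with beliefs inside a narrow attention window, and the population settles into isolated clusters:

```python
import random

# Bounded-confidence opinion dynamics (a Deffuant-style toy model,
# used here only as an analogue of "particles in a belief space").
# Agents hold positions in [0, 1]; shrinking the attention window
# eps, i.e. narrower consumption, yields more and tighter clusters.

def step(beliefs, eps=0.2, mu=0.5):
    # Two agents meet; they influence each other only if their
    # positions fall within the attention window eps.
    i, j = random.sample(range(len(beliefs)), 2)
    if abs(beliefs[i] - beliefs[j]) < eps:
        shift = mu * (beliefs[j] - beliefs[i])
        beliefs[i] += shift
        beliefs[j] -= shift

random.seed(0)
beliefs = [random.random() for _ in range(200)]  # initial positions
for _ in range(50_000):
    step(beliefs)

clusters = sorted({round(b, 2) for b in beliefs})
print(clusters)  # a few tight, mutually isolated opinion clusters
```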

The more ideologically isolated and polarized we are, the more we are vulnerable to disinformation tailored to reinforce our own biases. Szymanski and his colleagues offer a slew of technical solutions to reduce misinformation, including better data provenance and algorithms that detect misinformation, such as internal consistency reasoning, background consistency reasoning, and intra-element consistency reasoning tools.
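The paper’s detection tools are far richer than anything shown here, but a toy example suggests the flavor of background consistency reasoning: checking a claim against trusted prior knowledge. In this sketch the fact table, the claim format, and the tolerance are all invented for illustration:

```python
# Toy "background consistency" check: compare a claimed
# (entity, attribute, value) triple against a trusted reference
# table. Everything here is a made-up stand-in for the much richer
# reasoning tools the paper describes.

BACKGROUND = {("Eiffel Tower", "height_m"): 330.0}

def background_consistent(entity, attribute, value, tol=0.05):
    """Flag a numeric claim that strays more than tol relative error
    from the reference; pass claims we have no background for."""
    ref = BACKGROUND.get((entity, attribute))
    if ref is None:
        return True  # nothing to check against; cannot refute
    return abs(value - ref) / ref <= tol

print(background_consistent("Eiffel Tower", "height_m", 500.0))  # False
print(background_consistent("Eiffel Tower", "height_m", 324.0))  # True
```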

“The very sad development discussed in this paper is that today, people are not conversing with each other. We are living in our own universe created by the data which is coming from these summarization systems, data that confirms our innate biases,” Szymanski said. “This is a big issue which we face as a democracy, and I think we have a duty to address it for the good of society.”

Szymanski and his co-authors are working on mathematical models that both measure the extent of polarization in various media and predict how trends would change under various mitigation strategies.
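What such a measurement might look like is easiest to see in miniature. The score below is a simple proxy chosen for illustration, not the authors’ metric: the gap between the mean positions of the two halves of an opinion distribution, which is zero at consensus and approaches one at full polarization on a [0, 1] belief scale:

```python
# Illustrative polarization proxy (not the authors' metric): the gap
# between the mean positions of the lower and upper halves of the
# opinion distribution, for beliefs expressed on a [0, 1] scale.

def polarization(beliefs):
    xs = sorted(beliefs)
    mid = len(xs) // 2
    lower, upper = xs[:mid], xs[mid:]
    return sum(upper) / len(upper) - sum(lower) / len(lower)

print(polarization([0.1, 0.1, 0.9, 0.9]))  # ~0.8: two distant camps
print(polarization([0.5, 0.5, 0.5, 0.5]))  # 0.0: consensus
```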

“The Paradox of Information Access: Growing Isolation in the Age of Sharing” was published with support from the Defense Advanced Research Projects Agency, the Army Research Office, and the Office of Naval Research. Szymanski was joined in the research by Tarek Abdelzaher, Heng Ji, Jinyang Li, and Chaoqi Yang at the University of Illinois at Urbana-Champaign, John Dellaverson and Lixia Zhang at the University of California, Los Angeles, and Chao Xu at the University of California, San Diego.

Written by Mary L. Martialay