Jaime Settle, the Cornelia Brackenridge Talbot Associate Professor of Government and Data Science at William & Mary, was a collaborator on the most comprehensive research project to date exploring the impact of social media on American democracy.
The first findings, released July 27, come from a study of how critical aspects of the algorithms that determine what people see in their Facebook and Instagram feeds affected what they believed during the 2020 U.S. presidential election period.
Settle was among a group of academics from U.S. colleges and universities who worked in collaboration with researchers at Meta, the company that owns and operates Facebook and Instagram, to produce an initial batch of papers that were peer-reviewed and published in the journals Science and Nature.
The research showed that algorithms are extremely influential in shaping on-platform social media experiences and that there is significant ideological segregation in political news exposure. But changes to critical aspects of the algorithms that determined what study participants saw did not sway their political beliefs.
Settle remarked on the high quality of data available to do what she called “very rigorous social science” focused on data and democracy, two key initiatives of William & Mary’s Vision 2026 strategic plan. She said this large-scale project was an admirable starting point for advancing the understanding of potential solutions to combat polarization on social media platforms.
“You have to start somewhere, and you have to explore what could work to tackle the problems we know exist,” said Settle, the associate director of data science at W&M and the director of the Social Networks and Political Psychology Lab. “Doing so could lead to some counterintuitive findings and unintended consequences because most of our really complex problems are going to have really complex solutions.
“I think a lot of times the public or pundits will latch on to an idea that they think is a panacea: if we could just fix this one big problem it would solve a lot of the other problems we have related to our political system or social media. However, the findings of these papers speak to the idea that there aren’t simple solutions.”
A great honor
The project has spanned more than three years and required intensive work from Settle and her collaborators.
“This was one of the great honors of my professional life,” Settle said. “I learned an incredible amount from my colleagues, both the other academic researchers as well as the data scientists and researchers at Meta. The sophistication of the processes and the methods and the thoroughness with which we were able to think things through was an exciting intellectual challenge, and it was fun to be a part of that.”
The team proposed and selected specific research questions and study designs with a clear agreement that Meta could reject such designs only on legal, privacy or feasibility grounds. Meta could not restrict or censor findings, and the academic lead authors had final say over writing and research decisions.
Internal researchers at Meta initiated the partnership with Professor Talia Jomini Stroud, founder and director of the Center for Media Engagement at the University of Texas at Austin, and Professor Joshua A. Tucker, co-founder and co-director of the Center for Social Media and Politics at New York University and director of the Jordan Center for the Advanced Study of Russia.
The core research team consisted of 15 additional academic researchers, including Settle, with expertise in four areas – political polarization, political participation, (mis)information transmission online and beliefs about democratic norms and the legitimacy of democratic institutions.
Settle was a co-lead author on a study called “Like-minded Sources on Facebook Are Prevalent but Not Polarizing” along with professors Brendan Nyhan from Dartmouth, Emily Thorson from Syracuse and Magdalena Wojcieszak from the University of California, Davis. The study presented data from 2020 for the entire population of active adult Facebook users in the U.S., showing that content from politically like-minded sources constitutes the majority of what people see on the platform, but that political information and news represent only a small fraction of these exposures.
“This speaks to this concern about echo chambers,” Settle said. “Everyone believes that we’re so polarized and we all dislike each other because we are in bubbles where we only encounter points of view that are similar to our own. What we show is that randomly assigning some people to encounter less content from like-minded sources didn’t have these beneficial effects that some people thought it would.”
Settle said her research team didn’t just downrank political content; it downranked all content that came from like-minded sources. “There is a lot more to politics on social media than just explicitly political content,” she said. “Even non-political content can send an important signal about other people’s political identities. And since political content is such a small fraction of what people encounter on Facebook, we wanted to design a treatment that would have a larger effect on the user experience.
“Our treatment didn’t make attitudes less extreme, nor did it make attitudes related to affective polarization any less severe,” Settle continued. “Interestingly, one of the secondary findings is that even though the people in our treatment group saw less content from like-minded sources, when they did see that content, they were more likely to engage and interact with it. I think this shows that you can’t use algorithms alone to entirely override the psychological predispositions that people have.”
No simple fix
This finding reinforces the takeaway from Settle’s 2018 book “Frenemies: How Social Media Polarizes America.” In it, Settle contends that any solution to polarization must take into account both algorithms and a person’s psychological disposition. The book recently won a prestigious award from the American Political Science Association recognizing the best book on elections, political behavior and voting published at least five years earlier.
“I think this study really speaks to my previous work in that we were able to work with Meta to do this really important intervention on the algorithm and alter the experience that people had on the site, but that alone is not enough,” Settle said. “It’s not a simple fix. We’ll have to push further as we brainstorm and think about other potential solutions to polarization.”
Settle said more studies related to this project are expected to be published over the next several months. She was a co-lead author on two more papers: one about behavioral polarization on social media, such as friending and defriending or decisions to connect or disconnect from groups, and the other about content from untrustworthy sources.
Good science takes time, Settle said, and she is confident in the integrity of the process used in the study.
“We hope that this can serve as a model and launching point for future collaborations, not just with Meta but other tech companies as well,” Settle said. “It’ll be up to them to decide how they want to proceed moving forward, but we’ve learned a lot from this collaboration that should be useful for the process of science moving forward.”
Nathan Warters, Communications Specialist