Wikipedia is one of the hubs of internet knowledge. The online encyclopedia receives more than 18 billion page views every month, which means that it exerts a huge influence on the way we think about our world and how we discuss every conceivable political and social issue.

Incredibly, Wikipedia is entirely curated by a team of volunteer users who come from many different backgrounds — and, unsurprisingly, the pages have become sites of ideological warfare, with different sides trying to control the terms of debate.

Is that conflict productive? Does the input of a politically diverse group of curators and editors result in more biased information — or less? How do we know which Wikipedia articles are comprehensive and balanced and which are not?

You might expect that people who already agree with each other would work together more cohesively and create more accurate pages. But according to a new study, that’s not true: it finds that Wikipedia’s politics, social-issues and science pages edited by more politically diverse groups of editors ended up being of higher quality.

The example of Wikipedia suggests that intellectual diversity can help teams produce higher-quality work, thanks to the different perspectives brought to bear on shared goals. When we lack intellectual diversity, it becomes much easier to develop blind spots or even to discriminate against colleagues with different views. There are online projects — like Wikipedia or the news site AllSides — that are trying to develop healthy practices around intellectual diversity in a society where that’s increasingly rare. These projects are becoming laboratories where scientists can test the conditions that foster exchange across political differences.

Wikipedia has internal quality metrics like comprehensiveness and readability that editors use to rate articles and allocate resources. “We kind of piggybacked on their efforts and use this tool for our purposes,” says Misha Teplitskiy, one of the researchers who performed the study and a postdoctoral fellow at the Laboratory for Innovation Science at Harvard University.

Alongside Wikipedia’s own measure of quality, Teplitskiy and his colleagues gauged the ideological alignment of individual Wikipedia editors by counting the contributions each made to conservative- or liberal-leaning articles. They also found that the structure of Wikipedia itself necessitates cross-ideological collaboration.
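To make the idea concrete, here is a minimal sketch of how an alignment score per editor and a spread-based polarization score per page could be computed from edit counts. It is only an illustration built on assumed inputs and a simplified formula chosen for clarity; it is not the researchers’ actual measure or data.

```python
from statistics import pstdev

def editor_alignment(conservative_edits: int, liberal_edits: int) -> float:
    """Score an editor from -1.0 (edits only liberal articles) to +1.0 (only conservative ones)."""
    total = conservative_edits + liberal_edits
    if total == 0:
        return 0.0
    return (conservative_edits - liberal_edits) / total

def team_polarization(alignments: list[float]) -> float:
    """One simple proxy for a page's editor polarization: the spread of its editors' alignments."""
    return pstdev(alignments) if len(alignments) > 1 else 0.0

# A page edited by a politically mixed team vs. one edited by a like-minded team.
mixed = [editor_alignment(9, 1), editor_alignment(1, 9), editor_alignment(5, 5)]
like_minded = [editor_alignment(8, 2), editor_alignment(9, 1), editor_alignment(7, 3)]
print(team_polarization(mixed) > team_polarization(like_minded))  # True
```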

“If you want to edit whatever controversial topic, you really are forced to interact with whoever is already editing that topic,” says Teplitskiy. “You can’t just start your own Wikipedia page for the conservative take on climate change or something, right? There’s already a climate change page. You’re gonna have to play with the folks that are already in that sandbox. And that’s, of course, very different from Twitter, etc., where there’s no limit, there’s no forcing mechanism to get people into one room for a particular topic.”


But does this approach result in better pages? Yes. They found that more ideologically polarized teams of editors outperformed homogeneous ones, according to Wikipedia’s internal metrics. Their paper notes that the effect was greatest for political articles but was also seen in the other two issue areas. As polarization among editors rose, the odds of moving from lower- to higher-quality categories increased by a factor of almost 19 for political articles, and by about two for both social-issues and science articles.

This isn’t the first study to find that heterogeneous groups can often come up with better solutions than homogeneous groups. A recent study of professional video game teams found that the more culturally diverse groups won more prize money. Diverse crowds often out-guess individuals. In the influential 2004 book The Wisdom of Crowds, James Surowiecki describes many studies and examples of groups solving problems better than individuals, from fairgoers collectively guessing the weight of an ox more accurately than any individual to contestants on the game show Who Wants to Be a Millionaire? faring better when they asked the audience than when they phoned a single trusted friend.

As these thinkers argue, political and intellectual diversity can benefit a team by helping team members see flaws in their own ideas: If everybody agrees with you, it’s hard to see what you could possibly be missing. And lacking that diversity — just like lacking diversity in other dimensions — could create environments where discrimination occurs.

While Wikipedia might have embraced this kind of diversity, several studies have found that other institutions are moving in precisely the opposite direction.

In 2012, researchers studied political diversity in the field of social and personality psychology. They surveyed a broad range of social and personality psychologists on various issues to figure out where they stood politically and how they would treat colleagues with different political views.

They found that while respondents had varied positions on the issues, they overwhelmingly identified as left-leaning. This is important because much of our polarization is based on self-identification with labels, rather than what we actually believe.

Indeed, the researchers found that in this overwhelmingly liberal environment, there was a high willingness to discriminate against conservatives. More than one in three respondents said they would discriminate against conservatives when making hiring decisions; one in four said they would do so when reviewing conservatives’ grant applications.

More recently, NYU psychologist Jonathan Haidt has noted that the political identifications of those who work in academia have become much more one-sided than they were in the recent past. In his recent book, The Coddling of the American Mind, he and co-author Greg Lukianoff argue that in the early 1990s liberal professors outnumbered conservatives two-to-one, but that this ratio has shifted closer to five-to-one in recent years, with certain departments being even more lopsided.

Perhaps as a result, the general public has come to view higher education in a more polarized way, with liberals consistently approving of these institutions more than conservatives do. This trend risks undermining those institutions, as both sides start to evaluate them by political leaning rather than merit.

As the survey of social psychologists suggests, this can increase the risk of acts of discrimination against conservatives on campus, but also against institutions that conservatives may implicitly perceive as liberal.

In addition, we know that calamities have occurred when teams excluded or marginalized dissenting voices who could have pointed out errors in their processes and conclusions. Examples range from regulators dismissing Commodity Futures Trading Commission chair Brooksley Born’s warnings about looming financial risk in the years before the Great Recession, to the intellectual consensus around the Iraq war, formed by a news media that mostly excluded anti-war voices.


In 2017, Germany’s Goethe Institute commissioned the Jerusalem-based journalist Antony Loewenstein to discuss the problem of ideological silos. “Filter bubbles in the mainstream media are one of the most dangerous aspects of the modern age because they reinforce the least risky positions,” he says.

Promoting a healthier, more competitive media is one way to foster intellectual diversity, but we also need to be mindful of it in our everyday lives, starting with the media we choose to consume. Today, the website AllSides works to help people broaden their minds and consider news and politics from points of view other than their own. Its founder, John Gable, was a long-time Republican Party staffer who later went to work on the Netscape web browser.

“We all had this great thought that the internet would help us connect with each other,” he says of his Netscape days. But he was worried that the internet would also “train us to discriminate against each other in new ways.” In retrospect, his concern was justified: Two decades later, we can see that filter bubbles and tribal clustering are a real problem in online spaces.

“It trains us to think in categories,” says Gable. “If you and I disagree on the environment, let’s say, it’s not just that you are wrong, but you must be evil because you’re one of them.”

In response, Gable worked with Joan Blades, a co-founder of the liberal organization MoveOn.org, to start AllSides seven years ago.

AllSides offers its readers a menu of news items alongside coverage from the left, right and center. If, for instance, readers see a story about congressional testimony, the website allows them to see the same event written up from multiple points of view.

It also features what’s called a “Red Blue dictionary,” where a reader can click on a topic to see different views on the same terminology. For instance, the page on abortion explains how pro-choice Americans view abortion as a common medical procedure, while pro-life Americans view it as taking a human life. The pages often include discussion questions to facilitate nuanced political thinking about the topic.

“The internet right now . . . tends to only have us see people who are just like us, and it always tends to show us information that we already agree with,” Gable says. “When we always see people just like us, and when we always hear opinions we agree with, two things happen to any of us. First, we become much, much more extreme in our points of view. And, secondly, we become much less tolerant of any person or idea that’s different than us. And that’s ripping up the fabric of our society.”

Fortunately, projects like Wikipedia and AllSides suggest an alternative. At Wikipedia, opponents can come together in a structured way to accomplish a shared goal: accurate, readable pages, evaluated according to agreed-upon criteria. For its part, AllSides primes readers to see the value of many points of view. Of course, we tend to find it more soothing when everyone agrees with us. The path these projects lay out is the harder, more stressful one. Is it worth it? The research to date suggests that the answer is yes.

A version of this story originally appeared on Greater Good, the online magazine published by UC Berkeley’s Greater Good Science Center.

THIS Q+A IS PUBLISHED AS PART OF AN ONGOING SERIES OF INTERVIEWS WITH MEMBERS OF STANFORD’S POLARIZATION AND SOCIAL CHANGE LAB

Those of us old enough to remember the dawn of social media might recall a lot of happy talk about it representing a new way to connect with the world. So much for that. Platforms that promised connection have instead often exacerbated division through interfaces and algorithmic designs that reward provocation and bomb throwing. But it doesn’t have to be this way. Just a few weeks ago, we featured a story on a Taiwanese social media platform that elevates consensus and brings people closer together. Other new platforms are attempting to do something similar. 

One is called Gell. It’s an online forum designed to encourage users to fully contemplate an issue before hurling their opinion out into the world. Created by a group of entrepreneurs, philanthropists and technologists looking to encourage informed discourse, it aims, in its own words, to “bring together a diverse group of thought leaders and engaged citizens to encourage, facilitate, and moderate healthy discussions and debates on the most important issues.”

In this interview, James Chu and Nick Stagnaro, two researchers at Stanford University’s Polarization and Social Change Lab, discuss social media’s promises and failures, and what their research says about the potential for Gell to successfully course correct.

This interview has been edited and condensed for clarity. 

Considering the political moment in which we find ourselves, how has technology affected how people organize themselves into teams?

James Chu: There was a time when we thought the rise of the internet and mass media would be extremely salubrious, helping the country become better at being a democracy, or at least a more thoughtful republic. People would have opinions that were more deeply informed because they had more access to information. Even if they were debating strictly moral claims that weren’t necessarily informed by data, at least they would be able to be thoughtful and potentially even more empathetic. And people who were originally on the outskirts could get a fair shot at having their voices heard. The verdict is still out on whether mass media has been good or bad.


But clearly social media has a polarization problem — it fosters these environments where hyper-partisan people seem to thrive. Is technology creating the behavior or just unlocking something that’s already there?

Nick Stagnaro: There’s a confluence of factors in the context of social media, where you have aspects of anonymity and a lot of performative features — you’re having a conversation in front of a huge audience of people — which constrains nuance and the ability to have conversations.

But I don’t think any of this stuff is unique to social media. It’s the reason reality television does very well. It’s really sticky, it reproduces at high rates, and it becomes highly permeating. That’s why more complicated, nuanced shows or online social platforms or conversations just don’t do as well. I feel like the question isn’t really quite right. It’s sort of like, what are the right ways of maybe regulating and controlling, and how do we establish good norms?

So let’s talk about how to do that. Tell us about Gell: how does it function differently from other online forums?

NS: Gell is trying to present information on contentious topics by inviting people from different sides of the aisle who have cultural or social authority to comment on a given topic. They write content, and other people come in and read these different points, and then add their own commentary or respond specifically to an argument. This leads people to have conversations and consume different arguments, which oftentimes results in a more balanced and better-informed position.


Can you give us an example?

NS: Say I have a position about affirmative action that isn’t well-informed. There are websites I can visit that will adhere to one side. But Gell might encourage me to sample more broadly and to build a better understanding of arguments for and against it, [to hear from] people who hold those two positions, and to expand my understanding. You can imagine, even in the context where I don’t change my position, I’m now better-informed and also potentially more willing to move in the direction of accuracy as I navigate forward in life.

JC: If you look at our politics today, it seems at first glance there isn’t much to be happy about. We thought social media was going to be this harbinger of better debates and more enlightened political discussion, and so far the evidence suggests that it’s not. But I do think there’s a lot of people out there who’ve seen this problem and are rising up to try to solve it. That’s cause for optimism. The way Gell displays information and helps you to understand both sides could very well lead to much more enlightened and thoughtful debate.

So your research is looking at people’s behavior after they’ve visited Gell. What have you found?

JC: Because of the way our experiments are designed, we give people different articles to read or different sources to look at, and you can actually measure whether people look at more types of information after they were exposed to Gell, versus people in the control group that didn’t get exposed to Gell. Gell actually causes people to be much more diverse in their media consumption. They have a much more omnivorous diet and they’re not just consuming information from one source.


It’s not clear that we can claim that more varied media diets lead people to have more thoughtful opinions about politics or to be more polite or civil, but we do think the basic principle of looking across more sources of information is a good and important outcome.

So this isn’t really about changing people’s opinions — it’s more about giving them access to a wider range of them.

JC: What we want to see is that [the site] leaves people willing to sample from more spaces. We’re not asking for conservatives to become liberals or liberals to become conservatives. That’s silly. What we’re trying to minimize is the affective polarization, so that people stop hating each other, have more constructive conversations, and sort of understand why the other side disagrees with them so that the debates are higher quality rather than just, “They’re dumb, they’re evil, they’re stupid,” which of course doesn’t do anybody any good in the long run. 

Since inflammatory discourse is profitable for social media companies, what could encourage them to adopt practices more like this? And what if they don’t?

JC: The fear is that we throw the baby out with the bathwater after we see a couple of negative effects and assume that the whole thing is corrupted to the core. Clearly, social media has had enormous benefits that also deserve to be quantified and brought to the forefront. One really important thing to remember in both our work and hopefully for the work of our colleagues is that you can’t just scapegoat social media companies as being bad. They have a lot of positive effects and it’s important that you don’t create interventions that do more harm than good.