Fist

Social media: the promise to unite, the power to divide

Wael Ghonim helped spark a revolution that toppled Egyptian President Hosni Mubarak. This was 2011, so unlike, say, Che Guevara, Ghonim had access to social media to help organize people and protests. “Social media was crucial for this campaign,” he said. “It helped a decentralized movement arise. It made people realize that they were not alone. And it made it impossible for the regime to stop it.” 

Connecting people is the stated goal of social media, or at least it was. Access to everyone you know or have ever met, right there a tap or two away. Yet over time, the reality of social media has fallen short of that once high-minded promise. Even Ghonim, six years after the Egyptian revolution, concludes: “The Arab Spring revealed social media’s greatest potential, but it also exposed its greatest shortcomings. The same tool that united us to topple dictators eventually tore us apart.” 

Social media platforms today represent division more than unification. Instead of bringing all of us together, they have brought only some of us together. Social platforms are more likely to consist of clusters of like-minded people with largely similar viewpoints and beliefs. The platforms direct our attention away from the people (and viewpoints) in the other clusters, by design exposing us to fewer of their updates and shares. What’s emerging is an awareness that social media is leading us to become overconfident in our knowledge, to discredit divergent perspectives, and to make irrational decisions. We’re a long way from social media’s golden promise of not too many years ago. 

The data defining social media echo chambers

In the modern economy, data is power. With every online interaction, a user provides a small piece of information about their preferences and ideologies. While a single interaction may not reveal a great deal, thousands of them reveal quite a lot. The more a company learns about us, the better it becomes at predicting how we’ll respond to different types of information. 

Social networks want engagement and interaction. Clicks, likes, comments, and shares show that people’s attention is on the information at hand. When we don’t respond positively to some content, our psychological profile is updated, something shifts in the algorithm, and we are shown less of that type of content in the future. When we respond positively, that type of content is reinforced. 
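To make that feedback loop concrete, here is a minimal sketch in Python of how engagement-driven filtering could work. It is purely illustrative: the names (update_profile, rank_feed), the toy topic-affinity score, and the example topics are all assumptions for this sketch, not any platform’s actual ranking system. Each positive interaction nudges a topic’s weight up, each ignored post nudges it down, and future feeds are sorted by those weights.

from collections import defaultdict

def update_profile(profile, topic, engaged, rate=0.1):
    # Nudge the user's affinity for a topic up on engagement, down otherwise.
    profile[topic] += rate if engaged else -rate

def rank_feed(posts, profile):
    # Order candidate posts by the user's learned topic affinities.
    return sorted(posts, key=lambda post: profile[post["topic"]], reverse=True)

profile = defaultdict(float)  # starts neutral toward every topic
posts = [
    {"id": 1, "topic": "politics_camp_a"},
    {"id": 2, "topic": "politics_camp_b"},
    {"id": 3, "topic": "sports"},
]

# The user repeatedly engages with one political cluster and ignores the other.
for _ in range(20):
    update_profile(profile, "politics_camp_a", engaged=True)
    update_profile(profile, "politics_camp_b", engaged=False)

# Content from the favored cluster now floats to the top of every future feed.
print(rank_feed(posts, profile))

Even in this toy version the self-reinforcing pattern is visible: the more one cluster is rewarded, the less visible the others become, and the loop tightens with every session.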

A 2016 study confirmed that these “echo chambers” exist on Facebook, and are made possible through confirmation bias and algorithm updates. “Confirmation bias helps to account for users’ decisions about whether to spread content, thus creating informational cascades within identifiable communities. At the same time, aggregation of favored information within those communities reinforces selective exposure and group polarization.”

The more we rely on social media to get our information, the more our biases will be reinforced. “This attention economy, vying for clicks, eyeballs, pushes people into very confirmatory outlets,” says Alex Krasodomski-Jones, a researcher at the Centre for the Analysis of Social Media who led a study that found these same political echo chambers are present on Twitter.

What happens is that our social networks turn into a stream of information tailored to each individual. The term “news feed” is apt: as more people use social media as a news source, the information they’re exposed to is continually customized to their tastes. It’s a self-reinforcing cycle that whisks us away from bipartisanship and into echo chambers, where everyone shares our opinion and supports our beliefs. 

Social media drives overconfidence in what we know

There is another behavioral impact caused by echo chambers: overconfidence in what we know and what we believe. When someone relies almost entirely on social media for news and information, they naturally miss out on the perspectives and opinions of people occupying different ideological domains. A well-rounded understanding does not come from one-sided information.  

One study of how the Internet inflates our sense of what we know suggests that accessing information online can make us overconfident in our own knowledge. The researchers asked participants a series of questions, allowing some of them to look up the answers online, and then posed a second set of questions. Those who could look up information overestimated their ability to answer the follow-up questions, even when those questions were unrelated, and even when their prior searches came up empty.

“We saw that people were more confident that they knew the answers—had the information in their heads—if they had access to search. It’s more like thinking you know how to fix a car if you have access to a mechanic,” says lead author Matthew Fisher in an interview with HBR. “Searching the internet is almost effortless, and it’s almost always accessible. You never face your ignorance when it’s there. Because we’re so deeply plugged into it, we misattribute the connection to knowledge to actually having the knowledge ourselves. It becomes an appendage.”

This builds upon other research into the “Google Effect” showing that when people look up answers online, they tend to remember only where they found the information, not the information itself. Other research into gut instincts indicates that our evaluation of truth can come from purely intuitive reactions, which favor familiarity and ease of comprehension over any rigorous analysis. While these studies focus on the Internet rather than on social media itself, it is on these social networks that we consume much of our information.

Putting all of these insights together tells us a lot about why we’re seeing the limits—and strains—of social media use. When we become sheltered and uninformed, we grow biased and overconfident in our knowledge thanks to being surrounded by confirmatory information; we either remain ignorant of opposing ideas or they appear to come from a discredited minority. Even the knowledge that validates our ideas often stays online rather than being incorporated into our memory. When it comes time to have any meaningful debate or discussion, we feel assured that we’re right even when we’re unable to support our argument.

We’re right, they’re wrong

People have always drifted towards other people who think and behave similarly. “Homophily, where we hang out with people like us, is an ancient human trait, resulting from our basic psychology. That applies to segmentation of media as well,” says Tom Stafford, a cognitive scientist at Sheffield University. Opposites, in this sense, do not attract. Social networks, while opening up a new world of communication and sharing, have inadvertently given people the opportunity to form insulated communities, and to reinforce an already problematic confirmation bias.  

Throughout most of our history we have formed small, coordinated groups that depended on people sharing the same values and beliefs. Whether such cohesion can exist in groups numbering in the billions remains to be seen. What we can be sure of is that more content will be published every day, and a big challenge will be getting good, truthful, objective information to spread across this network and into the attention of as many people as possible. 

Social media aided Wael Ghonim in making the Arab Spring possible by giving him a platform to organize people around an idea. Since then, people have grouped around many ideas, some of which are beneficial, others based on nonsense and erroneous assumptions. “Five years ago, I said, ‘if you want to liberate society, all you need is the Internet,’” recalled Ghonim. “Today I believe if we want to liberate society, we first need to liberate the Internet.”