In remarkably short order, over just the last 25 years, the successive arrivals of the broadband internet, social media, and the smartphone have changed the way we live. The constant, enveloping presence of digital media has given a new texture and tempo to our days, and it has dramatically altered the way we inform ourselves and converse with others. Not since the spread of electrification a century ago have we seen a technological phenomenon with such far-reaching personal and social consequences.
The entire edifice of digital media has been constructed on two assumptions. The first conflates information and knowledge: if we give people more information more quickly, they will become smarter, better informed and broader-minded. The second conflates communication and community: if we provide people with more ways to share their thoughts, they’ll become more understanding and empathetic, and society will end up more harmonious. Deeply idealistic, the two assumptions have been fundamental to the ideology and business strategy of Silicon Valley – they explain much about the way online experience has evolved – and by a sort of cultural osmosis, they have also come to be broadly held by the general public.
The only problem is, both assumptions are false. We are now, as individuals and as societies, paying the price for being seduced by a pair of utopian myths.
Let’s look first at the confusion of information with knowledge. It’s easy to understand the expectation that more information would inevitably lead to more knowledge. Information is, after all, the raw material out of which personal knowledge is formed. The more of it that is available to us, the more our minds have to work with. But when it comes to evaluating the cognitive and intellectual effects of an informational medium, we need to consider not just how much information the medium supplies but also the way the medium supplies it. How information is delivered to us has a profound influence on our brain’s ability to transform that information into true knowledge.
Information becomes knowledge only when we transfer it from short-term memory (the mind’s notepad) to long-term memory (the mind’s filing system). Through this complex process, which brain scientists call memory consolidation, a new piece of information gets connected to all the other information we store in our heads. It’s these connections, or associations, between pieces of information, not the individual pieces themselves, that give depth to our thoughts. The connections form the essence of our intellect, enabling us to think conceptually and critically, to solve difficult and unexpected problems, and to make leaps of inference and imagination. The richer the web of connections, the sharper the mind.
Memory consolidation is a fragile process. It demands, brain science makes clear, attentiveness. If we’re distracted or interrupted while taking in new information, the mind struggles to weave it into our store of knowledge. Either we forget the information entirely, or we connect it only weakly to the other things we know. A smartphone, as anyone who owns one knows, is a distraction machine. Through its constant stream of messages, alerts and notifications, it dispenses a welter of information, usually in short, overlapping bits. The barrage of information breaks our concentration and fragments our attention. The distractions are particularly pronounced with social media apps like Facebook, Twitter and Snapchat, which are carefully designed to encourage continuous information snacking and to discourage any sustained mental focus.
The paradox of digital media is that, even as it provides us with broader and faster access to information than we’ve ever had in the past, it dispenses the information in ways that impede the fundamental brain processes required to build personal knowledge. More information, we’re now learning, can actually lead to less knowledge.
A growing body of scientific evidence reveals the debilitating cognitive effects of digital media. In one seminal study conducted nearly ten years ago, researchers at Stanford University gave a battery of basic cognitive tests to two different groups of people: one group spent a lot of time online; the other used digital media only occasionally. The heavy users performed significantly worse on all the tests. They were more easily distracted, had less control over their attention, and were much less able to distinguish important information from trivia. ‘Everything distracts them,’ observed Clifford Nass, the professor who led the study.
The Stanford study was performed when people still used laptops and desktops as their main devices for going online. More recent studies have examined the effects of smartphones on cognition. They paint an even darker picture. In one study, published in 2017, researchers from the University of Texas and the University of California recruited more than 500 test subjects and gave them two standard tests of intelligence. One test gauged ‘working memory capacity’, a measure of how fully a person’s mind can focus on a particular task. The second assessed ‘fluid intelligence’, a person’s ability to interpret and solve an unfamiliar problem. The only variable in the experiment was the location of the subjects’ smartphones. Some of the participants placed their phones in front of them on their desks; others stowed their phones in their pockets or handbags; still others left their phones in a different room.
The results were striking. In both tests, the subjects whose phones were in view posted the worst scores, while those who left their phones in a different room did the best. Those who kept their phones in their pockets or bags came out in the middle. As the phone’s proximity increased, brainpower decreased. A second experiment conducted by the researchers produced similar results, while also revealing that the more heavily people rely on their phones in their everyday lives, the greater the cognitive penalty they suffer.
In an article on the research in an academic journal, the scholars wrote that the ‘integration of smartphones into daily life’ seems to cause a ‘brain drain’ that weakens such vital mental skills as ‘learning, logical reasoning, abstract thought, problem solving, and creativity’. Smartphones have become so entwined with our day-to-day existence that, even when we’re not peering or pawing at them, they tug at our attention, diverting valuable cognitive resources. Just suppressing the desire to check our phone, which we do routinely and subconsciously throughout the day, can debilitate our thinking. The fact that most of us now habitually keep our phones ‘nearby and in sight,’ the researchers noted, only magnifies the mental toll.
The findings are in line with other recently published research. In a similar but smaller 2014 study, psychologists at the University of Southern Maine found that people who had their phones in view, albeit turned off, during two demanding tests of attention and cognition made significantly more errors than did a control group whose phones remained out of sight.
In another study, published in 2017, researchers examined how smartphones affected learning in a large lecture class at the University of Arkansas. They found that students who didn’t bring their phones to the classroom scored a full letter-grade higher on a test of the material presented than those who brought their phones. It didn’t matter whether the students who had their phones used them or not: all of them scored equally poorly. A recent study of nearly 100 secondary schools in Britain found that when schools ban smartphones, students’ examination scores go up substantially, with the weakest students benefiting the most.
The multifunctional smartphone, with all its intoxicating streams of social information, has become such a powerful mental attractant that it distracts us whenever it’s nearby – whether we’re using it or not. Even when stowed in a pocket, it disrupts the mind’s ability to turn information into knowledge.
It’s useful to compare digital media with an earlier form of information delivery, one that until the rise of networked computers was the dominant informational medium: the printed page. A printed page of text serves, literally as well as figuratively, as a shield against distraction. Because nothing competes with the words, the printed page focuses the mind, in effect training us to be more attentive. It provides an aid to memory consolidation and, in turn, to knowledge development. The text on a printed page may be the same as that on a smartphone screen, but the intellectual effects of reading the text could hardly be more different.
Now let’s consider the second assumption – the one that confuses communication with community. This myth has a long tradition in contemporary Western thought. Ever since the building of the telegraph system in the 19th century, people have believed that advances in communication technology would promote social harmony. The more we learned about each other, the more we would recognise that we’re all one. Community and communication would advance hand in hand, like a pair of hippies on a San Francisco sidewalk. A New York Times columnist, in an 1899 article celebrating the laying of transatlantic Western Union cables, expressed the popular assumption well: ‘Nothing so fosters and promotes a mutual understanding and a community of sentiment and interests’, he wrote, ‘as cheap, speedy, and convenient communication.’
The great networks of the 20th century – radio, telephone, television – reinforced this sunny notion. Spanning borders and erasing distances, they shrank the planet. Guglielmo Marconi declared in 1912 that his invention of radio would ‘make war impossible, because it will make war ridiculous’. AT&T’s top engineer, J.J. Carty, predicted in a 1923 interview that the telephone system would ‘join all the peoples of the earth in one brotherhood’. The world wars and other depravities of the 20th century did little to dampen the general faith in the benevolence of communication networks.
Social media companies embraced the myth wholeheartedly, using it to portray their businesses as socially salubrious. Six years ago, as Facebook was preparing for its initial public offering, Mark Zuckerberg, the company’s founder and CEO, wrote a letter to would-be shareholders in which he explained that his company had a higher calling than just making money. Facebook was pursuing a ‘social mission’ to encourage self-expression and dialogue among the masses. ‘People sharing more’, the young entrepreneur wrote, ‘creates a more open culture and leads to a better understanding of the lives and perspectives of others.’
If our assumption that communication technology brings people together were true, we should today be seeing a planetary outbreak of peace, love, and understanding. Thanks to the internet and cellular networks, humanity is more connected than ever. Of the world’s seven billion people, six billion have access to a mobile phone. That’s a billion and a half more, the United Nations reports, than have access to a working toilet. More than two billion people are on Facebook, more than a billion upload and download YouTube videos, and billions more converse through messaging apps like WhatsApp and WeChat.
Yet we live in a fractious time, defined not by concord but by conflict. Xenophobia and authoritarianism are on the rise. Political and social fissures are widening. From the White House down, public discourse is characterised by vitriol and insult. As for Facebook and other social networks, they have been revealed to serve as conduits for propaganda and hate speech. Far from bringing us together, they seem more likely to polarise us.
We shouldn’t be surprised. For years now, psychological and sociological studies have been casting doubt on the idea that communication dissolves differences. The research suggests, in fact, that the opposite is true: free-flowing information makes personal and cultural differences more salient. It tends to turn people against one another instead of bringing them together. ‘Familiarity breeds contempt’ is one of the gloomiest of English proverbs. It is also, the evidence indicates, one of the truest.
In a series of experiments reported in the Journal of Personality and Social Psychology in 2007, three Harvard psychologists found that the more we learn about someone else, the more we tend to dislike that person. ‘Although people believe that knowing leads to liking,’ the researchers wrote, ‘knowing more means liking less.’ Worse yet, they found evidence of what they termed ‘dissimilarity cascades’. As we get additional information about others, we place greater stress on the ways those people differ from us than on the ways they resemble us, and this inclination to emphasise dissimilarities over similarities strengthens as the amount of information accumulates. On average, we like strangers best when we know the least about them.
An earlier study, published in 1976, revealed a similar pattern in real-world communities. Scholars from the University of California at San Diego studied a condominium development near Los Angeles, charting relationships among neighbours. They discovered that as people live more closely together, the likelihood that they’ll become friends goes up, but the likelihood that they’ll become enemies goes up even more. The scholars traced the phenomenon to what they called ‘environmental spoiling’. The nearer we get to others, the harder it becomes to avoid evidence of their irritating tics and habits. Proximity makes differences stand out even more than similarities.
The effect intensifies in the virtual world, where everyone is in everyone else’s business all day long. Social networks like Facebook and Twitter encourage constant self-disclosure. Because status is measured quantitatively online, in numbers of followers and friends, retweets and likes, people are rewarded for broadcasting endless details about their lives and thoughts through messages, updates and photographs. To shut up, even briefly, is to disappear. One study found that people share four times as much information about themselves when they converse through computers as when they talk in person.
Being exposed to this superabundance of personal information can create an oppressive sense of ‘digital crowding’, a group of UK researchers wrote in a 2011 paper, and that in turn can breed stress and provoke antisocial reactions. ‘With the advent of social media’, they concluded, ‘it is inevitable that we will end up knowing more about people, and also more likely that we end up disliking them because of it.’ It turns out that the old etiquette books were right: the less we talk about ourselves, the more likely other people will enjoy our presence.
In his 1962 book The Gutenberg Galaxy, the celebrated media theorist Marshall McLuhan gave us the memorable term ‘global village’ to describe what he called the world’s ‘new electronic interdependence’. Most people took the phrase optimistically, as a prophecy of inevitable social progress. What, after all, could be nicer than a village? But, despite his occasional utopian rhetoric, McLuhan himself harboured few illusions about life in a global village. He saw villages as inherently tribal, marked by mistrust and friction and prone to viciousness and violence. ‘When people get close together, they get more and more savage and impatient with each other,’ he said in a 1977 television interview. ‘The global village is a place of very arduous interfaces and very abrasive situations.’ That’s a pretty good description of where we find ourselves today.
The problem with Silicon Valley’s two formative myths goes beyond their denial of human nature. They reinforce the idea, particularly prevalent in American culture, that technological progress is sufficient to ensure social progress. If we get the engineering right, our better angels will triumph.
That’s a pleasant thought, but it’s a fantasy. Progress toward a more amicable world and a more broad-minded populace will require not technological magic but concrete, painstaking, and altogether human measures: negotiation and compromise, a renewed emphasis on civics and reasoned debate, a citizenry able to appreciate contrary perspectives and to think deeply about complex challenges. It will require less self-expression and more self-examination.
Technology is an amplifier. It magnifies our best traits, and it magnifies our worst. What it doesn’t do is make us better people. That’s a job we can’t offload on machines or hand over to technologists.
This essay originally appeared under the title ‘Nasty, Brutish and Dim: Online Life Reconsidered’ in ‘Knowledge and Information: Perspectives from the Engelsberg Seminar’, Bokförlaget Stolpe, in collaboration with Axel and Margaret Ax:son Johnson Foundation, 2018.