You are not as clever as you think

The history of man is a story of mimicry and copying rather than innovation and 'light-bulb moments'.

The great American inventor Thomas Edison is surrounded by his creations. (Photo by Buyenlarge/Getty Images)

Human beings are the only species with a history. In the brief span of our existence (roughly the last 200,000 years of the 3.75 billion-year history of life on Earth), we have gone from being a species that made just a few simple hand-axes to one that now enjoys soaring cathedrals and breathtaking art, that can make spaceships, smart phones and driverless cars, and which is capable, Prometheus-like, of manipulating its own genetic code. Contrast this with our wretched cousins the chimpanzees who, while sharing with us over 98 per cent identity in the sequences of our genes, still sit on the forest floor cracking nuts with the same old stones as they have for millions of years. In fact, for all species but our own, history really is, as the English historian Arnold Toynbee once remarked, ‘just one damn thing after another’, and in the case of chimpanzees the same one thing at that.

Why this difference? Why do humans accumulate ideas, knowledge and technologies while the other animals are seemingly stuck doing the same thing over and over, never really getting any better? The chimpanzees tell us that the answer must be small in genetic terms but at the same time capable of opening up an unbridgeable gap in our evolutionary potentials.

It is a particular human conceit that the usual answer we give to this question is that we are simply cleverer than the other animals; that if we just think hard about something, we will eventually figure it out. Call this the ‘insight’ view of our abilities at innovation. It predicts that our raw wit and ingenuity will produce great leaps in the technological timeline, recording moments of innovation when humanity jumps up a step – or even two – on the technological ladder. It is the view of creativity captured by the ‘thought-bubble’ image of a lightbulb flashing on inside your head.

Of course, we are cleverer than the other animals, but the history of technological progress doesn’t include many lightbulbs. The historian George Basalla has observed that the romantic notion of technology advancing owing to the efforts of a small number of heroic inventors, whose insights produce revolutionary leaps with little or no link to the past, struggles to connect with what we observe when we look closely at their deeds. Throughout our history, innovation has turned out to be more a case of trial and error, copying others and chance good fortune than of searing insight.

Thomas Edison, usually credited with inventing the lightbulb, toiled over a two-year period to find a filament that would glow brightly for more than a few seconds. Edison’s journals record that he tested ‘no fewer than 6,000 vegetable growths, and ransacked the world for the most suitable filament material’ before finally settling, quite by accident, on a carbonised cotton thread he happened to be rolling between his fingers. James Watt is often credited with inventing the steam engine, but his design was a modification of an engine that Thomas Newcomen had developed. Children learn at school that Henry Ford created the assembly line, but he didn’t; he refined existing production methods.

In more modern times we credit the late Steve Jobs as the creative genius behind the Apple brand. Jobs achieved renown for the mouse-driven point-and-click operating system that revolutionised how we use computers. But rumours persist that Jobs took the idea from the Xerox corporation – whose premises he had toured – which had developed that operating system years before. Jobs’ genius was that, perhaps unlike the Xerox corporation, he recognised the value of this particular innovation.

Even one of the most revolutionary of scientists, Isaac Newton, remarked that he ‘stood on the shoulders of giants’. But we need not confine ourselves to the sciences and technology to be disappointed about our innovative prowess. Damien Hirst, the enfant terrible of the Young British Artists movement which arose in the late 1980s and endured well into the 21st century, has recently proclaimed that ‘I spot good ideas and steal them’. Cynics point out that Hirst’s habit extends beyond his actual ‘spot paintings’, and in keeping with this sentiment The Times (of London) asserts that ‘the multimillionaire artist has admitted that all his ideas are stolen.’ The sculptor Auguste Rodin allowed that ‘in my spare time I simply haunt the British Museum.’ He called it the ‘temple of muses’, taking particular inspiration from the 5th-century BC sculptor Phidias, whose efforts are thought to have included the remarkable Parthenon sculptures that the British Museum prefers to call the Elgin Marbles.

There is, then, a peculiar and cruel irony in a species that refers to itself as Homo sapiens, or the wise man. Innovation is hard and most of us, if we are honest with ourselves, are not very good at it. Think of things that have made a difference in the history of the world – the first hand-axe, the first spear, the first bow and arrow, the first fishhook or basket – and now ask yourself how many comparable ideas you have had. Or perhaps that sets the bar too high. Ask yourself how many ideas you have had that influenced others, something you did that others wished to copy, or that perhaps attracted a patent. Few of us can say we have invented much that has really made a difference to someone else.

Successful inventors and entrepreneurs are rare, and efforts to find them in reality television shows or produce them in the classroom seldom yield results. Our awareness of the value of ideas is illustrated in our reluctance to share them, whether they are old family recipes, knowledge of fishing lures, or scientific or business innovations, but also in the existence of our many patents and copyrights, in the prevalence of business espionage and industrial theft, and even in the insatiable appetites of companies for acquiring each other: it is often easier to buy or steal someone’s intellectual property than to create it yourself.

Today most of us live in a world full of objects we don’t understand and that are of such utter complexity that no one knows how to build them: objects like smartphones, spaceships and self-driving cars – even a single computer chip – rely on teams of people for their design, maintenance and production. Most of us are also confronted daily with questions we don’t know the answers to: which car is best, should I buy that house, what mortgage product should I get, which pension scheme is best for me?

How did we get here? That is to ask, despite our evident limitations at innovation, how have we managed to accumulate technology throughout our history, technology that has transformed how we lead our lives? The answer probably lies in a new form of evolution that humans introduced to the world when we arose as a species around 200,000 years ago. Until that time all evolution had depended upon genes, that is, upon parents handing down copies of their genes to their offspring. But humans introduced a second great form of evolution that works in parallel to but much faster than genes: the world of ideas. And that new form of evolution turned around and sculpted us, making us good at copying, even if not so good at innovation.

Here is why. Introducing ideas was a true form of evolution because, among human beings, a new idea can arise and if it is a good idea, it can be transferred quickly from one mind to another until everyone adopts it. This is why idea evolution is much faster than genetic evolution: whereas new genes can only be passed vertically from parents to offspring, ideas can leap ‘horizontally’ from mind to mind. You are born and must live your entire life with a single set of genes, but your mind can sample from a sea of evolving ideas throughout your lifetime.

The ability to choose among ideas, retaining the good ones and discarding the bad ones, is sometimes called our capacity for culture, and it is what really makes us human: surprising as it might seem, only our species seems to have the capacity to understand others’ intentions and then choose to copy the best of their ideas, objects or actions. If I watch you and someone else making a fishing lure and then notice that one of the lures works better than the other, I will copy the better one. Other animals don’t do this. And it is this small difference between us and the rest of the animals that created the unbridgeable gap in our evolutionary potentials: cultural – that is, idea – evolution took care of the rest as good ideas spread and accumulated, one on top of the other.

Copying good ideas makes us sound intelligent, but there is a twist: if I can observe other people’s innovations, I don’t need to be innovative myself. I can simply take my pick of their best ideas rather than attempting to create something on my own. For instance, if I am trying to make a better spear or hand-axe, I could make lots of different shapes and sizes, until I figure out by trial and error which one works well. On the other hand, if I notice that somebody else has made a good spear, I can simply copy it. Plus, the time and energy I save in copying someone else’s idea rather than trying to come up with my own might mean that I get to kill that moose or mammoth, or catch that fish, before they do.

Our capacity for culture leads to a startling conclusion: humans’ capability to recognise a good outcome when they see it, and to copy how that outcome was obtained, can produce technological progress without our species having any insight at all. If we can survey others, if we can sift through a range of others’ attempts at innovation, we will eventually find something that works, and this will be true even if these other attempts are, Edison-like, no better than random.

This is a long way from being the ‘wise man’. On the other hand, we might expect in practice that groups with a number of innovators in them would be able to outcompete groups lacking them, and so at least some of us will be innovators. But how many of us? Here again the answer is modest: intuition tells us that the number of innovators can be small because, in any given group, the rest of us can simply copy, steal or plagiarise their works. And, recall from our example of the spear, copying others can often be a better strategy than attempting to innovate on your own.

We can observe our tendency to be copiers rather than innovators whenever we are confronted with situations or questions we don’t know how to address. In those circumstances we typically do what others do. So, the answer to what pension product to get, or what mortgage to buy, which dishwasher is best or how to escape from a fire in a large building, is that we often just do what a majority of others do. And there is even a logic in this. In a harsh world, if an idea has survived long enough that large numbers of people are using it, it is probably a reasonable idea.

In most aspects of our lives, then, it could be said that most of us are little more than glorified karaoke singers, or ardent followers of ‘likes’. Our tendency to be copiers rather than innovators is why ideas and technologies accumulate gradually rather than in great leaps. In this regard, cultural evolution is surprisingly like genetic evolution. It has taken around a billion years of genetic evolution to move from single-cell organisms like yeast to the almost unimaginably complex multicellular organisms that we are. Genetic evolution has no foresight. It bumbles along by randomly producing a variety of forms and then relying on natural selection – survival of the fittest – to sift among the alternatives, retaining the best ones. As a consequence, like cultural evolution, the history of life also shows few big leaps.

We live at a curious time in the history of the world, when there are people living a Stone-Age existence in the depths of the Amazon rainforest – people who have little or no technology beyond bows and arrows, rudimentary clothing and simple shelters – at the same time as there are people wandering around city centres enjoying technologies the rainforest people would regard as magical and bewildering.

Why are these people so different to the city dwellers? It is not that they are less intelligent, imaginative or innovative: an infant plucked from one of these uncontacted peoples and brought up in modern society would be no different to you or me. The differences between ‘us’ and ‘them’ come down to the rainforest people not having much to copy. These uncontacted people have lived isolated from the rest of the world. Small groups of hunter-gatherers living in a single area simply do not produce enough new ideas to propel the cumulative cultural juggernaut most of the rest of us (and that surely includes anyone reading this essay) enjoy. If ever there was a demonstration of the power of idea evolution and our species’ reliance on others for ideas to copy, these Stone Age people provide it.

There is reason to believe that throughout our history prosperity has been linked to networks of connections among peoples. Palaeontological evidence tells us that 100,000 to 120,000 years ago, someone in present-day Algeria wore a decorative necklace of seashells. This alone might not be surprising except that this person lived 120 miles inland. Someone about 25 miles from the sea in Morocco did the same around 80,000 years ago. Both of these people would have been what we might today call influencers. Both would have been at the height of fashion and probably enjoyed the envy of others. And both of these people had their fashion objects because of someone else’s idea and because of networks of trade. Historians sometimes remark that the Mediterranean was, in classical times, the world’s first internet. Classical Greek and Roman societies might have owed at least some of their prosperity to their extensive trading links with the societies of the Mediterranean and beyond.

In our modern world, the pace of technological change is accelerating vertiginously. Just in the ten years running up to 2017, at least ten new technologies appeared that would have been fanciful a decade before: smart phones, social media such as Twitter, electronic book readers, household voice assistants such as Amazon’s Alexa, self-driving cars, consumer space travel, digital map applications, virtual reality, the so-called ‘cloud’ and 3D printers. Disruptive technologies such as digital cameras, taxi-hailing services and electronic currency transfer schemes have driven older technologies to extinction or nearly so.

Is this modern explosion of technology an indication that we have become smarter? Probably not. Ideas beget more ideas simply because the more there are, the more different ways there are for them to be combined to make new things. The same has been true of biological evolution: as life became more complex, natural selection had more to work with, and it has created a dizzying variety of species. But there is another reason for the acceleration of technological change. We are more connected than ever before, meaning there are more ideas floating around than ever before. Maps of Facebook friendship connections when plotted by their latitude and longitude draw a startlingly good map of the world. Airline routes plotted from take-off to destination do the same. And so do plots of international scientific collaborations.

All this connectivity means that we have access almost instantly to around seven billion other minds. It might have come at just the time we most need it. We are entering an era – the so-called Anthropocene, the era of human influence on the world – that will call for an even greater pace of innovation to keep up with developments such as climate change, resource depletion, overpopulation, ageing societies and the spread of resistant pathogens. We have to hope that the sharing of ideas that brought us to this point might just get us out of these predicaments. If we really aren’t as clever as we would like to think, we have no good alternative.

This essay by Mark Pagel was originally published in Knowledge and Information: Perspectives from the Engelsberg Seminar, Axess Publishing, 2018
