Why 16 billion cortical neurons are not enough

Humanity has come quite some way in the past 200,000 years, but are we really anything more than primates with a few billion more neurons than our closest relatives?

Primitive stone age tools, found in the Makgadikgadi Pan in Botswana. Credit: Martin Harvey via Getty Images

I am in awe of technology. As I travel the world for work, I can communicate in real time with colleagues in the United States and family in Brazil from almost anywhere through devices placed in orbit by man that relay our voices and messages. I jump on a metallic bird and soar the skies for over 100,000 miles a year, mostly sound asleep, to exchange recently acquired bits of information and newly formed knowledge with peers and the public in different countries. If I don’t speak the local language, I can ask a small handheld metal gadget for help – one that gets the information I need through waves that surround my body but are invisible to the eye. Hardly any physical effort is required of me to go places or have my voice carried over, much less my own understanding of how airplanes, satellites and antennas work, or how to put one together.

How did we, and we alone, arrive at our current state of cognitive prowess sufficient to create and accumulate such technology? The short answer, both directly and indirectly, is biology. We may not like to think of ourselves as primates, much less as animals, but that’s exactly what got us to where we are now. We are the primate species which invented a kind of cheat that allowed us to develop the brain with the most cortical neurons – while remaining just that: primates.

Neurons are the information processing units of the brain – our mental Lego blocks, as it were – and, as with Lego, the more blocks that are available, the more complex and intricate the assembled building can be. Not all neurons are made the same; their function depends on their pattern of connections, which defines from where they get their information, and to where they send the result of their receiving, combining and processing that information. Some neurons in the brainstem mostly receive signals from the body, and pass that on to other neurons that mostly make things happen in the body, whether muscular or chemical actions. Strictly speaking, those are the neurons that operate the body. But neurons in the cerebral cortex, riding on top of the brainstem, get a copy of everything that comes in, and are organised in a way that allows those copies to reverberate, bounce around, echo, and form associations with each other – all before intervening in the actions of the body and modifying them. Acting on the rest of the brain, the cerebral cortex has an opportunity to endow the actions of the individual with complexity and flexibility. Without a cerebral cortex, one lives ever in the present; but whoever has a cerebral cortex gains a record of the past and the capability to use the past to forecast and prepare for what lies ahead.

Whoever has the most neurons in the cerebral cortex, then, should have the most flexible and complex cognition. That species, it turns out, is us, even though the human brain is not the largest one around. Because our brain is built like any other primate brain, it shares the primate advantage of packing many more neurons into a given volume than non-primate brains manage. It thus follows that, as the largest-brained primate, we have the most neurons in the cerebral cortex of any species: 16 billion of them, on average. That, as we have established in my lab over the last ten years, is three times as many neurons as the African elephant harbours in its cortex, despite that cortex being twice the size of ours.
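To make the packing argument concrete, here is a minimal back-of-envelope sketch in Python. The function, densities and exponents are my own illustrative placeholders, not figures from the author's papers; they only encode the qualitative rule that a primate-built cortex keeps adding neurons roughly in step with its mass, while a cortex built to a non-primate rule adds mass faster than it adds neurons.

```python
# Illustrative sketch, not the author's published model: how different
# neuron-packing rules change what a cortex of a given mass can hold.
# The densities and exponents below are rough placeholders for the idea
# that primate cortices gain neurons roughly in proportion to mass,
# while non-primate cortices grow in mass faster than they gain neurons.

def cortical_neurons(cortex_mass_g: float, neurons_per_g_at_1g: float,
                     exponent: float) -> float:
    """Neuron count under a simple power-law packing rule."""
    return neurons_per_g_at_1g * cortex_mass_g ** exponent

# Same hypothetical cortex mass, two packing rules:
mass = 1000.0  # grams, purely illustrative
primate_like = cortical_neurons(mass, 16e6, 1.0)      # linear packing
non_primate_like = cortical_neurons(mass, 16e6, 0.7)  # sublinear packing
print(f"primate-like rule:     {primate_like / 1e9:.1f} billion neurons")
print(f"non-primate-like rule: {non_primate_like / 1e9:.1f} billion neurons")
```

Under these made-up numbers the two rules start from the same density in a tiny cortex, yet at the same final mass the primate-like rule ends up with several times as many neurons – which is the shape of the advantage described above.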

How we came to be that primate species with the most neurons is an interesting story along the lines of “rich-gets-richer”. Our ancestors of some three million years ago had brains about the same size as the modern chimpanzee’s – which is to say, already with a respectable seven billion neurons or so (judging from the size of fossilised skulls), brains that we know to be capable of using tools and even crafting simple ones, like poking-sticks made out of branches. But unlike the modern chimpanzee, that ancestor had straightened up and walked upright on its hind limbs only, which cut in half the amount of energy required to go places, and therefore doubled the distance that they could roam in a day.

Going the distance is important for animals because it enhances the ability to find food. And food, which is crucial for keeping the brain functional (it is the second most expensive organ in the body in terms of daily energy requirement after the liver), must have been at a premium. Chimpanzees spend six to seven hours a day eating; gorillas and orangutans, over eight – and if that is not sufficient time to gather and ingest enough calories, they simply lose weight. If human beings ate like other primates do, we would have to spend about nine-and-a-half hours a day eating, every day of our lives. Just imagine what your life would be like. Forget college, school, or work; finding food and ingesting it would consume most of your waking hours.
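The nine-and-a-half-hour figure comes from a budget of this general shape; the sketch below uses illustrative stand-in numbers, not the author's published values. It assumes the daily energy need is body upkeep plus a fixed cost per billion neurons, divided by how many kilocalories an hour of raw-food foraging can supply.

```python
# Back-of-envelope sketch of the feeding-time argument, with illustrative
# numbers (not the author's published figures). Daily energy need is taken
# to be body upkeep plus a fixed cost per billion neurons, and raw-food
# foraging is assumed to yield a roughly constant number of kcal per hour.

def raw_feeding_hours(body_kcal_per_day: float, neurons_billions: float,
                      kcal_per_billion_neurons: float = 6.0,
                      raw_kcal_per_hour: float = 250.0) -> float:
    """Hours per day of raw-food feeding needed to cover the energy budget."""
    total_kcal = body_kcal_per_day + neurons_billions * kcal_per_billion_neurons
    return total_kcal / raw_kcal_per_hour

# Rough comparison: a great-ape-sized budget versus a human-sized one.
print(f"gorilla-like budget: {raw_feeding_hours(1800, 33):.1f} h/day")
print(f"human, raw diet:     {raw_feeding_hours(2200, 86):.1f} h/day")
```

With these placeholder values the great-ape budget lands around eight hours of feeding a day and the human raw-food budget well beyond that – the point being not the exact figures but that, on a raw diet, more neurons can only be paid for with more hours of chewing.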

What changed our evolutionary history – perhaps aided by the bipedal posture that freed the hands to craft and carry and allowed the body to go twice the distance – was that more and more neurons became affordable to our ancestors. With enough energy available and hands free to manipulate, our ancestors of some three million years ago started carving tools out of stone that could kill prey and cut through it much faster than their still respectable teeth could; they could then also modify that meat and whatever else might serve as food, crushing, cutting and pounding, before ingesting it. Our ancestors of three million years ago thus invented the first human technology, stone tools, and put them to use to craft the second: cooking.

In the modern world, cooking is associated with heat, but the word should really be applied to any kind of food preparation, whether or not it involves heat. Cutting, crushing, mincing, pureeing, juicing or tenderising with acids are all types of cold cooking and they achieve the same important modification: they greatly increase the amount of energy that the body can obtain from the same food. Using stone tools on meat or roots was therefore the earliest form of cooking, a technology that reduces the amount of time and effort required to chew and then swallow the food, which has been made softer. This process increases the energy that is transferred from food to the body by 30 to 100 per cent, as once completely digested (instead of swallowed in chunks) every last bit of cooked food is captured by the body. If anything distinguishes us, it is that we are the only species that pre-digests its food so thoroughly. The extra energy harnessed by that early technological trick seems to have shaped our evolutionary history, liberating our crafty ancestors from the energy constraints that continue to apply to all raw-food-eating primates, and affording them an unprecedented number of energy-hungry neurons in the cerebral cortex. With more cortical neurons came greater cognitive capabilities, but cooking was the gift that kept on giving, for once laborious and wasteful chewing became easy and efficient, those newly available neurons also had a benefit of their own to enjoy: free time.

Cooking must thus have given us both the brain and the time that allows for taking an interest in problems, noticing patterns, elaborating hypotheses about the world and setting out to test them, as well as crafting tools that help solve those problems faster. Our biology is capable of creating technology, whether in the form of knowledge, methods, or instruments. But left to its own devices, that brain would have to learn everything from scratch, over and over again. The largest number of cortical neurons of any brain would not suffice if young human beings had to retrace the steps of every one of their ancestors and their discoveries before finally adding their own discoveries to the list: from stone tools, fire, and spears all the way to the latest smartphone and jet engine. That every new generation does not have to start over is a testament to another achievement of our biology: longer lives that foster the cultural transfer of science and technology down the generations, be it through direct personal instruction, through mentoring or schooling, or through the indirect handing down of ideas and know-how in physical repositories of knowledge. However it happens, the increased number of cortical neurons brings (as for other warm-blooded species) extended childhood and even more years of life after sexual maturity, during which each generation has an opportunity to reanalyse what was passed on to it, to add more knowledge to that corpus, and to create a new synthesis that can be transmitted to the next generation.

Cultural transfer, or learning from others, is not just useful, but fundamental. Lego blocks have to be assembled before the whole conveys any meaning, and our brains are no different. As adults, already accomplished creatures, we easily forget that we started life full of possibilities, just as a pile of Lego blocks can be used to build almost any shape but, at that point, is not much more than a promising yet disorganised heap of building blocks. Moreover, most of the “assembly instructions” for the brain, the information that shapes neurons into complex meaningful networks that can achieve wonders, are not even written in the genes (there are not enough of them for that), but assimilated through experiencing the environment.

As we go about life, the brain interacts with the world and learns by trial and error what and who is important, and what works or doesn’t. Unlike plastic Lego blocks, it undergoes self-organisation, which comes naturally to our neurons and to biology as a whole. Those blocks that assemble circuits which chance to make something useful happen are preserved, the connections that stitch them together strengthened; others, found to be useless, are destroyed and their materials repurposed, as new circuits form haphazardly, generating new possibilities that will be exposed to the test of use. The brain is born with about as many neurons in the cerebral cortex – that part of the brain which generates complexity and flexibility – as it will have as an adult, but until it gets there it undergoes a rollercoaster ride of gaining and then losing neurons, then gaining some more, throughout which process new circuits are tried and tested. And so, over time and as a function of life experiences, each of us develops a brain that has the capability of speech, but is shaped to learn only those particular languages that it hears; that has the capability of controlling the body – but becomes as clumsy or agile as we demand of it, and as is useful and meaningful for our lives. Our brain is thus born full of capabilities, but those have to be shaped into actual abilities by use: we are a combination of the brains we were born with, and what we make of them – and that takes a lifetime. It is no wonder that having a variety of opportunities in early life is so enriching. It is also no wonder that we become more and more ourselves over our lifespan, better and more finely tuned versions of what we grew up doing and thinking and feeling.

That is where science and technology and education as a whole come in, guiding our natural curiosity about the world into organised experimentation, in schools and labs and beyond them, generating knowledge which is then applied to new objects, systems and procedures that make problem-solving easier: technologies. This knowledge about the world and our bodies, harnessed by centuries of critical thinking and inquiry and systematic investigation, shapes our minds as it is passed on from generation to generation. Formal education and cultural transmission of know-what and know-how have played such a fundamental role in shaping those 16 billion cortical neurons into our able minds over the last million years that we need to be reminded of it.

This is why I am fascinated by post-catastrophe science fiction stories and their speculations about what life would be like if we still had the same biological brains as our ancestors from the last Ice Age, the time of early farming, ancient Greece or the medieval period probably already had – but no longer had the information to feed them. Any epidemic that eliminated some 90 per cent of the population would have devastating consequences that go way beyond the personal tragedy of lives lost. First would come the shock of living in a world where electricity and digital information trading no longer existed for lack of fuel. We would go back to producing and annotating knowledge with our hands and writing implements, to climbing long flights of stairs on our own, to being restricted to the immediate couple of miles of land – and communicating only with those around us.

Next would come the realisation that we depend on food supply chains that involve thousands of hands and minds; backyard farming would have to be learned all over again. Then, as useful objects break or wear out, would come the shock of discovering that, as well educated as we may have been, we no longer command the know-how nor have the skills required to craft our own technology. I have a PhD in neuroscience, but would not know how to fashion a pen or press paper on which to write. Perhaps I could mould new pots and plates out of clay, but I would have to figure out by painstaking trial and error how to fire them so they didn’t crack and spill my food. Making my own knives? Forget it. I have learned about steel forging, but I would not know how to operate a steel plant or forge my own blade into shape – much less mine the required raw materials when that time came. At best, I could probably crack some rocks into the right shape for use as hand-held sharp blades, like my Stone Age ancestors could. We do have the same underlying biology, after all.

We have become so good at putting our 16 billion cortical neurons to good use that, at this point, no one person could acquire, much less retain, all of the know-what and know-how that our species has developed. Humankind has far surpassed what any individual could achieve. And all the while, we have remained as much a primate as any other: our brain is no different from what could be expected of a generic primate with as many neurons as we have acquired.

Thanks to that biological advantage, we have become so accomplished at investigating our own world, generating and updating knowledge, passing it on and applying it to technological development, that we have become dangerously dependent on our advanced technology. It is a good problem to have – one that brings with it the safety of temperature-controlled housing, refrigerators, drugs and vaccines, anaesthesia and surgery, and the wonders enabled by metallurgy and electronics. But it relies completely on us remembering to use our brains to keep alive and well the science and the technology that shape our biological capabilities into the modern abilities that we prize so highly – and to pass them on through education. Just as I have learned to bow to my kitchen and knives in gratitude to our ancestors who came up with those first technologies, I have also learned to prize and respect teachers, professors and instructors in all crafts and fields of knowledge. Without them, we are on our own: biology unaided by culture.

This essay originally appeared in ‘Knowledge and Information – Perspectives from Engelsberg Seminar, 2018’, Bokförlaget Stolpe, in collaboration with the Axel and Margaret Ax:son Johnson Foundation.

Author: Suzana Herculano-Houzel