Is farming the root of civilisation?

Humans discovered agriculture far earlier than previously thought, so why did they give up on it?

A relief depicting ears of grain from the reign of Akhenaten, c1353-1336 BC. Credit: Lanmas / Alamy Stock Photo.

This essay originally appeared in ‘Civilisation: Perspectives from the Engelsberg Seminar’, Bokförlaget Stolpe, in collaboration with Axel and Margaret Ax:son Johnson Foundation, 2013.

When modern archaeology took shape as a discipline in the middle of the 19th century, one of its main achievements was to demonstrate the deep antiquity of our species: bones of humans were found with extinct animals in caves and in quarry gravels in sediments that the new science of geology demonstrated had to be far older than the date of 4004 BC, calculated by biblical scholars for the origins of humankind. By the 1860s, as excavations of prehistoric sites, such as burial mounds and caves, gathered apace, that ‘deep antiquity’ had been divided in Europe into a sequence of technological development: the three ages of stone, bronze and iron. The Stone Age was then further divided into a Palaeolithic, or Old Stone Age, when people used flaked tools and hunted animals long extinct, such as mammoth and woolly rhinoceros, in climates very different from today’s (the Ice Ages, or what came to be called the Pleistocene period) and a Neolithic or New Stone Age, when people used polished stone tools and kept domestic animals in environments similar to today’s.

In his Prehistoric Phases of 1872, Hodder Westropp also recognised a Middle Stone Age, or Mesolithic, when people were living in environments like today’s, but did not know about farming, and hunted animals of the European forests, such as red deer, wild cattle and wild boar. Like most of his contemporaries, he linked the emerging sequence of prehistoric phases with stages in the evolution of human culture. The Palaeolithic was the age of savagery when ‘man was scarcely distinguishable from the brute’. The Mesolithic was the age of barbarism, when man ‘lived as the tiger lives, catching his prey by his superior cunning, strength and pluck’. The Neolithic was the age of pastoralism, when ‘the cow yields him milk and the goat yields him cloth; yet he wins these requisites from them, not by murderous cunning, but by tender love’. Westropp thought that crop cultivation came later, in the Bronze Age, but by the 1880s, it was clear that animal and plant husbandry were both associated with the Neolithic. By the end of the 19th century, the discoveries of the wonderful painted caves of France and Spain showed that Upper Palaeolithic ‘savage’ hunters of the late Pleistocene were capable of remarkable artistic achievements, but nothing comparable was associated with the Mesolithic, and the Neolithic was uniformly recognised as the watershed in human cultural development, as the age of farming, because food production allowed people to settle down and develop complex culture: ‘Agriculture may be considered the most important step in the development of civilisation,’ Westropp wrote.

Present understanding of the origins of agriculture owes much to the concept of the ‘Neolithic Revolution’, proposed by the Australian prehistorian, Gordon Childe, in books such as Man Makes Himself (1936) and What Happened in History (1942). Writing primarily about the Near East, he argued that hunter-gatherers invented farming because it gave people a reliable food supply that allowed them to settle down, which, in combination with the possibilities farming created for producing surplus food, provided the springboard for global population growth and transformations in social complexity that led, within a few millennia, to urbanism (the Urban Revolution, in Childe’s phrase). He speculated that the context for hunter-gatherers experimenting with farming was probably changes to the climate at the end of the Pleistocene, with more arid climates forcing plants, animals and people close together in oases and river valleys (his ‘oasis hypothesis’), though archaeological expeditions to the Near East in the 1950s and early 1960s showed that the change to the modern climatic era (the Holocene) around 11,000 years ago was marked in the region by increased rainfall, rather than aridity. These expeditions were characterised by the involvement of environmental and archaeological scientists who were able to date their excavated sites by the new method of radiocarbon dating (dating organic materials such as charcoal) and collect direct evidence for agriculture in the form of animal bones (the new study of ‘archaeozoology’) and seeds (‘archaeobotany’).

The excavation of sites such as Jarmo in Iraqi Kurdistan by Robert Braidwood, Jericho in the Jordan valley by Kathleen Kenyon and Ali Kosh in Iran by Frank Hole and Kent Flannery showed that Neolithic farming communities had developed across the Near East by about 8000 BC. The people lived in villages of small houses made of packed mud, used polished stone tools, as well as flaked ones (pottery, thought by Childe to be a signifier of Neolithic life, was not used until a few thousand years later), grew crops such as wheat, barley and legumes and kept domestic animals – cattle, pigs, sheep, goats and dogs.

Comparable research in other parts of the world in the 1960s and 1970s indicated that rather similar ‘Neolithic farming communities’ developed in the millennia following the transition from the Pleistocene to the Holocene, using different combinations of plants and animals, such as rice and pigs in China and maize and other vegetables in the Americas. The consensus developed that the Neolithic Revolution began in a few ‘hearths of domestication’ – the Near East, China, the eastern part of North America, central America, Peru and, perhaps, the Sahelian region in Africa, with farmers then spreading out to adjacent regions, taking the new way of life with them.

The assumption that farming had obvious advantages over hunting and gathering was severely shaken by ethnographic studies of present-day hunter-gatherers, published in the late 1960s and 1970s. Even in a hostile environment like the Kalahari desert, the !Kung San hunter-gatherers living there were shown to have a secure food base and to work fewer hours a day than most subsistence farmers (they spent as much time gossiping as hunting or gathering!). So, in subsequent decades, archaeologists have theorised about the processes that might have persuaded foragers to become farmers, even though farming probably meant more work, a less varied diet and more disease, to say nothing of the social stresses of moving from group food-sharing to household-based production. They have variously proposed ‘push’ factors, such as climate forcing and/or population pressure and, more recently, ‘pull’ factors, such as social competition, or shifts in ideology, or combinations of all these. Whichever theory has been preferred, though, the unifying characteristic of most of them has been the assumption of a one-way journey from foraging (hunting and gathering) to farming: early food production – however initiated – promoted demographic growth, the accumulation of surpluses and, in time, the emergence of social complexity and inequality, much as Gordon Childe first argued in 1936 in Man Makes Himself. Also, most theorising has been based on the common-sense premise – usually implicit rather than explicit – that ethnographic (present-day) and ethno-historic (historically-recorded) foraging and subsistence farming societies are likely guides to how prehistoric societies behaved in the past. Recent archaeological research, however, is suggesting that transitions from foraging to farming (and I use the plural ‘transitions’ deliberately) were much more complex.

There is widespread evidence for modern humans (our own species, Homo sapiens) in the Late Pleistocene in many different parts of the globe, demonstrating subsistence practices that, in one form or another, presaged the later relationships to the landscape and the natural resources within it that we describe as agriculture. Modern humans as a skeletal type date back around 200,000 years in East and South Africa and one of the keenest debates in archaeology today is about when we started to behave in cognitively complex ways. We can’t know directly when complex language began, for example, but we can draw inferences from aspects of behaviour, such as the use of symbols. Although the most spectacular evidence for complex symbolism is associated with the first modern humans to reach Europe around 40,000 years ago (the cave art, decorated bone-work, elaborate burials and so on of the Upper Palaeolithic), in Africa there are indications of the use of ochre pigment (for body painting?) already by 100,000 years ago and of decorative shells for body ornament by 80,000 years ago. Around 37,000 years ago, a teenage girl was buried in Niah Cave in Borneo, accompanied by bright quartz pebbles, brought a great distance from elsewhere in Borneo, and there are fragments of human skull with red pigment on the inside that may have been part of rituals involving body painting. Furthermore, my own excavations at Niah have shown that the first people there, 50,000 years ago, had learned to exploit the rainforest for a variety of roots and tubers, fruits and nuts, removing the toxins in many of them by storing them in ash-filled pits and burning clearings and disturbed areas in the forest to enhance the growth of tubers and attract pigs to the clearings. Similar evidence has been found at Kosipe in New Guinea, dating to around 45,000 years ago, and at Ille Cave in the southern Philippines at the Pleistocene/Holocene transition.

As several scholars have proposed, the terms ‘arboriculture’ and ‘vegeculture’ usefully describe the variety of plant-management strategies practised by Late Pleistocene and Early Holocene foragers in Southeast Asia, thousands of years before the commonly assumed date of Neolithic rice farming. And on the Ice Age tundras of central and northern Europe, an increasing number of dog-like skeletal remains found at human occupation sites indicate that, by 30,000 years ago, Upper Palaeolithic hunters – the makers of the famous cave art of Chauvet, Lascaux and Altamira – may have started to domesticate wolves, because of the tamed animal’s value for assisting with hunting and tracking, for transport and as protection against predators, as well, indeed, as for companionship (in this respect it is interesting that there are examples of humans and dogs being buried together).

It is becoming clear that many Late Pleistocene and Early Holocene ‘foragers’ were engaging in cultivation and herding for long periods before their practices altered plants and animals in ways that can be formally recognised as domestication by archaeobotanists and archaeozoologists. In southwest Asia many Natufian (Late Pleistocene) and pre-pottery Neolithic (Initial Holocene) foragers engaged in an array of horticultural practices in their exploitation of wild cereals – preparing ground, planting, weeding, warding off predators, harvesting – that did not result in morphological or genetic changes to the plants that can be formally categorised as ‘domestication’. The development of cultivation technologies and systems in the New Guinea highlands was similarly protracted. In the African Sahel, thousands of years of harvesting sorghum, by beating or shaking the seeds off the head into a basket, did not promote the morphological changes of domestication. In China, Early Holocene foragers, who relied mostly on nuts and water chestnuts for their plant food, also collected, cultivated and harvested morphologically wild rice as a green crop, and there was a similar protracted history of millet domestication.

Modern-day politics have prevented new archaeological fieldwork investigating agricultural beginnings in many parts of the world since the 1970s, for example in Iraq, Iran, Pakistan, Kashmir and Afghanistan. However, analysis of DNA in modern populations of plants and animals, and of ancient DNA in plant remains and animal bones from archaeological sites, is indicating that animals and plants were domesticated in many more regions of the world than the orthodox model of a few ‘hearths of domestication’ envisaged, with multiple domestications apparent for wheat, barley, rice, millet, sheep, goats, cattle, pigs and (later) horses. In addition to the regions regarded as the primary ‘hearths of domestication’, such as the Near East, China, Mesoamerica and Peru, we can add central Asia, south Asia, Japan, the African Sahel, New Guinea and several parts of the Americas to this list. The growing archaeobotanical and archaeozoological evidence is corroborating the archaeogenetics. The archaeological record is also revealing far more complex trajectories of people’s changing relations with plants and animals in the opening millennia of the Holocene than is envisaged in most accounts of the adoption of agriculture by foraging populations. Some forager societies combined elements of the ‘Neolithic package’ (polished tools, pottery, animal herding, plant cultivation) for centuries or millennia before developing a significant commitment to agriculture. There are other examples of foraging societies that developed that commitment with remarkable rapidity, within a couple of generations. Both pathways are evident in Europe, one of the best-researched regions of the world for Neolithic studies, often in adjacent areas, and there are several regions, such as the Atlantic coast from Brittany to Denmark, where foragers and farmers were in contact with each other for centuries.
Some of these coastal forager societies lived in sedentary villages and were characterised by many of the social traits normally thought to be associated with farmers, such as surplus accumulation, feasting, formal cemeteries and small-scale warfare. Many ‘Neolithic agricultural societies’ depended more on wild foods than on farmed foods.

Also, there is a growing number of examples of experiments with managing plants or animals that failed to survive into modern farming. In the Near East, some plants that we regard as weeds appear to have been managed intensively and possibly cultivated, alongside the plants that eventually emerged as the major domesticates. In the Libyan Sahara, Mesolithic people seem to have tried to herd the wild Barbary sheep, an animal that had been hunted there for many millennia, before they turned to herding sheep and goats that they had acquired by trade with people in the Nile valley. In Borneo, foragers used sago as their plant staple for hundreds if not thousands of years, even though they were acquainted with rice. There are examples of people adopting animal and/or plant husbandry but then reverting to foraging, and of others making a journey from foraging to farming to foraging and back to farming. We may be seeing this at the Niah Caves, for example, where the body chemistry of skeletons buried in the caves 4,000–2,000 years ago shows a dietary change from foraging to farming and back to foraging, with rice tempers in pottery then indicating a switch to intensive rice farming a few hundred years ago.

Finally, whereas so much scholarship has focused on the change from foraging to farming in terms of its economic aspects, such as more efficient food production, greater propensity for surplus accumulation and the like, it is increasingly clear that motivations for acquiring domesticates were highly variable and rarely simply a matter of dietary stress or opportunity. At some of the earliest sites where ‘initial farming’ was practised in the Near East, for example, the cereal seeds, the flint sickle blades that would have been used to harvest them and the grindstones that would have been used to process them are found in highly ritualised contexts, suggesting that their use involved complex rituals and ceremonies. In Southeast Asia today, rice has a sacred or quasi-sacred status: its cultivation is highly ritualised, and growing and eating it are associated with status and prestige. Penan hunter-gatherers in Borneo, who are being encouraged by government to turn to rice farming, are suspicious of rice precisely because it ‘needs people to grow it’ and separates them from the forest – psychologically as well as physically – in ways quite different from managing sago or cultivating yams. Our work in Borneo has shown that the cultivation of rice may have been actively resisted for thousands of years by prehistoric ‘vegeculturalists’ and that it only became a food staple a few centuries ago, associated with the development of communal longhouse living. For many foraging societies, engaging in the cultivation of mysterious and magical new foods may have been at least as much about cultivating social relationships as about filling stomachs.

Ever since the Enlightenment, ‘domestication’ has been reified as the watershed between a truly human way of life separate from nature and an earlier way of life that was part of nature and not truly human or civilised. The existence of such a boundary has run as a continual thread through most archaeological theorising about the origins of agriculture, the main questions being, first, where and when in any particular region the boundary was crossed and, then, how and why. Yet the Western focus on separating humans from the rest of nature is just that: a particular notion of being in the world that is quite alien to many other societies. As the anthropologist Tim Ingold has written, many non-industrial and pre-industrial societies envisage the world not as we do, as sets of separate entities, such as people, plants and animals (and discrete species within these groups in the Linnaean sense), but as a web of relationships in which ‘species’ are defined in relation to each other, with the lives of people, plants and animals – and the physical landscape – intertwined in the playing out of social relations and worldviews. The archaeological record likewise is revealing complicated and ambiguous stories about foraging-farming transitions in many regions of the world that are very different to the traditional narrative of the Neolithic Revolution. People adopted, adapted or resisted the new ‘Neolithic’ technologies and foods in many different ways, at many different timescales and for widely differing reasons. Such decisions had many unintended consequences and, over the longue durée of prehistory, we can see that for many societies the eventual results were irrevocable changes to their social as well as physical landscapes and the transformation of once-optional additions to forager lifestyles into obligatory components of new ways of living.
Hodder Westropp may have been right in recognising agriculture ‘as the most important step in the development of civilisation’, but it is important not to impose our own notions of rationality on our prehistoric ancestors and create a past in our own image. As Peter Rowley-Conwy observed in the case of the European Mesolithic, ‘we know that agriculture was to appear a thousand years later; they didn’t.’

Author

Graeme Barker