How mechanical refrigeration changed the world

The elaborate infrastructure we have created in order to eat food before it rots is one of the great triumphs of modern civilisation.

Advertisement of a huge open refrigerator stuffed with food, with a little girl reaching for a pink cake, 1955. Screen print from a photograph. (Photo by GraphicaArtis/Getty Images)

Imagine a world without refrigeration or freezing. Ice would be a rarity, but more importantly, the perishable foods we eat every day would be much rarer, and the ones that we could eat would taste totally different. You can preserve perishable foods without refrigeration, but that requires strategies that affect the taste of the food. Smoking, drying, pickling and other traditional methods of food preservation have produced new foods that are extremely tasty in their own right. Other times the new creations that preservation methods have inspired are what we might call ‘acquired tastes’. But if there weren’t at least some people who liked a particular new creation, it’s unlikely anybody would bother to continue to preserve food that way. For some people decay equals fermentation. For others it’s rot. It’s all a matter of science and taste.

That last line is actually a paraphrase of the food scientist Kantha Shelke. Her point is that there is always an optimum point of palatability for any particular perishable food. Up to that point the food is ripening; after it, the food is decaying. Since it is difficult to time our meals to hit that optimum point every time we sit down at the table, we eat foods that are unripe or decaying all the time. In other words, whether food is fresh or not is not an either/or proposition. Palatability is a sliding scale.

That optimum point of palatability is also culturally relative. Sometimes people in particular cultures prefer to eat perishable food products before they reach the point when they begin to decay. (For example, as someone who has an aversion to brown spots, I think of bananas.) More often, people eat things that are past that optimum point because they believe that partially decayed foods taste better. Beer and chocolate and cheese are all examples of common foods that depend upon rotting for their very existence. However, that rotting is closely controlled.

Since even refrigeration cannot stop decay, the people who produce and market these foods ride the top of an inevitable natural wave in order to make something even better than what nature has provided. Controlled fermentation also became a way to deal with the inevitable surpluses of the harvest so that people would still have something to eat after that surplus had reached the end of its natural life. Preservation defied nature because it made the seasons less important – only the flavour would change after the food in question got preserved. Apples are still apples whether in spring or fall. Only the form in which people ate those perishables changed. Once mechanical refrigeration became a thing, cold storage made it possible to defy the seasons entirely by serving perishable foods out of season that still tasted much as they would have at their optimum point.

Many traditional forms of food preservation predate refrigeration of any kind. Drying, for example, dates from ancient times; so do salting and pickling. The gradual development of these methods (which depended to a great extent upon geography and climate) helped define the cuisines and cultures of people around the world. ‘Scandinavian and Russian people love sour tastes,’ explains the British TV presenter Sue Shephard, ‘while in Eastern Europe sharp vinegary tastes became popular. All over the world, poor people subsisting on dull, monotonous cereal-based diets made highly flavoured preserved sauces, pickles and relishes to pep up their meals.’ Preservation even expanded the number of things that people could eat, rendering previously inedible plants edible and sometimes making less healthy foods healthier.

Entirely new foods evolved more recently through the same kind of happy accidents that led to the improvement of preservation techniques over time. Dried pasta, for example, has fuzzy historical origins, but was obviously a successful wheat preservation technique. Bacon developed from the salting of the least-expensive parts of slaughtered pigs and has held a central place in the European diet for centuries. Polish bakers started making bagels in about 1500 to get around Jewish dietary restrictions. Their crusty outsides helped keep the inside of the bread soft and chewy. Dip them in hot liquid and they soften instantly.

Canning is a more recent food preservation technique and dates from the beginning of the nineteenth century. The Frenchman Nicolas Appert pioneered the practice in 1809, when he won a contest sponsored by Napoleon. While hardly foolproof in its early stages, canning made it at least possible to feed hungry armies out of season. The early jars and later cans were inconvenient, and there was a constant threat of botulism during the technology’s early years. More importantly, canning was extraordinarily labour intensive, requiring hours of work – inevitably work performed by women – in order to save surplus food. In fact, most traditional food preservation techniques were labour intensive, requiring whole communities to join in for them to work at the necessary scale.

Chemical food preservatives, particularly those derived from coal tar, held the potential to keep food fresh without requiring substantial amounts of labour and without affecting the taste of the food the way that traditional techniques did. The use of these preservatives spiked in the decades between the 1870s, when the development of synthetic organic chemistry in Germany first made such preservatives available, and the late 1920s, when refrigeration became widespread. By 1900, the United States Department of Agriculture counted 152 patented chemical preservatives on the market.

Borax, a crystalline sodium salt of boric acid, is an example of an early chemical food preservative. It is sometimes known as sodium borate, sodium tetraborate, or disodium tetraborate. Today it is used as a household cleaner, a laundry detergent, a fire retardant, an ingredient in cosmetics, and even as a substitute for mercury to extract gold from gold ore. Before the late nineteenth century, people used it primarily for medicinal purposes. Doctors used borax to treat labour pains, menstruation troubles – even diarrhoea. ‘We find [borax] the very best cockroach exterminator yet discovered,’ reported Manufacturer and Builder in 1871. The same magazine reported that it was ‘perfectly harmless to human beings’, which explains why people were willing to consume it themselves, even if only as a by-product of the food preservation process.

American meatpackers began using the newly cheap borax during the 1870s, when they started exporting their products for the first time. People already knew borax was an antiseptic, so its use as a preservative made sense. Borax also allowed packers to significantly cut the time and expense needed for salting, pickling or smoking meat. Butchers used it to preserve chopped meat, but it was also an important way to preserve butter for export. At the turn of the twentieth century, all butter headed for England contained borax. Since few ships had good refrigeration, the butter never would have kept otherwise. Even when refrigerated shipping became more common, coating pork and beef with borax limited the damage caused by decay under inefficient refrigeration or at places along the food chain that had no refrigeration at all.

When Harvey W. Wiley, the head of the Bureau of Chemistry at the United States Department of Agriculture, began experiments to test the safety of common food additives in 1902, he started with borax because he thought it was ‘the least objectionable’ choice with respect to the health of his subjects. Over the course of his experiments, Wiley fed his subjects small doses of preservatives and then monitored every aspect of their bodily functions. Wiley’s subjects, mostly young male clerks who already worked at the department, could not so much as take a glass of water elsewhere without reporting it to Wiley.

Thanks to a reporter from the Washington Post, the participants in these experiments were dubbed the ‘Poison Squad’. While that sounds brave, the reporter’s tone in the articles was far from charitable: it was not hard to satirize a series of experiments that weren’t going particularly well anyway. Wiley had initially planned to give his subjects the borax surreptitiously by slipping it into the butter. When the clerks eating the meals realized this, they started eating far less butter. When Wiley switched to the meat and then the milk, the same thing happened. Eventually, Wiley gave in and started administering the borax in capsule form.

Because of all the attention that the Poison Squad experiments received, Wiley became famous. The experiments were nonetheless terrible science. For one thing, even though Wiley’s subjects consumed borax in small doses, those doses were far larger than any ordinary American would have consumed by eating meat or dairy products preserved with borax. More importantly, none of Wiley’s Poison Squad experiments used a control group. Although Wiley recorded the effects of the preservatives on his subjects, he had no untreated group against which to compare them.

While this may seem like a small failing, it was actually a basic failure of the scientific method. Since some subjects developed a tendency to avoid foods that they incorrectly guessed contained added borax, the same phantom tastes might just as easily have brought on phantom pains. That may explain why Wiley was practically alone in the world scientific community in condemning borax. Many similar experiments investigating the effects of borax on the human system completely acquitted the substance. Despite such problems, Wiley used the Poison Squad experiments as his basis for supporting the complete removal of borax from the food supply. It was banned with the passage of the Pure Food and Drug Act of 1906, a law basically written by Wiley.

On the other hand, Wiley also tested another common chemical preservative that was universally condemned as unhealthy: formaldehyde. Drop just a bit of the stuff in milk and it will stay fresh without refrigeration. Formaldehyde is poisonous in itself, but Wiley explained a further danger of milk preserved this way in a 1908 report: ‘the milk is prevented from becoming [sour] and thus indicating its age and the danger signal is thus removed, while other organisms which are capable of producing disease continue to multiply in the milk with the same rapidity as if the formaldehyde were not present.’ Unlike with borax, formaldehyde was banned from the US food supply with hardly any pushback.

Earlier methods of food preservation were generally very successful. Cheese, for example, can keep for up to a year if its component parts are completely separated. There were advantages to preserving apples in the form of hard cider – indeed, some people think apples are actually better that way. This is why pickles, dried fruit and canned vegetables all still have their place at the modern table. While chemical food preservatives still exist, the advent of mechanical refrigeration (starting right around the time that Wiley conducted his experiments) made them much less important. Had mechanical refrigeration never been invented, people around the world would still have earlier methods of preservation to keep their food from spoiling. But mechanical refrigeration offered the possibility of keeping food fresh longer, more efficiently and more effectively.

Mechanical refrigeration also helped feed people who might not have been fed if that technology had not existed. Thomas Malthus’ apocalyptic early nineteenth-century predictions about future population growth contained two possible outs for the marginal countries at that time: either they had to produce more food or acquire it from somewhere else. Refrigerated transport, like the transport of food in general, was (and still is) an important way for countries that could not produce their own perishable foods to acquire them from abroad. Many farmers across the world in the late nineteenth century began to industrialize their operations so that they could produce more and meet the demand of growing populations. Efficient, intensive mechanized agriculture would have been useless if there had been no means to transport the surpluses it created to the markets that needed the extra supply. Now nearly every country in the world has become a part of the global food chain – if not as a consumer then as a supplier.

Refrigerating engineers refer to the infrastructure that makes such paths of trade possible as cold chains. Cold chains are made up of the linked refrigeration technologies needed to preserve and transport perishable food from where it is grown to where it is eaten. Different businesses using different technologies gradually organized long chains to take both ice and perishable food across America and the world in order to make a profit. While such activities are now routine, they were extraordinary when they first began, and the only reason most people don’t stop to think about how extraordinary they still are is that the successful and efficient movement of perishable food is part of the background of everyday life. The most visible manifestation of the cold chain is the modern household refrigerator, but that is just its end point. Storage, transport – even the display cases in grocery stores – have to be reliably refrigerated for a cold chain to be effective, and the development of all these links took time and an enormous amount of technological innovation.

It is probably better, though, to think of the modern cold chain as a web of refrigerated transport paths for perishable foods of all kinds. For each perishable product there was a point of production where cold could be ‘created’. In 1806, the Boston merchant Frederic Tudor began harvesting ice from New England lakes, packing it onto ships and selling it around the world. The ice here was the product that the cold chain protected, and the good thing about ice as a commodity is that once some of it melts the remainder is still usable. Tudor’s experiments with slipping perishable food into his ice shipments failed. Perishable food required a more reliable source of cold; more importantly, that cold had to be mobile: if perishable food spoiled in transport, it would be unmarketable even if there was a means to refrigerate it once it arrived at its destination.

The first experiments in mechanical refrigeration were designed to keep meat fresh as it travelled between continents by ship, because meat was the most expensive perishable commodity available. Thanks to the development of refrigerated transport, many intercontinental trade routes for dressed meat existed during the late nineteenth and early twentieth centuries. American chilled beef quickly captured the English market during the mid-1870s: it was fresh, cheap and well preserved because the cold chain between Chicago and London was so effective. Australian mutton, on the other hand, took years to find a market in England (and even then it was a product consumed mostly by poor people) because Australia’s cold chain was so ineffective. British customers often found the product spoiled upon arrival. Sometimes whole shipments became inedible when the technology failed.

The trade in dressed meat by railway car proved much more lucrative. The Chicago meatpacker Gustavus Swift pioneered this technology during the 1880s. Instead of shipping live cattle to the East Coast, Swift began to slaughter them in Chicago and ship the tastiest parts east in ready-to-eat form. He was able to do this by lining up icing stations all along the railway from Chicago to New York and other cities along the eastern seaboard. When the ice melted, more was added. Swift’s railway cars were specifically designed to conserve as much ice as possible and to keep the ice away from the meat, since direct contact would turn the meat black. Ice remained the preeminent means of refrigerating railway cars in the United States into the 1950s.

The turn of the century saw another breakthrough in food preservation: cold storage. Mechanical refrigeration could keep perishable foods of all kinds cold so that products with natural seasons like eggs or apples could now be obtained at any time of year. American consumers initially fretted about the possibility that cold storage warehousemen were creating deliberate shortages in order to drive up prices. Foreign consumers were even more suspicious. In France, consumers were concerned about how cold storage affected taste, and they too had concerns about price. ‘The French,’ writes the geographer Susanne Freidberg, ‘suspected even short-term, small-scale cold storage, because it spared merchants from having to bargain over or liquidate stocks at the day’s end.’ Such sentiments were only possible in cultures in which people were used to buying their food daily.

Cold storage became increasingly popular as its effectiveness improved. The turning point was World War I, when cold storage increased food supplies by preventing waste. The increased use of cold storage was one reason that predicted food shortages never occurred. As a result, the use of cold storage around the world skyrocketed after the war (but nowhere quite as quickly as in the United States). Cold storage remains an important part of the modern cold chain, but even the best-kept food has its limits. Refrigeration does not make food edible indefinitely; it merely slows the rate of decay. Even frozen food can go bad.

All these perishable foods needed a place to rest once they entered people’s homes. The first appliances to serve that function were iceboxes. Literally boxes with ice in them, iceboxes came in many sizes and appeared in many American homes starting in the mid-nineteenth century. (Iceboxes were not widely available in most other countries because few other countries had the cold chains needed to keep those boxes supplied with ice.) While they were not the most effective weapons against rotting (if you opened them too often, for example, the ice melted faster), they did make it possible for American families to preserve leftovers for the first time, a revolution in eating all by itself.

The first electric household refrigerators arrived in the United States during the mid-1910s. The earliest models were very expensive and not very practical. One brand required cutting a hole in the kitchen floor so that the machinery could be kept in the basement and attached to the appliance by a belt. Other equipment often failed. The breakthrough in American refrigeration came with the General Electric Monitor Top, the first reliable, efficient, low-priced and quiet electric household refrigerator. By the end of World War II, more Americans owned refrigerators than not. By 1960, almost every home in the country had one. Gradually, refrigerators spread worldwide. Refrigeration was the one method of food preservation that everybody craved because it was so effective and easy to use. Just place your food in the fridge and decay can be staved off for an extended period with little effect upon taste. When a BBC reporter recently interviewed an Indian man who got the first refrigerator in his village, the man responded, ‘I can focus on finding more work and not worry about food for the family. My wife will get more free time and perhaps she can give me a hand as well.’ In other words, mechanical refrigeration changes not only what people eat, but also their entire lifestyle. That’s why refrigerators can serve as symbols of modernity itself.

Of course, food still gets preserved with chemicals, just as pickling, drying and canning persist too. ‘Food companies and health fanatics,’ explains Patrick Allan on the web site Lifehacker, ‘try to make chemicals sound like the bad guy. It’s a big part of why there’s so much confusion about what’s actually healthy and nutritious. Even the “man made” chemicals are often misconstrued as nasty stuff, but there are plenty that are just as safe as any other naturally forming chemical.’ Certainly people are not dropping dead from chemicals in their food the same way they once did. Formaldehyde in milk is no longer poisoning babies.

However, nobody doubts that fresh food is best for everyone who can eat it. The great value of refrigeration as a food preservation technique for people all over the world is that it allows eaters to consume food in as close to its original state as possible. Decay begins immediately after anything is picked or killed or cut, but the ability of refrigeration to slow decay means that we don’t have to put anything else in most perishable foods, whether it’s healthy or not. Besides, fresh food tastes better.

As a scholar of refrigeration, I find it particularly interesting that food preservation has in some ways come full circle. Perishable foods that were created to last longer are now being kept in refrigerators so that they can last longer still. Winia Mando, a Korean company, introduced the first kimchi refrigerator in 1995; such appliances are now in 81.3 per cent of Korean households, and some homes in Korea have two or three of them. Koreans developed kimchi, a condiment based upon fermented cabbage, so that the cabbage they grew would last longer, and they kept it in clay pots buried in the ground. Since most Koreans prefer their kimchi cold, mechanical refrigeration has now improved upon what used to be a necessity. Instances like this, of technology improving upon nature, are why technology is so important for changing people’s lives.

There are downsides to refrigeration. The flavour of refrigerated foods cannot always match what you can taste at the point of production. The energy expended upon refrigeration contributes to global warming (but then again, so does the methane emitted by wasted food in landfills). What matters is that refrigeration – like the traditional methods of food preservation that came before it – is here to stay. The problem of dealing with decay has determined not only how what we eat tastes, but also how we get much of what we eat in the first place. The elaborate infrastructure that the world has created in order to eat food before it rots is one of the great triumphs of modern civilisation. If you doubt that, ask somebody who lives in a country without widespread refrigeration whether they would want it or not, and see how quickly they answer in the affirmative.

This essay by Jonathan Rees was first published under the title ‘Everything Rots’ in Decadence and Decay: Perspectives from the Engelsberg Seminar (Axess Publishing, 2019).
