Friday, August 07, 2020

COVID-19 in the Evolutionary History of Infectious Disease: Will It Kill Liberal Globalism?

Will the COVID-19 pandemic kill liberal globalism?  

According to Curtis Yarvin, writing for the Claremont Institute's blog The American Mind, it should.  Shortly after the report of the first case of a coronavirus infection in the U.S. in late January--a Washington state man who had travelled to Wuhan, China--Yarvin wrote an essay entitled "RIP Globalism, Dead of Coronavirus," in which he claimed that the only sensible response to the danger of a pandemic reaching the U.S. was to stop all air travel across the Pacific and the Atlantic.  He recommended "suspending 1492," so that the two great hemispheres of the planet would be disconnected, just as they were before Columbus's voyage to the New World.  Not only that, but every country should be totally isolated forever, with no one permitted to travel into or out of any country.  All international trade should be stopped.  Every nation would be economically and culturally self-contained.  

He did not expect this to happen, because he saw that the internationalist belief in the goodness of our interconnected, globalized world is too strong to allow for the wisdom of an isolationist vision.  But he foresaw that the internationalist failure to close all national borders to the spread of the coronavirus would bring the death of millions of people around the world, which could lead more people to recognize the virtues of an isolationist world.  This could be something like Japan's policy of sakoku under the Tokugawa shogunate, in which both trade and travel across Japan's borders were generally prohibited.

Yarvin--who wrote under the name "Mencius Moldbug" for his blog Unqualified Reservations--is one of the most influential of the younger alt-right theorists who argue that liberal democracy has failed, and that what we need is the authoritarian rule of a monarch or a corporate CEO, following a policy of isolationist nationalism that scorns internationalist globalism (Tait 2019).  Yarvin was the person who recommended to Michael Anton that he read Bronze Age Pervert, another young alt-right thinker; and so this seems to have been a turning point in the move of the Claremont Institute towards the alt-right.  (I have written a series of posts on Anton and Bronze Age Pervert here and here.)

Yarvin is obviously right about how air travel in a globalized world has promoted the rapid spread of the COVID-19 virus around the world.  But I do not see how this proves that liberal globalism has failed, and that it needs to be replaced by a world of isolationist authoritarian regimes.  

We need to understand how the COVID-19 pandemic fits into the whole evolutionary history of infectious diseases from the Paleolithic to the present.  Ron Barrett and George Armelagos have written that history in their book An Unnatural History of Emerging Infections (2013).  In the first sentence of their book, they declare: "Microbes are the ultimate critics of modernity" (1, 115).  This introduces their Rousseauian epidemiological critique of modernity: the history of how human civilization moved away from the hunter-gatherer state of nature through the Agricultural Revolution and then the Industrial Revolution appears to be progressive improvement; but in fact, Barrett and Armelagos say, modern human beings suffer more from microbial infectious diseases than did their hunter-gatherer ancestors, which shows that the microbes--the bacteria, the viruses, and other microparasites--have evolved through mutation and reproduction to be the true masters of the earth.  This is an unnatural history because it moves away from the original natural condition of human beings as foragers in the Paleolithic.

That our modern globalized world is now suffering a catastrophic pandemic caused by a newly emerging coronavirus--one that has evolved to exploit the global interconnectedness of our world to infect us and kill us--might seem to confirm Yarvin's argument that the COVID-19 pandemic exposes the vulnerability of globalist modernity. 

But I will argue that this evolutionary history of microbial diseases should teach us that while infectious disease has plagued humanity throughout all of history, and while newly emerging infectious diseases continue to threaten us today, a modern liberal globalized world can reduce, although never fully eliminate, that threat, so that life in our modern globalized world can be generally safer and healthier for more people than ever before in history.  This will be true, however, only as long as we preserve the freedom of a liberal social order that allows for the innovation necessary to meet the challenges that come from infectious diseases.

We can see that history as passing through four eras--the prehistoric foraging era, the agrarian era, the era of the industrial revolution, and the contemporary globalist era.  To fully explain this history, we need to understand both the microscopic and the macroscopic determinants of human infections.  

At the microscopic level, we try to understand how bacteria, viruses, and other microparasites have evolved to succeed (or fail) as human pathogens that must parasitize human beings for their own survival and reproduction.  We should keep in mind, however, as I have indicated in a previous post, that virologist Marilyn Roossinck (2011, 2015, 2016) is probably right in suggesting that most bacteria and viruses are not harmful human parasites, because most of them have evolved to have either commensal (not harmful to the human host) or mutualistic (mutually beneficial) relationships with their hosts.  For example, most of the bacteria and viruses in the human gut are necessary for human health.

It is only in the last century or two that modern science has given us some understanding of this microscopic world.  Some ancient natural philosophers--like Lucretius--understood, however, that infectious disease can be caused by invisible pathogenic "seeds"--an intimation of the germ theory of disease.  I have written about this here and here.

At the macroscopic level, we can understand infectious diseases as social diseases, in the sense that they depend upon three social factors of human life--subsistence, settlement, and social organization.  The modes of human subsistence (such as foraging or farming), human settlement (such as nomadic bands or permanent urban living), and human social organization (such as egalitarian leveling or hierarchical classes) will influence our vulnerability to infectious diseases.


THE PREHISTORIC FORAGING ERA

For most of human evolutionary history, our ancestors lived as nomadic hunter-gatherers, in what the evolutionary psychologists call the "environment of evolutionary adaptedness" (EEA).  Their mode of subsistence was predominantly gathering wild plants and hunting wild animals.  Their mode of settlement was for small bands of individuals to set up temporary campsites for no more than a few days or months at a time, so that they could move their camp many times a year to find the best locations for hunting and gathering.  Their mode of social organization was to live in small groups where the adults were roughly equal, in that some individuals exercised informal leadership, but their power was checked by others who resisted any attempts at dominance.

The early modern political philosophers called this the "state of nature," and they disagreed about whether it was a state of peace and plenty or a state of war and poverty.  Thomas Hobbes declared that it was war and poverty, which made it a condition of desperate unhappiness.  Jean-Jacques Rousseau declared that it was peace and plenty, which made it the happiest condition for humanity.  Many social scientists today continue to take one side or the other in this debate.  So, for example, Marshall Sahlins said that Rousseau was right because our nomadic forager ancestors lived in the "original affluent society."  But others--like Steven Pinker--have said that Hobbes was right because our ancestors lived lives ruined by violence and scarcity.

Barrett and Armelagos say that the truth lies somewhere in between these two extremes.  But while they sometimes reject Sahlins' conception of the "original affluent society," they often endorse it and adopt a Rousseauian position (Barrett and Armelagos 2013, 1, 17, 22, 27-28, 111, 115).  They say nothing about John Locke's understanding of the state of nature, and so they do not consider my argument that Locke's account of the state of nature was mostly true, as confirmed by the anthropological evidence that we have today, and free from the mistakes of both Hobbes and Rousseau.  I have argued that here, here, here, and here.

The Rousseauianism of Barrett and Armelagos has a biblical dimension.  As the epigraph for their chapter on the Agricultural Revolution (29), they quote from Genesis 3:17-19: "I have placed a curse on the ground.  All your life you will struggle to scratch a living from it.  It will grow thorns and thistles for you, though you will eat of its grains.  All your life you will sweat to produce food, until your dying day."  Of course, this is God's curse on Adam and Eve after their sin and their expulsion from the Garden of Eden.  The implication is that the prehistoric foraging life was the Garden of Eden for humanity--the Paradise that has been lost.

Barrett and Armelagos are certainly Rousseauian in their claim that foragers were generally healthier and less susceptible to infectious diseases than the later human beings living in agricultural and urban industrial societies.  They give three reasons for this.  First, hunting and gathering produced a wide diversity of foods, and this nutritious diet strengthened the immune systems that protected foragers from pathogens.  Second, because they lived in widely dispersed small groups that never settled in one spot for long, acute infections that require large and dense host populations were not sustainable.  Third, since foragers shared their food and other resources equally, there was no lower class of impoverished, malnourished people who would be susceptible to infectious diseases.

I will not contest the last two points.  But I am skeptical about the first point.  They stress the importance of this point about nutrition among foragers: "Closely tied to human immunity, nutrition has always been our chief line of defense against infectious diseases.  Conversely, malnutrition is the chief determinant of immunosuppression worldwide" (20).

As one kind of evidence that foragers had a dietary diversity that supported their nutritional health, Barrett and Armelagos cite Kim Hill and A. Magdalena Hurtado's Ache Life History (1996) on the Ache foraging people of Paraguay.  During their forest-living period, before they had contact with outsiders, the Ache hunted 56 animal species and gathered 44 plant species, which Barrett and Armelagos see as showing a remarkable diversity in their diet.  Barrett and Armelagos are silent, however, about Hill and Hurtado's observations about food shortages and infectious diseases among the Ache.  They observe that the Ache and other foraging groups often complain about their hunger.  They also suffer from poor health: illness and disease account for about a fourth of all deaths.  They suffer from a variety of viral infections.  They show the symptoms of diseases such as malaria, dengue, amebic dysentery, and staphylococcal infections.  Hill and Hurtado conclude from this that Sahlins' "original affluent society" is a "farcical myth in modern anthropology" (320).

Barrett and Armelagos are enthusiastic proponents of the "Paleolithic Diet" argument of Stanley Boyd Eaton and Melvin Konner (Eaton and Konner 1985; Eaton et al. 1988).  The argument is that modern human beings are biologically adapted for the diet of their paleolithic ancestors, and that the mismatch between the modern diet and the paleolithic diet is responsible for the modern lifestyle diseases.  This has led a lot of people to try to revive their inner cave man by eating a "paleo diet" of plants and meats that might have been consumed by ancient foragers.

Barrett and Armelagos are silent, however, about the many devastating criticisms of this argument (Jabr 2013; Thompson et al. 2013; Turner and Thompson 2013; Zuk 2013).  First, modern humans have continued to evolve over the past 7,000 years, so that their dietary adaptations differ from those of their foraging ancestors.  As one example of this, the cultural history of dairying societies created an environment in which many people evolved a genetic mutation that allowed them to digest lactose--the sugar in milk--in adulthood.  In people without this mutation, the gene encoding lactase--the enzyme that breaks down lactose sugars in milk--shuts down after infancy when children are weaned from mother's milk.  Consuming milk as an adult is a Neolithic adaptation shaped by human niche construction.

A second criticism of the "paleo diet" is that today we don't have access to the foods that ancient foragers ate, because the plants and animals that we consume today are radically different from their ancient ancestral species: through artificial selection, the domesticated plants and animals of today have been transformed into forms that did not exist in the Paleolithic.  For example, cabbage, broccoli, cauliflower, brussels sprouts, and kale are all cultivars of a single species--Brassica oleracea--that has been altered by human selection.

A third criticism is that anthropological studies of foraging societies over the past two centuries show a great diversity and flexibility in their diets that depends upon their socioecological circumstances.  So, for example, the diet of Inuit (Eskimo) foragers who live on fish and sea mammals will differ from the diet of foragers in the tropical rain forests of South America.

A fourth criticism is that there is some paleoarchaeological evidence that ancient foragers suffered from some of the diseases that are often assumed to be products of our distinctively modern diet and lifestyle.  For example, there is some evidence for atherosclerosis--arteries clogged with cholesterol and fats--in some ancient mummies buried by hunter-gatherers (Thompson et al. 2013; Wann et al. 2019).

Barrett and Armelagos do not mention, much less answer, these criticisms.


THE AGRICULTURAL ERA AND THE FIRST EPIDEMIOLOGICAL TRANSITION

About 7,000 years ago, some people in the Tigris-Euphrates valley began to settle in permanent villages and to draw their food not just from foraging (hunting wild animals and gathering wild plants) but also from farming with domesticated plants and herding domesticated animals.  About 5,000 years ago, they began to form the first city-states (such as Uruk) that had formal governments with hierarchies of state authorities.  

This has generally been celebrated by historians and anthropologists as the Agricultural Revolution or the Neolithic Revolution--as the most progressive turn in human history because it allowed for urban civilization.  But Rousseauian anthropologists have lamented this as the biggest mistake in human history because human beings lost the freedom, equality, and healthy lifestyle of the foraging life, and it brought the tyrannical rule of kings, priests, and bureaucrats, constant warfare, oppressive taxation, slavery, and the emergence of acute infectious diseases.  I have written about James Scott's version of this argument.  I have also written about Kent Flannery and Joyce Marcus as developing a similar Rousseauian argument.

Barrett and Armelagos reinforce that argument by claiming that the Agricultural Revolution brought a "domestication of pathogens" that caused an unprecedented increase in acute infectious diseases to the point that they became the primary cause of human death.  They identify this as the First Epidemiological Transition.

The domestication of pathogens means that human villages and cities based on farming created selective conditions for the evolution and spread of infectious diseases.  The human patterns of subsistence, settlement, and social organization created the circumstances favoring microorganisms that could leap from nonhuman animals to human hosts, and then sustain human-to-human transmission in human communities that were large, dense, and susceptible to infection.  The evidence for this can be found in the skeletons of Neolithic people showing the signs of malnutrition and infectious disease--particularly among low-status people.  Because people lived in permanent and crowded farming communities, in close proximity to their domesticated animals, microscopic parasites had many opportunities to jump from animals to human beings and then spread widely through those communities.

Prior to 1492, however, infectious disease pandemics could not become fully global, because there was little communication between the Americas (the New World) and the rest of the Earth.  But then Columbus' voyage brought a globalization of human disease ecology--for the first time in history, infectious diseases could move around the entire world.  When the Europeans introduced new infectious pathogens into the New World, tens of millions of indigenous American people died because they had no immunity to the new parasites.  It worked in the other direction as well--for example, syphilis from the New World spread quickly throughout Eurasia.

This globalization of infectious disease was the most harmful expression of the First Epidemiological Transition.  This is what made the global COVID-19 pandemic possible.  And that's why Yarvin recommends "suspending 1492."


THE INDUSTRIAL REVOLUTION AND THE SECOND EPIDEMIOLOGICAL TRANSITION

Up to the end of the 19th century, infectious diseases were the primary cause of death, and most of these deaths were in childhood.  Between 1800 and 1840, 64 per cent of children in London died before reaching age 25.  The common experience of parents burying their children forced people to ask deep questions about the meaning of love and death in a world where most children did not live to adulthood.  In a previous post, I have written about Charles Darwin's struggle to understand the death of his daughter Annie at the age of 10.  She died of tuberculosis, which was then called "consumption," and there was no cure for it; nor was there any understanding of how it was caused by bacteria.

But then by the early part of the 20th century, childhood mortality had dropped dramatically, and the average human life expectancy rose.  As a result, the world human population grew from about 800 million to 1.6 billion in 1910--and finally to almost 8 billion today.  During this time, chronic degenerative diseases (like heart disease and cancer) replaced infectious diseases as the primary causes of death.  

This is what Barrett and Armelagos identify as the Second Epidemiological Transition.  There is a debate over its causes.  But they think the primary causes were better nutrition and the "sanitary reform movement" that cleaned up the water supply and the food supply.

Beginning in the 1930s and 1940s, a long line of antimicrobial drugs--first the sulfa drugs and then antibiotics such as penicillin--began to save hundreds of millions of lives.  But then the overuse of antibiotics created a selective environment for the evolution of pathogens with antibiotic resistance, which is one of the causes of the Third Epidemiological Transition.


THE CONTEMPORARY GLOBALIST ERA AND THE THIRD EPIDEMIOLOGICAL TRANSITION

Within three years after the first use of penicillin as the "wonder drug" against bacterial infections, resistant strains of Staphylococcus aureus appeared in British and North American hospitals.  This began an evolutionary arms race in which the pathogens seem to be winning, because they evolve antibiotic resistance faster than we can develop new antibiotics.

Another contributor to this new epidemiological transition is that the unhealthy diets and lifestyles of people in the modern world make them prone to obesity, diabetes, cancer, and heart disease, and when this is combined with an aging population, ever more people are vulnerable to infectious disease.

The circumstances of the contemporary globalist era also create opportunities for the evolutionary emergence of new infectious pathogens.  These new pathogens usually originate in nonhuman animals--they are "zoonotic."  Some people have to hunt wild animals for food and money, and so they are prone to come into contact with infected animals; and on rare occasions, a pathogen can jump to a human host.  These newly infected people live in densely populated cities where the pathogen can spread by human-to-human transmission.  Some of the infected human beings may be international travelers who can carry the pathogen all over the world within a few days.  Finally, there is a large population of elderly people with pre-existing chronic diseases who are more likely to contract and spread the disease and also more likely to die from it.

All of these circumstances apply to the COVID-19 pandemic.

To be continued . . .


REFERENCES

Barrett, Ron, and George Armelagos. 2013. An Unnatural History of Emerging Infections. Oxford: Oxford University Press.

Eaton, S. B., and M. Konner. 1985. "Paleolithic Nutrition: A Consideration of Its Nature and Current Implications." New England Journal of Medicine 312:283-89.

Eaton, S. B., M. Shostak, and M. Konner. 1988. The Paleolithic Prescription: A Program of Diet and Exercise and a Design for Living. New York: Harper & Row.

Hill, Kim, and A. Magdalena Hurtado. 1996.  Ache Life History: The Ecology and Demography of a Foraging People. New York: Aldine de Gruyter.

Jabr, Ferris. 2013. "How to Really Eat Like a Hunter-Gatherer: Why the Paleo Diet is Half-Baked." Scientific American, June 3.

Roossinck, Marilyn. 2011. "The Good Viruses: Viral Mutualistic Symbioses." Nature Reviews Microbiology 9 (February): 99-108.

__________. 2015. "Move Over, Bacteria! Viruses Make Their Mark as Mutualistic Microbial Symbionts." Journal of Virology 89 (13): 6532-6535.

__________. 2016. Virus: An Illustrated Guide to 101 Incredible Microbes. Princeton, NJ: Princeton University Press.

Thompson, Randall, et al. 2013. "Atherosclerosis across 4000 Years of Human History: The Horus Study of Four Ancient Populations." The Lancet 381 (9873): 1211-1222.

Turner, Bethany L., and Amanda Thompson. 2013. "Beyond the Paleolithic Prescription: Incorporating Diversity and Flexibility in the Study of Human Diet Evolution." Nutrition Reviews 71 (8): 501-510.

Wann, L. Samuel, et al. 2019. "Atherosclerosis in 16th-Century Greenlandic Inuit Mummies." JAMA Network Open 2 (12): e1918270.

Yarvin, Curtis. 2020. "RIP Globalism, Dead of Coronavirus." The American Mind, February 1.

Zuk, Marlene. 2013. Paleofantasy: What Evolution Really Tells Us about Sex, Diet, and How We Live. New York: Norton.

2 comments:

Malcolm Kirkpatrick said...

Hunger or affluence? Both. When times are good, people get together, feast, make music, dance, pair off and reproduce. The population grows. Times get lean. People get together, choose sides, pick up sticks and rocks and kill until there's enough to go around.
Our evolutionary history leaves its mark. Envy is not adaptive in the mass society of the last 5,000 years. Before that? If the best hunter in my 30-member extended family looks like Tom Selleck and sings like Sam Cooke, the only way I get my genes into the next generation is to arrange a little accident. If 100 potters in my city of 20,000 farmers and tradesmen make prettier pots than I do, the homicide strategy won't work; I will get caught and killed before I make a big enough dent in the competition to affect my reproductive odds.
We have way more toxin-screening capacity than we need to process a safe diet. You can live with one kidney. You can donate 1/4 of your liver. Apparently we scavenged many ripe carcasses and chanced many strange fungi in our time on Earth.

Anonymous said...

This is a good blog. I would just add that half of foraging children die before reaching adulthood, and something like one in ten childbirths leads to the death of the mother.

-- Les Brunswick