Tuesday, April 22, 2014

The Case For (and Against) Life After Death (4): Neuroscience, Consciousness, and Free Will

According to D'Souza, neuroscience shows that life after death is both possible and reasonable.
"Neuroscience reveals that the mind cannot be reduced to the brain, and that reductive materialism is a dead end.  The whole realm of subjective experience lies outside its domain, and outside the domain of objective science altogether.  Two features of the mind--specifically consciousness and free will--define the human soul.  These features seem to operate outside the laws of nature and therefore are not subject to the laws governing the mortality of the body.  The body dies, but the soul lives on." (Life After Death, 220)
There are three arguments here, which D'Souza elaborates in his book. The first is that while neuroscience can explain the objective physical reality of the brain (the structure and functioning of the brain's neural circuitry), it cannot explain the subjective mental reality of the mind (thoughts, emotions, decisions, and so on).  My brain states can be objectively observed by others.  But my mental states cannot, because they are private experiences that only I have.  Other people cannot have the direct access to my mental states that I have.  My brain states are located in space.  But my mental states are not.  My mental states can intentionally refer to things external to them--they are "about" something.  But my brain states don't refer to anything outside themselves.  I can be infallibly sure about my mental states: my thoughts might be mistaken, but I know they are my thoughts.  But I cannot be infallible about my brain states.  In all of these ways, brain states and mental states differ. Thus they cannot be identical, and mind cannot be reduced to brain.  Any attempt to reduce the mind to the brain is implausible, because it denies the self-evident subjective reality of mental experience. 

It is true that neuroscience can show that brain states and mental states are correlated, but that does not show that brain states cause mental states.  It could be that the brain receives or channels the mind, analogous to a radio receiving signals that are translated into sounds.

If this is so, then it's at least possible that mind could live on after the death of the brain.

The second argument is that consciousness has no physical or scientific explanation, because the subjective experience of self-awareness cannot be objectively observed.  Consciousness is an irreducible element of reality like matter and energy.  Even a scientific materialist like Steven Pinker must admit that "there is no scientific explanation of consciousness."

The third argument is that free will is another mysterious feature of mental experience that cannot be explained through natural laws of physical causality.  We all know that we have the power to make mental decisions and then execute those decisions through our brains and bodies.  Neuroscientists recognize this as neuroplasticity--that is, the mind can change the brain.  For example, people suffering from obsessive compulsive disorder (OCD) can be taught through cognitive therapy how to refocus their minds away from the compulsion and redirect their thoughts towards more desirable behavior.  When they succeed in doing this, neuroscientists can see that these people are changing the neurocircuitry of their brains.  This free will seems to be free from the causal determinism of the physical brain and body, and therefore it seems reasonable that the mental capacity for free will could live on after the death of the brain and body.

D'Souza rightly notes that some of the founders of modern neuroscience--such as Charles Sherrington, Wilder Penfield, and John Eccles--were dualists who were open to the possibility of the mind existing as a spiritual reality separated from the body.

I agree that mental experience, self-consciousness, and free will are all mysterious in their correlation with the brain and body, because exactly how the brain acts on the mind, or the mind acts on the brain, remains a mystery. 

We can explain the emergent evolution of the mind or soul by saying that with the increasing size and complexity of the primate brain--and particularly the higher parts of the brain in the prefrontal cortex that support conscious deliberation and choice--the human mind emerged once the human brain passed over a critical threshold of size and complexity.  This still leaves us with a mystery because we do not now know exactly how the brain creates a mind that can then act on the brain itself.

The radical dualism of Kant and D'Souza--the claim that mind belongs to a transcendent world of spirit beyond the natural world of bodies--does not resolve this mystery.  Rather, it tries to overcome one mystery with an even deeper mystery--the mystery of how a transcendent world interacts with our natural world.  D'Souza even admits that if we adopt his dualism, "we still won't be able to fully understand how minds interact with bodies" (127).

And even if we recognize the mystery in this interaction of minds and bodies, it does not follow logically from this mystery that we must believe that minds without any interaction with living bodies can live forever in Heaven or Hell.

Monday, April 21, 2014

The Case For (and Against) Life After Death (3): Kantian Dualism

As I have often argued on this blog, there is a fundamental opposition between Darwinian naturalism and Kantian dualism. 

When Darwin turns to the moral sense in The Descent of Man (in chapter 4), he indicates "that of all the differences between man and the lower animals, the moral sense or conscience is by far the most important" (Penguin edition, 120).  He then quotes from Immanuel Kant: "Duty!  Wondrous thought, that workest neither by fond insinuation, flattery, nor by any threat, but merely by holding up thy naked law in the soul, and so extorting for thyself always reverence, if not always obedience; before whom all appetites are dumb, however secretly they rebel; whence thy original?"  Darwin then writes: "This great question has been discussed by many writers of consummate ability; and my sole excuse for touching on it, is the impossibility of here passing it over; and because, as far as I know, no one has approached it exclusively from the side of natural history."

Darwin's quotation of Kant is from his Critique of Practical Reason (AA, p. 86).  Immediately after the quoted sentence, Kant says that moral duty shows us "man as belonging to two worlds"--a phenomenal world of natural causes and a noumenal world of human freedom.  Apparently, Darwin does not accept this Kantian dualism, because he proposes to explain moral duty "exclusively from the side of natural history," and thus he implicitly rejects Kant's claim that human morality belongs to a transcendental world beyond the natural world.

In her review of Darwin's Descent, Frances Cobbe complained that Darwin's denial of Kantian dualism and of the cosmic transcendence of human morality would promote moral nihilism.  She also worried that this would deny life after death.  If we were to carefully study people who are dying, she argued (in "The Peak in Darien"), we could see that some of them give us a fleeting glimpse of the transition from this world to the next world.

Like Cobbe, Dinesh D'Souza adopts Kantian dualism as the ground for his defense of life after death.  He is explicit about this both in chapter 9 of Life After Death: The Evidence and chapter 15 of What's So Great About Christianity (Regnery, 2007). 

D'Souza uses Kantian dualism to refute empirical realism.  "Empirical realism is based on a premise that many people would consider obvious: there is a real world out there, and we come to know it objectively through our senses and through scientific testing and observation.  This is sometimes called the correspondence theory of truth, because it presumes a correspondence between the real world and our sensory and intellectual apprehension of that world" (Life, 148).  He needs to refute this empirical realism so that he can argue that the empirical world--the world that we know by natural experience and reasoning--is not the only world, because there is a supernatural or transcendental world that is beyond the limits of reason.

He begins by asking what he takes to be the fundamental question for modern Western philosophy: "How do we know that the representations of reality that we have in our minds correspond to reality itself?" (Life, 149).  His Kantian answer is that we don't know this.  We have no way to prove that our subjective mental experiences correspond to the objective material world.

The fallacy of empirical realism, D'Souza contends, is the failure to see that there is a distinction between experience and reality, because the world as we experience it does not always correspond to the world as it really is.  George Berkeley was right: "The only things we perceive are our perceptions."  Our apprehension of the world depends upon our perceptual apparatus--our five senses and the cognitive system of our brains--which filters our experience.  Whatever cannot be captured by this perceptual and cognitive system cannot be known to us. 

So, for example, we know that dogs, bats, and bees have perceptual capacities beyond ours, and thus we cannot perceive what they perceive.  All animals are limited in what they can perceive by their sensory and cognitive apparatus.

From this, Kant inferred that we live in two worlds--the world as it appears to us (the phenomenon) and the world as it really is in itself (the noumenon).  Our reason is limited in that it knows the phenomenal world but not the noumenal world.  Kant argued for this limit on reason as a way to create room for faith, and this is what D'Souza finds so attractive: Kantian dualism supports religious faith in a transcendent reality that is beyond empirical realism.  If "human reason can never grasp reality itself" (Christianity, 173), as D'Souza says, then human reason cannot judge religious belief in the reality of a transcendent, supernatural world.  "We learn from Kant that within the domain of experience, human reason is sovereign, but it is in no way unreasonable to believe things on faith that simply cannot be adjudicated by reason" (D'Souza, "What Atheists Kant Refute").

In response to Daniel Dennett's claim that many people have refuted Kant, D'Souza answered: "In fact, there are no refutations" (Christianity, 174).  So Kant can't be refuted?  Is it irrefutable that "human reason can never grasp reality itself," that we "see things in a limited and distorted way," and that our "minds have a built-in disposition toward illusion"?

On the contrary, far from being irrefutable, Kant refutes himself.

Consider the following remark by D'Souza: "There are things in themselves--what Kant called the noumenon--and of them we can know nothing.  What we can know is our experience of those things, what Kant called the phenomenon" (Christianity, 171).  How do Kant and D'Souza know this?  If "we can know nothing" of things in themselves, then how do Kant and D'Souza know that there are things in themselves, and that these things in themselves are different from our experience of those things?  If "human reason can never grasp reality itself," then how can the human reason of Kant and D'Souza grasp the reality of the distinction between the noumenal and phenomenal worlds?  Isn't their argument self-refuting?

Kant and D'Souza are sophistical in assuming that by refuting a naïve realism they have refuted empirical realism.  It is naïve to believe that what we know by experience and reason always corresponds exactly and fully to reality.  Of course, our experience and reasoning are fallible in their grasp of reality.  But from that it does not follow that we can never have any grasp of reality in itself.  We can correct the mistakes of our experience and reasoning to strive for an approximate correspondence to reality.  So, for example, we can discover the limits of our sensory apparatus, and we can see that other animals have sensory capacities that we do not have.  We can use our cognitive capacities to infer how the world looks to dogs, bats, and bees.  We can also infer the existence of subatomic particles that are not directly accessible to our senses.  This is what science does.

Moreover, we can see that having evolved for life on earth, we are naturally adapted in our sensory and cognitive capacities for gathering information about our world and responding to it in adaptive ways.  If our mental models of the world had no correspondence to that world, and if we were unable to correct those models to make them correspond at least approximately to that world, we could not have survived and reproduced.

Through our experience and reasoning, and with the assistance of science, we need to probe ever deeper into the inexhaustible depths of the natural world, so that as we reach new levels of reality, we see new mysteries that raise new questions.  There is no need, as Kant and D'Souza insist, to assume that this wonderful world of nature is an illusion that hides the real world that can only be reached by denying reason and experience.  We were not thrown into this natural world from some other world far away.  This natural world is our home because we are naturally adapted to live in it and investigate its wonders.

Sunday, April 20, 2014

The Case For (and Against) Life After Death: Near Death Experiences

Do near-death experiences "provide strong support for life after death," as Dinesh D'Souza claims (64)?

That they do is the theme of "Heaven Is For Real," a new movie just released over the Easter season, based upon a book of the same title by Todd Burpo and Lynn Vincent.  The book is a best-seller, and the movie is attracting a lot of attention, particularly from Christians and others who see it as confirmation of their belief in Heaven.  The movie is worth seeing even if you're a skeptic.  The little kid who plays Colton Burpo is amazing.

Todd Burpo is a minister at the Crossroads Wesleyan Church in Imperial, Nebraska.  Some years ago, his 4-year-old son--Colton--underwent an emergency appendectomy in which he almost died.  Some months afterward, he casually reported that during the operation he had visited Heaven.  Over time, he gradually offered more details.  He sat in Jesus's lap.  Jesus wears a white robe.  Jesus has a multicolored horse.  Colton met John the Baptist and the Holy Spirit, and they are nice people.  There are lots of angels with wings.  All of the human beings there are young, so that those who died in old age revert back to around age 30.

Pastor Burpo struggled over whether he should believe this.  But when Colton described meeting Pastor Burpo's grandfather, he was convinced.  His wife Sonja resisted, but even she was convinced when Colton reported seeing in Heaven the sister that had died in his mother's tummy, although Colton had never been told by his parents about this miscarriage.

Burpo's book was published over 7 years after Colton began telling his stories, and apparently it took many years for all the stories to come out.

Although it is not reported in the movie, the book relates that Colton saw the future battle of Armageddon in which Jesus and the good people will defeat Satan and the bad people in a bloody conflict.  Colton saw his father helping to kill the bad people with either a sword or a bow and arrow.

So does this prove life after death in Heaven, or was this a hallucination induced by medical trauma?

If Colton had heard nothing about his mother's miscarriage, but discovered his sister in Heaven, that would be impressive.  And yet, is it possible that he overheard his parents speaking about the miscarriage, or otherwise figured this out?  In the movie, we see that Sonja has kept some baby clothes that she bought before the miscarriage.  Is it possible that Colton understood that she was grieving for a lost child?  We are left wondering.

We also wonder about the coherence of Colton's story.  He reports that human beings in Heaven are all young adults, even those who died in old age.  But he also reports that the miscarried foetus of his sister now lives in Heaven as a young child--so she has grown to the age of a young child, but apparently she will grow no older.

Can't we explain Colton's description of Heaven as the imaginative construction of a child's mind that has been shaped by growing up in the Christian household of a Wesleyan minister?  If he had been growing up as a Hindu child in India, wouldn't he have told a different story?

We also notice that Todd and Sonja had become so poor that they could not pay their debts.  And so we wonder whether the prospect of writing a best-selling book might have motivated them--even if subconsciously--to embellish the story to make it engaging for readers and thus profitable.

These are the kinds of questions that come up in considering such reports of near-death journeys to the afterlife.

Raymond Moody, a young medical student, coined the phrase "near-death experience" (NDE) in his book Life After Life, which was first published in 1975, and which has sold millions of copies around the world.  He told many stories of people who were resuscitated from death and then reported that they had left their bodies.  They flew upward to the ceilings of their hospital rooms and looked down at their own bodies being worked on by doctors and nurses.  These patients often described moving through a dark tunnel towards a brilliant light, and then passing over a threshold into a transcendent realm of peace and bliss that seemed heavenly.  Many of them reported seeing God or Jesus or other divine and angelic figures.

Moody and others fascinated by this apparent evidence of a transcendent world of life after death founded the International Association for Near-Death Studies, which publishes a peer-reviewed journal for scientific research in this area--the Journal of Near-Death Studies.   One of the best surveys of this research is The Handbook of Near-Death Experiences: Thirty Years of Investigation, edited by Janice Miner Holden, Bruce Greyson, and Debbie James (Santa Barbara, CA: Praeger Publishers, 2009).

Most of the researchers in this area are committed to showing that near-death experiences are evidence for a world beyond this world--for a transcendent life after death--and thus that they confirm supernaturalism and refute scientific materialism, because they show that the mind or soul lives on after the death of the brain and the body.  Some of the researchers doubt this, however, because they see near-death experiences as hallucinations of the brain under life-threatening stress.  The arguments of these skeptics are well stated in Keith Augustine's "Hallucinatory Near-Death Experiences" (2008), which is available online.

If we're looking for empirical evidence that NDEs show human minds operating totally independently of the body, and thus that minds can survive the death of the body, then we have to be interested in what Janice Miner Holden (in "Veridical Perception in Near-Death Experiences," in The Handbook, 185-211) calls "apparently nonphysical veridical NDE perception (AVP)":  "In AVP, NDErs report veridical perception that, considering the positions and/or conditions of their physical bodies during the near-death episodes, apparently could not have been the result of normal sensory processes or logical inference--nor, therefore, brain mediation--either before, during, or after these episodes.  Thus, AVP suggests the ability of consciousness to function independent of the physical body" (186).

One of the most commonly cited cases of AVP is the story of an NDEr named Maria, which is related this way by D'Souza:
"Another remarkable case involved a Seattle woman who reported a near death experience following a heart attack.  She told social worker Kimberly Clark that she had separated from her body and not only risen to the ceiling but floated outside the hospital altogether.  Clark did not believe her, but a small detail the woman mentioned caught her attention.  The woman said that she had been distracted by the presence of a shoe on the third floor ledge at the north end of the emergency room building.  it was a tennis shoe with a worn patch and a lace stuck under the heel.  The woman asked Clark to go find the shoe.  Clark found this ridiculous because she knew that the woman had been brought into the emergency room at night, when she could not possibly see what was outside the building, let alone on a third-floor ledge.  Somewhat reluctantly, Clark agreed to check, and it was only after trying several different rooms, looking out several windows, and finally climbing out onto the ledge that she was able to find and retrieve the shoe" (63-64).

Many of the leading NDE researchers have relied on this case as demonstrative evidence for how a mind can float separated from a body.  For example, Kenneth Ring and Madelaine Lawrence have said: "Assuming the authenticity of the account, which we have no reason to doubt, the facts of the case seem incontestable.  Maria's inexplicable detection of the inexplicable shoe is a strange and strangely beguiling sighting of the sort that has the power to arrest the skeptic's argument in mid-sentence, if only by virtue of its indisputable improbability" ("Further Evidence for Veridical Perception During Near-Death Experiences," Journal of Near-Death Studies, 11 [Summer 1993]: 223).

But then, in 1996, Hayden Ebbern, Sean Mulligan, and Barry Beyerstein reported that Clark's story of Maria was inaccurate.  Their article ("Maria's Near-Death Experience: Waiting for the Other Shoe to Drop," The Skeptical Inquirer,  20 [July/August 1996]: 27-33) is available online.

Maria's NDE occurred in 1977 at Seattle's Harborview Medical Center.  It was reported by Clark in 1984.  In 1994, Ebbern and Mulligan visited the hospital to survey the site and interview Clark.  They discovered that Maria had disappeared.  To test the story of the shoe, they placed a running shoe at the place indicated by Clark.  When they went outside the hospital, they could easily see the shoe.  They also discovered that the shoe was easily seen from inside the hospital room.  Since the shoe was easily visible both outside and inside the hospital, Maria could have seen the shoe, or she could have overheard people talking about this strange shoe on the ledge.  Clearly, Clark had embellished the story to make it look like an astonishing confirmation of AVP.

When D'Souza tells the story of Maria, he's completely silent about this debunking of Clark's report.  If you look at the video of D'Souza's debate with Dan Barker, you'll see that Barker points this out, and D'Souza has nothing to say in response.

We might wonder whether researchers have found better evidence for AVP that stands up to scrutiny.  In her survey of the research on AVP, Holden indicates that the most conclusive proof for AVP could come from field studies in hospitals, where researchers could plant visual targets in hospital rooms so that no one could see what's on the target unless they were floating around the ceiling.  So if an NDEr could report seeing what's on the target, that would show that a disembodied soul can see without any activity of the brain to support vision. 

Holden reports that there have been only five studies that satisfy the difficult conditions for such research.  "The bottom line of findings from these five studies," she concludes, "is quite disappointing: No researcher has succeeded in capturing even one case of AVP" (209).

She quotes a remark from Kenneth Ring about how discouraging this is:
"There is so much anecdotal evidence that suggests [experiencers] can, at least sometime, perceive veridcally during their NDEs . . . but isn't it true that in all this time, there hasn't been a single case of a veridical perception reported by an NDEr under controlled conditions?  I mean, thirty years later, it's still a null class (as far as I know).  Yes, excuses, excuses--I know.  But, really, wouldn't you have suspected more than a few such cases at least by now?" (210)

D'Souza is silent about this failure of the most committed researchers to find demonstrative evidence that near-death experiences show how human minds can perceive reality accurately without any support from the body or the brain.

Saturday, April 19, 2014

The Case For (and Against) Life After Death

I have written a series of posts on the evolution of Heaven and Hell (in April and May of 2010) and on the various forms of immortality (in October and November of 2013).  Although I have been generally skeptical about life after death, I recognize that there are good arguments for believing in such a possibility. 

The best statement of those arguments that I have seen is Dinesh D'Souza's Life After Death: The Evidence (Regnery Publishing, 2009).  What is most interesting for me is that D'Souza claims to rely primarily on purely rational scientific and philosophic thinking that does not depend on religious faith.  This is the first of a series of posts on D'Souza's arguments. 

In my responses to D'Souza, I have been influenced by Victor Stenger's book chapter "Life After Death: Examining the Evidence," in The End of Christianity, edited by John W. Loftus (Prometheus Books, 2011), 305-32, which is available online.  There is a series of YouTube videos of a debate between D'Souza and Dan Barker, which lays out some of the issues.

According to Stephen Cave, there are four possible ways that we might achieve immortality.  The Staying Alive Narrative says that we could become immortal if we could find a way to postpone death indefinitely.  The Resurrection Narrative says that even if death is unavoidable, we might be brought back to life.  The Soul Narrative says that even if our bodies must die, our souls can live forever because they are immaterial and thus not subject to bodily decay, and our souls are the most essential part of us.  Finally, the Legacy Narrative says that we can live on after death through those who live after us--either because they remember us or because they carry our genes.

D'Souza says nothing about the Staying Alive Narrative and quickly dismisses the Legacy Narrative (3-4).  So he's left with the Soul Narrative and the Resurrection Narrative.  He recognizes that the Soul Narrative came from Plato and was adopted by Christian theologians like Augustine, while the Resurrection Narrative was introduced into Christianity by Paul in the New Testament.  D'Souza never clarifies the relationship between these two forms of immortality. 

According to the orthodox Christian tradition defended by someone like Thomas Aquinas, our souls are separated from our bodies at death, but then they must be reunited with our bodies at the Second Coming of Christ and the Last Judgment, when the saved will go to Heaven for eternal reward, and the damned will go to Hell for eternal punishment.

Furthermore, according to Aquinas, this resurrected body must be a real living body. And since all living bodies are ageing bodies, the resurrected bodies must have a specific age. Since Jesus rose again at about age 30, that age must be the perfect age for the body, and so, Aquinas reasons, when human beings are resurrected, they will all have bodies of the same age--30 years old. Those who died as children will be moved up to age 30, and those who died in old age will be moved back to age 30 (ST, suppl., q. 81, a. 1).

But then we must wonder, when people wish for immortality, is this what they're wishing for--to be frozen eternally at one moment in time?  Shouldn't we say that this kind of immortality would be death?

This points to the problem of personal identity.  If my wish for immortality is a wish that I as a unique individual with a unique personal identity should live forever, then I want to be sure that whatever lives forever is really me and not just a copy of me.  As the embodied person that I am, my experience of myself combines my ageing body and my self-conscious mind as inseparable.  So it's not clear to me that an immaterial soul would really be me.  It's also not clear to me that a resurrected body would really be me if that body was frozen at the age of 30. 

D'Souza never faces up to this problem.  For example, he speaks about the Buddhist conception of immortality in which we must realize that "our individual souls are identical with the oneness of ultimate reality."  He explains: "Part of our enlightenment is to recognize that the very concept of 'I' is illusory; in reality, there is no man behind the curtain.  The term nirvana literally means 'blowing out,' and in case you're wondering, you are the one who must be blown out, like a candle" (51).  Is this what people wish for when they wish for immortality--to have their personal identity blown out?  If one's identity is blown out, isn't that death?

There's a similar problem with bodily resurrection.  D'Souza indicates that our resurrected bodies will have to be very different from the earthly bodies that we have now, because our resurrected bodies will have to be eternally imperishable and ageless.  But if my resurrected body is so different from the body that I have known in my life, will this be my body?  Or will it be only a copy of my body?  Here, again, it seems that my personal identity as the real embodied mind that I am has been blown out.

As I have indicated in my blog post on Wallace Stevens's poem "Sunday Morning," we should consider the possibility that living forever is not desirable, because living timelessly and changelessly would not be really living, and that in living the lives that we have, "death is the mother of beauty."

Monday, April 14, 2014

Does Pinker Show the Bias of a Pro-Western Imperialist, Capitalist, Elitist, and Anti-Communist Ideology?

Edward S. Herman and David Peterson have written one of the most elaborate critiques of Steven Pinker's Better Angels of Our Nature.  It's available online as an ebook--Reality Denial: Steven Pinker's Apologetics for Western-Imperial Violence (2012, 144 pages).  A short excerpt from their book has been published as a book review in the International Socialist Review (November-December, 2012).

As Rousseauean leftists, Herman and Peterson believe that our nomadic hunter-gatherer ancestors in the state of nature lived happily as peaceful egalitarians, but that this happy life was lost with the establishment of a sedentary life based on farming that eventually allowed for the sociopolitical complexity of bureaucratic states that brought all of the evils of modern life: "class structures, divisions of labor and social status, concentrations of wealth and poverty, and hierarchies of power and subordination, including religious and military power structures--all of the sins still very much with us in the modern world" (72).  They must consequently scorn Pinker as a classical liberal ideologue who wants to see a progressive history of declining violence and increasing liberty that began with the transition out of a Hobbesian state of nature among hunter-gatherers and that has culminated in the modern liberal peace.

As one manifestation of Pinker's ideological bias, Herman and Peterson point out that Pinker refuses to recognize that Western capitalist states wage imperial wars of conquest.  They quote Pinker as saying that not only do "democracies avoid disputes with each other," but that they "tend to stay out of disputes across the board," which is called the "Democratic Peace" (Pinker, 283).  They remark: "This will surely come as a surprise to the many victims of U.S. assassinations, sanctions, subversions, bombings and invasions since 1945.  For Pinker, no attack on a lesser power by one or more of the great democracies counts as a real war or confutes the 'Democratic Peace,' no matter how many people die" (Herman and Peterson, 9).

They also quote Pinker as saying: "Among respectable countries, conquest is no longer a thinkable option.  A politician in a democracy today who suggested conquering another country would be met not with counterarguments but with puzzlement, embarrassment, or laughter" (Pinker, 260).  They respond: "This is an extremely silly assertion.  Presumably, when George Bush and Tony Blair sent U.S. and British forces to attack Iraq in 2003, ousted its government, and replaced it with one operating under laws drafted by the Coalition Provisional Authority, this did not count as 'conquest,' as these leaders never stated that they launched the war to 'conquer' Iraq" (Herman and Peterson, 9).

Herman and Peterson don't indicate to their readers that Pinker's comments about the "Democratic Peace" are part of a summary of the research of Bruce Russett and John Oneal (Triangulating Peace: Democracy, Interdependence, and International Organizations [Norton, 2001]), who used the statistical technique of multiple logistic regression to analyze more than 2,300 militarized interstate disputes between 1816 and 2001, and who concluded "not that democracies never go to war . . ., but that they go to war less often than nondemocracies, all else being equal" (Pinker, 281).  Herman and Peterson don't point out any mistakes in the research of Russett and Oneal.   Indicating that democracies sometimes do fight wars does not refute the claim that they tend to go to war less often.
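For readers unfamiliar with the technique, here is a minimal sketch of how a multiple logistic regression can estimate a "democratic peace" effect "all else being equal."  This is not Russett and Oneal's actual model or data; the predictors, coefficients, and simulated sample below are invented purely for illustration:

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 2300  # roughly the number of militarized disputes analyzed

# Hypothetical predictors for each pair of states (a "dyad"):
democracy = rng.integers(0, 2, n)    # 1 if both states are democracies
trade = rng.normal(0.0, 1.0, n)      # economic interdependence (standardized)
power_ratio = rng.normal(0.0, 1.0, n)  # relative power (standardized)

# Simulate outcomes in which democracy lowers the log-odds of a dispute.
log_odds = -1.0 - 0.8 * democracy - 0.3 * trade + 0.2 * power_ratio
dispute = rng.binomial(1, 1.0 / (1.0 + np.exp(-log_odds)))

# Fit the logistic regression with all three predictors at once.
X = sm.add_constant(np.column_stack([democracy, trade, power_ratio]))
result = sm.Logit(dispute, X).fit(disp=0)

# exp(coefficient) is an odds ratio: a value below 1 for democracy means
# democracies have lower odds of a dispute, holding the other factors fixed.
print(np.exp(result.params))

The point of the multivariate setup is exactly the "all else being equal" clause: because trade and relative power enter the same equation, the democracy coefficient measures a lower tendency toward disputes after those other factors are controlled for, not a claim that democracies never fight.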

Pinker's comment that "conquest is no longer a thinkable option" comes in the context of his summary of Mark Zacher's research ("The Territorial Integrity Norm: International Boundaries and the Use of Force," International Organization 55 [2001]: 215-50).  Zacher has shown that since World War Two, there has been an international norm favoring the freezing of national borders.  As compared with previous centuries, the percentage of territorial wars that resulted in a redistribution of territory has dropped dramatically.  The recent international protest against Russia's acquisition of the Crimea is an illustration of this new international norm.  Herman and Peterson don't point out any mistakes in Zacher's research.  Instead, they cite the example of the invasion of Iraq by U.S. and British forces in 2003 as a conquest of that country.  But since there has been no change in the national borders of Iraq, it's not clear how this refutes Zacher's work.

Pinker argues that since 1945 there has been a "Long Peace"--the longest period in modern history in which the Great Powers have not fought a war with one another.  Herman and Peterson seem to agree with this, at least partially: "the First and especially the Second World War had taught them that with their advancing and life-threatening means of self-destruction, they could not go on playing their favorite game of mutual slaughter any longer.  But this didn't prevent them from carrying out numerous and deadly wars against the Third World, which filled-in the great-power war-gap nicely."  And, furthermore, the Long Peace is "increasingly threatened by a Western elite-instigated global class war and a permanent-war system" (92).  Herman and Peterson claim that Pinker ignores the "increasing structural violence of a global class war," in which capitalist nations have created a global economic system that allows them to exploit the poor nations (11, 62, 75).

According to Herman and Peterson, Pinker ignores the "structural violence" inherent in global capitalism because of his pro-capitalist and anti-communist bias.  An example of this is what he says about Mao Zedong's responsibility for the Great Famine during China's Great Leap Forward (1958-1961).  They quote Pinker as saying that "Mao masterminded . . . famine that killed between 20 million and 30 million people" (Pinker, 331).  For Pinker this shows the evil in communist ideology, because Mao's communism was responsible for the second worst atrocity in human history (second only to World War Two).  But Herman and Peterson insist that while Mao made a few mistakes, his communist policies were generally successful in improving the lives of the masses, and that life in China has become much worse under the influence of the capitalist reforms in China that began in 1979.

They write:
"China's death rate increased after 1979, with the surge of capitalist reforms and the associated sharp reduction in public medical services.  A recent review of China's past and current demographic trends showed that its rate of death was higher in 2010 than in 1982, and that the greatest declines in mortality occurred well prior to the reforms, with a national decline occurring even during the decade that included the famine (1953-1964)."
"So Pinker misrepresents the truths at a number of levels in dealing with the Chinese starvation episode.  He avoids the need to reconcile allegedly deliberate starvation deaths with a prior and continuous Chinese state policy of helping the masses by simply not discussing the subject.  He ignores the evidence that policy failure and ignorance rather than murderous intent was the source of those deaths.  He fails to mention the rise in mortality rates under the post-Mao new capitalist order." (60)

The reference here to a "recent review" of Chinese demographic trends is to an article by Xizhe Peng ("China's Demographic History and Future Challenges," Science 333 [29 July 2011]: 581-87).  Herman and Peterson do not note Peng's warning that "there are widespread concerns in the scientific community regarding the quality of some of these population data" (581).  They are also silent about his statement that "the period 1959-1961 witnessed an exceptional demographic fluctuation mainly attributable to the great famine, with more than 20 million excess deaths" (581).

It is true, as they say, that Peng reports a slight increase in the death rate (per 1,000) from 6.6 in 1982 to 7.1 in 2010.  But what they don't say is that Peng reports that the death rate after 1979 was much less than in 1953 (14.0) or 1964 (11.6).  Furthermore, they are silent about Peng's reporting that life expectancy has been increasing and illiteracy has been declining since the capitalist reforms began in 1979.

Herman and Peterson quote from Jean Dreze and Amartya Sen (Hunger and Public Action [Oxford, 1989]) in explaining the Great Famine.  But they are silent about the judgment of Dreze and Sen that after 1979 "there is little doubt that the Chinese economy has surged ahead in response to market incentives, and the agricultural sector has really had--at long last--a proper 'leap forward'" (215).

Herman and Peterson are also silent about the growing evidence in recent years as to the brutality of the Great Famine and Mao's responsibility for it.  Based upon archival material in China that has only recently been opened to study, Frank Dikotter (in Mao's Great Famine: The History of China's Most Devastating Catastrophe, 1958-1962 [Walker Publishing, 2010]) concludes that at least 45 million people died unnecessarily between 1958 and 1962, and that "the widespread view that these deaths were the unintended consequence of half-baked and poorly executed economic programs" is wrong.  He explains:
"As the fresh evidence presented in this book demonstrates, coercion, terror and systematic violence were the foundation of the Great Leap Forward.  Thanks to the often meticulous reports compiled by the party itself, we can infer that between 1958 and 1962 by a rough approximation 6 to 8 per cent of the victims were tortured to death or summarily killed--amounting to at least 2.5 million people.  Other victims were deliberately deprived of food and starved to death. . . . People were killed selectively because they were rich, because they dragged their feet, because they spoke out or simply because they were not liked" (xi).
Furthermore, Dikotter observes: "We know that Mao was the key architect of the Great Leap Forward, and thus bears the main responsibility for the catastrophe that followed.  He had to work hard to push through his vision, bargaining, cajoling, goading, occasionally tormenting or persecuting colleagues" (xiii).  He also concludes that "the catastrophe unleashed at the time stands as a reminder of how profoundly misplaced is the idea of state planning as an antidote to chaos" (xii).

This is a critical issue for Pinker's argument because his claim is that it's classical liberal thought that promotes declining violence, and that most of the atrocious violence of the 20th century was due to the illiberal regimes led by three individuals--Stalin, Hitler, and Mao.  Matthew White has calculated that the total death toll from communism in the 20th century is around 70 million, which would make the communist movement responsible for the greatest atrocity in human history (The Great Big Book of Horrible Things, Norton, 2012, pp. 453-57).

To make their case against Pinker, Herman and Peterson would have to demonstrate that this is not true.

Sunday, April 13, 2014

Pinker's List: A Distorted Record of Prehistoric War?

In his Nobel Peace Prize Acceptance Speech in 2009, President Barack Obama had to justify the awarding of the Nobel Peace Prize to a Commander in Chief who was leading his country in two major wars.  He argued that war is so deeply rooted in human nature and the human condition that it can never be completely abolished.  He declared: "War, in one form or another, appeared with the first man."  And yet, explaining how we can and should strive for peace, he quoted from President John Kennedy: "Let us focus on a more practical, more attainable peace, based not on a sudden revolution in human nature but on a gradual evolution in human institutions."  He then repeated that last phrase--"a gradual evolution in human institutions"--as the theme for his speech.  Without trying to change human nature, we can promote peace through institutional evolution--through culturally evolved norms of just war, human rights, global commerce, and international sanctions for punishing unjustified violence.  Obama thus anticipated the argument of Steven Pinker that while war and violence express the "inner demons of our nature," we can move towards a life of peaceful coexistence as long as our cultural environment strengthens the "better angels of our nature."

Some of the critics of Pinker's argument think this is deeply mistaken because of its false claim that war has roots in human nature.  For example, in his book chapter--"Pinker's List: Exaggerating Prehistoric War Mortality"--R. Brian Ferguson challenges Pinker's evidence for prehistoric war that would support Obama's claim that "war, in one form or another, appeared with the first man."  (Ferguson's chapter appears in War, Peace, and Human Nature: The Convergence of Evolutionary and Cultural Views, edited by Douglas Fry [Oxford University Press, 2013].  A copy is available online.)

Ferguson concentrates his attention on Pinker's Figure 2-2 (page 49), which presents a list of societies showing the percentage of deaths in warfare in nonstate and state societies, classified into four groups: prehistoric archaeological sites, hunter-gatherers, hunter-horticulturalists and other tribal groups, and states.  The bar graphs show that the percentage of deaths in war is much higher for the first three groups than it is for states (ranging from ancient Mexico before 1500 CE to modern states from the 17th century to the present).

Ferguson claims that if one looks at the original sources for this data cited by Pinker, one discovers that Pinker's visual graph distorts the data to make it appear more supportive of his argument than it really is.  First, one should notice that among the 21 groups of prehistoric gravesites, the oldest archaeological site (Gobero, Niger, 14,000-6,200 BCE) has no war deaths at all.  And a couple of the prehistoric sites (Sarai Nahar Rai, India, 2140-850 BCE, and Nubia, 12,000-10,000 BCE) have only one violent death each.  If three skeletons are found at a site, and one of them shows evidence of violent death, then Pinker presents this as a bar graph showing 33% of deaths in war, which is much higher than that for modern states.  Surely, Ferguson suggests, one violent death at one gravesite hardly shows extensive warfare, but Pinker does not explain this to his reader.  Moreover, Ferguson notes, one set of 30 sites is from British Columbia, 3,500 BCE to 1674 CE.  Although he concedes this evidence for warfare, Ferguson indicates that these Indians along the Pacific Northwest Coast were "complex" hunter-gatherers--that is, hunter-gatherers who had settled into large villages with some hierarchical social structures, which was not characteristic of the nomadic hunter-gatherers who were our original ancestors.
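To see how much a single skeleton can move these percentages, consider a minimal sketch of the arithmetic behind Pinker's bar graphs.  All of the site names and counts below are hypothetical except the 1-of-3 case just discussed:

# Percentage of deaths in warfare, computed as violent deaths divided by
# total skeletons recovered at a site.
sites = [
    ("three-skeleton site", 1, 3),       # the 1-of-3 case discussed above
    ("hypothetical small site", 1, 20),
    ("hypothetical large site", 10, 400),
]
for name, violent, total in sites:
    print(f"{name}: {violent}/{total} = {100 * violent / total:.0f}% deaths in war")

The three-skeleton site yields 33%--far above any modern state's war-death share--on the strength of a single violent death, which is Ferguson's point about how unstable such small-sample percentages are.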

Pinker presents bar graphs showing a range of 5% to 60% deaths in warfare for 8 hunter-gatherer societies.  But Ferguson points out that Pinker does not tell his reader that for two of these societies (the Ache of Paraguay and the Hiwi of Venezuela-Colombia), all of the war deaths were indigenous people killed by frontiersmen.

Pinker's bar graphs for 10 societies of hunter-horticulturalists and other tribal groups show a range of 15% to 60% deaths in warfare.  The 60% rate of death in war is the highest rate ever recorded by anthropologists, and it's for the Waorani of Eastern Ecuador.  When I was travelling through the Ecuadorian rainforest last summer, I heard about the Waorani and their reputation for violence.  One of my Quichua guides identified them as auca--"savages."

Ferguson concedes that the archaeological and anthropological evidence shows intense warfare among many complex hunter-gatherers and horticulturalists, but he argues that nomadic hunter-gatherers would not have shown this.  When one sees evidence of one or a few violent deaths among a group of nomadic hunter-gatherers, this should be identified as homicide not war.

Like Douglas Fry, Ferguson agrees that there has been lethal violence among nomadic hunter-gatherers, but this was personal violence rather than war.

In defense of Pinker, one could argue for Richard Wrangham's distinction between "simple" and "complex" war.  Like chimpanzees, nomadic hunter-gatherers do not fight pitched battles under the formal command of military leaders, because such "complex" warfare arises only in agrarian societies with military and political hierarchies.  Nomadic hunter-gatherers will kill members of outside groups only when the killers can surprise their outnumbered victims and then retreat after killing only a few individuals.  This raiding and feuding will not result in large numbers of battle deaths, and thus the archaeological record will not show any evidence of large numbers of violent deaths among nomadic hunter-gatherers.  Moreover, Pinker and Wrangham would predict that violent raiding and feuding among hunter-gatherers is infrequent, with long periods of peace, although the rate of killing is still comparable to that of American cities today.

Ferguson concludes: "We are not hard-wired for war.  We learn it" (126).

He does not indicate that those like Pinker and Wrangham actually agree with him about this.  They agree that war is not a biological necessity, although there are biological propensities to violence that can be triggered by the social environment.  They also agree with Ferguson that the establishment of agrarian societies with bureaucratic states created "complex" warfare as a purely cultural invention.  They also agree that the cultural evolution of recent centuries can move us towards peace.

Pinker and Wrangham agree with Obama and Kennedy:  in the quest for peace, we need not a sudden revolution in human nature but a gradual evolution in human institutions.

Some of these points are developed in earlier posts here, here, here, here, and here.

ADDENDUM
Brian Ferguson has pointed out to me that I have made a mistake here in attributing to him the point about the Hiwi and the Ache, because Ferguson only deals with the archaeological data in Pinker's list.  Actually, the point about the Hiwi and Ache was made by Douglas Fry (17-18).

Friday, April 11, 2014

Does Steven Pinker Distort the Data for Declining Violence?

Steven Pinker's Better Angels of Our Nature has over 115 figures--an average of one for every 6 pages of text.  Many of these figures are visual presentations of data to support his argument for a historical trend towards declining violence from the Stone Age to the present.  These figures are based on data found in thousands of cited sources.  This is one of his most impressive rhetorical techniques for persuading his readers that his reasoning is based on a meticulous statistical analysis of data.

Most readers will not take the trouble to read the sources for each figure to see whether Pinker is being accurate in his presentation of the data.  But some of his critics have done this for some of the figures, and they are accusing Pinker of manipulating the data to make it look more supportive of his argument than it really is.  Having looked into this myself, I think this is a fair criticism, although it's not fatal to his argument.  If Pinker had been totally honest about the gaps and uncertainties in the data, he could still have made a plausible argument for his conclusions.

Here I'll point to two examples: the table on page 195 that ranks the greatest atrocities in human history and Figure 2-2 on page 49 that shows the "percentage of deaths in warfare in nonstate and state societies."

Pinker identifies his table of the greatest atrocities as taken from Matthew White's list of "(Possibly) The Twenty (or so) Worst Things People Have Done to Each Other."  White identifies himself as an "atrocitologist" who for many years has maintained a website where he compiles records of the greatest atrocities in human history based on his estimates of violent deaths drawn from historical sources.  This work has been published as a book--The Great Big Book of Horrible Things: The Definitive Chronicle of History's 100 Worst Atrocities (Norton, 2012)--with a Foreword by Pinker.

In Pinker's table, he says that he's following White in ranking the 21 worst atrocities.  Number 1 is the Second World War, with a death toll of 55 million.  Number 2 is Mao Zedong, who was responsible for a death toll of 40 million (mostly through a government-caused famine).  Number 3 is the Mongol Conquests of the 13th century, with a death toll of 40 million.  Number 4 is the An Lushan Revolt in China in the 8th century, with a death toll of 36 million.

This seems to confirm the common belief that the 20th century was the bloodiest in human history, especially when one notices that 5 of the top 21 atrocities were in the 20th century; and this would seem to refute Pinker's theory of a historical trend of declining violence.  In fact, White concludes his book by identifying the bloody events of the first half of the 20th century as the "Hemoclysm" (Greek for "blood flood"), which he sees as a series of interconnected events stretching from the First World War to the deaths of Hitler, Stalin, and Mao.  The collective death toll here would be 150 million, which would make it the other Number 1 atrocity of human history.

If Pinker is to save his theory of declining violence, he must reinterpret White's account of the historical record of violence culminating in the Hemoclysm of the 20th century.  Pinker does this with three arguments.

His first argument is that we must adjust White's numbers to overcome the illusion that the 20th century was much bloodier than past centuries.  Pinker adjusts the absolute numbers of violent deaths, and he also asks us to look at the relative numbers, calculated as a proportion of the populations.  Once these adjustments are made, Pinker can conclude that "the worst atrocity of all time was the An Lushan Revolt and Civil War, an eight-year rebellion during China's Tang Dynasty that, according to censuses, resulted in the loss of two-thirds of the empire's population, a sixth of the world's population at the time" (194).  In an endnote to this sentence, Pinker writes: "An Lushan Revolt.  White notes that the figure is controversial.  Some historians attribute it to migration or the breakdown of the census; others treat it as credible, because subsistence farmers would have been highly vulnerable to a disruption of irrigation infrastructure" (707, n. 13).

A reader who notices this endnote might become curious about what White has said about these controversial calculations concerning the An Lushan Rebellion.  A reader who looks at White's book will notice that he revises the estimates of violent deaths--moving from 36 million to 26 million to a final estimate of 13 million.  With the lower estimate, the An Lushan Rebellion ranks Number 13 on the list of atrocities, not Number 4 as Pinker has it, because Pinker accepts the highest estimate of 36 million (White 88-93, 529).

Historians know that the Chinese census recorded a population of 52,880,488 in the year 754, and then after ten years of civil war, the census of 764 recorded a population of 16,900,000.  This would suggest that 36 million people died in the war, which would be two-thirds of the entire population of China.  Pinker accepts these numbers, which allows him to rank the An Lushan Revolt as Number 4 on the list of atrocities.

But White indicates that most historians doubt the accuracy of these numbers, because they suspect that the chaos created by the war had impeded the ability of the Chinese census takers to find every taxpayer.  He cites five historians who commented on the census numbers.  He reports that two of them express "major doubt" about the census numbers, one expresses "slight doubt," one expresses "apparent acceptance," and one expresses "acceptance."  But a reader who checks these sources will see that the doubt is even greater than is reported by White.  The historian whom White identifies as expressing "slight doubt"--Peter Stearns--actually says that the population census of 16,900,000 was "certainly too low," which surely shows "major doubt."  And the historian whom White identifies as expressing "acceptance"--Peter Turchin--actually says there is "a certain degree of controversy among the experts" about the numbers, which surely indicates "slight doubt."

Not only does Pinker depart from White in accepting the 36 million estimate of violent deaths; he also insists that death tolls should be adjusted as a proportion of the population, because this allows us to judge the relative risk of being killed at different points in history.  The 55 million deaths in World War Two are higher than the 36 million in the An Lushan Revolt, but the world population in the middle of the 20th century was much larger than in the 8th century.  So if 36 million violent deaths amounted to a sixth of the world's population in the 8th century, this would be the equivalent of 429 million violent deaths in the middle of the 20th century, which would raise the An Lushan Revolt to Number 1 on the list of atrocities, and World War Two would drop to Number 9 on the list.  White does not adjust the ranking in this way.
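The arithmetic behind this adjustment is simple enough to check.  Here is a minimal sketch; the census figures come from the passage above, while the mid-20th-century world population (roughly 2.5 billion) is an assumption I am supplying, which is why the result lands near, but not exactly at, Pinker's figure of 429 million:

census_754 = 52_880_488   # Chinese census of 754, before the revolt
census_764 = 16_900_000   # Chinese census of 764, after ten years of civil war
loss = census_754 - census_764
print(f"implied deaths: {loss / 1e6:.0f} million")   # ~36 million, two-thirds of China's population

world_pop_1950 = 2_500_000_000   # assumed mid-20th-century world population
share_of_world = 1 / 6           # the text: a sixth of the world's population in the 8th century
print(f"20th-century equivalent: {share_of_world * world_pop_1950 / 1e6:.0f} million")  # ~417 million

The whole dispute over the An Lushan ranking thus turns on two inputs: whether the 764 census reflects deaths rather than a collapsed census apparatus, and which world population one divides by.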

Pinker's second argument for why the Hemoclysm of the 20th century does not refute his theory of declining violence is that the causes of war can be so contingent that we can have something like World War Two erupt by chance without altering the otherwise declining trend of violence.  We can thus see World War Two as "an isolated peak in a declining sawtooth--the last gasp in a long slide of major war into historical obsolescence" (192).  If wars start and stop at random, then the accidents of history and the peculiarities of particular individuals can result in cataclysmic spasms of violence (200-222). 

In 1999, there was a lot of discussion about who should be considered the Most Important Person of the 20th Century.  White's answer was Gavrilo Princip.  And who was he?  He was the 19-year-old Serbian terrorist who assassinated Archduke Franz Ferdinand of Austria-Hungary.  This was a lucky accident for Princip.  If the archduke's driver had not made a wrong turn in Sarajevo, this would not have happened, and it's likely that World War One would not have happened, and this would not have set off the series of events leading to Lenin, Stalin, Mao, Hitler, World War Two, and the Cold War (White, 344-58; Pinker, 207-10, 262-63).

In the 80-year-long Hemoclysm sparked by Princip's bullets, three individuals--Stalin, Hitler, and Mao--were responsible for most of the violent deaths.  The communist regimes were responsible for 70 million deaths, which would justify ranking communism as the Number 1 atrocity--even greater than World War Two--except that it's hard to think of the whole communist movement as one event (White, 453-57).  Notice that what we see here is that most of the violence of the 20th century has been caused by illiberal ideology--Nazism and communism.

This supports Pinker's third argument for why the violence of the 20th century does not deny his theory of history.  The historical trend towards decreasing violence and increasing liberty depends on the spreading influence of classical liberal culture based on the principle that violence is never justified except in defense against violence.  That illiberal regimes have been the primary sources of violence in the 20th century confirms Pinker's argument. 

Because of the contingency of history, we can never be sure that illiberal leaders will not arise and cause great disasters.  Some day, we might see another Stalin, or Mao, or Pol Pot.  And that's why Pinker is clear in stating that there is no inevitability in the historical trend towards declining violence, because it could be reversed by illiberal turns (xxi, 361-77, 480).  But insofar as classical liberal ideas and norms spread around the world, they can increase the odds in favor of declining violence, which is what has happened since World War Two.

In my next post, I'll turn to Figure 2-2.

My first long series of posts on Pinker's Better Angels was written from October, 2011, to January, 2012.