the official site of Michael Shermer

The Natural & the Supernatural: Alfred Russel Wallace and the Nature of Science

A couple weeks ago, I participated in an online debate at Evolution News & Views with Center for Science & Culture fellow Michael Flannery on the question: “If he were alive today, would evolutionary theory’s co-discoverer, Alfred Russel Wallace, be an intelligent design advocate?” Before reading this week’s post, you can review my opening statement in my previous Skepticblog and Flannery’s reply. The following is my response. A link to Flannery’s final reply can be found near the end of this page.

Michael Flannery’s assessment of Alfred Russel Wallace as a prescient scientist who anticipated modern Intelligent Design theory is premised on the belief that modern evolutionary biologists have failed to explain the myriad abilities of the human mind that Wallace outlined in his day as unanswered and—in his hyperselectionist formulation of evolutionary theory—unanswerable. In point of fact there are several testable hypotheses formulated by scientists—evolutionary psychologists in particular—that make the case that all aspects of the human mind are explicable by evolutionary theory. Flannery mentions just one—Steven Pinker’s hypothesis that cognitive niches in the evolutionary environment of our Paleolithic hominid ancestors gave rise to abstract reasoning and metaphorical thinking that enabled future humans to navigate complex social and cognitive environments found in the modern world. In his PNAS paper Pinker outlines two processes at work: “One is that intelligence is an adaptation to a knowledge-using, socially interdependent lifestyle, the ‘cognitive niche’.” And: “The second hypothesis is that humans possess an ability of metaphorical abstraction, which allows them to coopt faculties that originally evolved for physical problem-solving and social coordination, apply them to abstract subject matter, and combine them productively.” Together, Pinker concludes: “These abilities can help explain the emergence of abstract cognition without supernatural or exotic evolutionary forces and are in principle testable by analyses of statistical signs of selection in the human genome.” Pinker then outlines a number of ways in which the cognitive niche hypothesis has been and can continue to be tested.

In point of fact, Darwin himself addressed this larger problem of “pre-adaptation”: Since evolution is not prescient or goal directed—natural selection operates in the here-and-now and cannot anticipate what future organisms are going to need to survive in an ever-changing environment—how did certain modern useful features come to be in an ancestral environment different from our own? In Darwin’s time this was called the “problem of incipient stages.” Fully-formed wings are obviously an excellent adaptation for flight that provide all sorts of advantages for animals who have them; but of what use is half a wing? For Darwinian gradualism to work, each successive stage of wing development would need to be functional, but stumpy little partial wings are not aerodynamically capable of flight. Darwin answered his critics thusly:

Although an organ may not have been originally formed for some special purpose, if it now serves for this end we are justified in saying that it is specially contrived for it. On the same principle, if a man were to make a machine for some special purpose, but were to use old wheels, springs, and pulleys, only slightly altered, the whole machine, with all its parts, might be said to be specially contrived for that purpose. Thus throughout nature almost every part of each living being has probably served, in a slightly modified condition, for diverse purposes, and has acted in the living machinery of many ancient and distinct specific forms.1

Today this solution is called exaptation, in which a feature that originally evolved for one purpose is co-opted for a different purpose.2 The incipient stages in wing evolution had uses other than for aerodynamic flight—half wings were not poorly developed wings but well-developed something elses—perhaps thermoregulating devices. The first feathers in the fossil record, for example, are hairlike and resemble the insulating down of modern bird chicks.3 Since modern birds probably descended from bipedal theropod dinosaurs, wings with feathers could have been employed for regulating heat—holding them close to the body retains heat, stretching them out releases heat.4

So one testable hypothesis about the various aspects of the mind that so troubled Wallace is that cognitive abilities we exhibit today were employed for different purposes in our ancestral environment. In other words, they are exaptations, coopted for different uses today than that for which they originally evolved. But even if these hypotheses fail the tests new hypotheses will take their place to be empirically verified, rejected, or refined with additional data from the natural world. This is how science operates—the search for natural explanations for natural phenomena.

By contrast, Intelligent Design theorists offer no testable hypotheses at all, no natural explanations for natural phenomena. Instead, their answer to the mysteries of the mind is the same as that of all other mysteries of the universe: God did it. Although their narratives are gussied up in jargon-laden terms such as “irreducible complexity,” “specified complexity,” “complex specified information,” “directed intelligence,” “guided design,” and of course “intelligent design”—these are not causal explanations. They are just linguistic fillers for “God did it” explanations. It is nothing more than the old “God of the gaps” rubric: wherever creationists find what they perceive to be a gap in scientific knowledge, this must be where God intervened in the natural world. If they want to do science, however, they must provide testable hypotheses about how they think God (or the Intelligent Designer—ID) did it. What forces did ID use to bring about wings, eyes, and brains? Did ID intervene in the natural world at the level of species or genus? Did ID intervene at the Cambrian explosion or before (or after)? Did ID create the first cells and pack into their DNA the potential for future wings, eyes, and brains? Or did ID have to intervene periodically throughout the past billion years to build bodies one part at a time? And more to the point here, did ID layer cortical neurons atop older, naturally evolved brain structures to enable certain primates to reason more abstractly than other primates?

The reason scientists do not take seriously the claims of Intelligent Design theorists today is the same reason scientists did not take seriously Wallace’s speculations about an “overarching intelligence” that guided evolution. As I noted previously, Wallace’s hyperselectionism and hyperadaptationism blinded him to the possibilities offered in a multi-tiered evolutionary model, where the concept of exaptation expands our thinking about how certain features might have evolved for reasons different from those for which they are used today. As Wallace’s biographer, it is my opinion that he was driven above all by his overarching scientism, of which his theory of evolution as pure adaptationism was but one part, and that even his spiritualism was subsumed in his scientistic worldview.5

Read Flannery’s final reply in this debate.

References

  1. Darwin, Charles. 1862. On the Various Contrivances by Which British and Foreign Orchids Are Fertilized by Insects, and on the Good Effects of Intercrossing. London: John Murray. p. 348.
  2. Gould, Stephen Jay and Elizabeth Vrba. 1982. “Exaptation: A Missing Term in the Science of Form.” Paleobiology, 8, pp. 4–15.
  3. Prum, R. O. and A. H. Brush. 2003. “Which Came First, the Feather or the Bird: A Long-Cherished View of How and Why Feathers Evolved Has Now Been Overturned.” Scientific American, March, pp. 84–93.
  4. Padian, Kevin and L. M. Chiappe. 1998. “The Origin of Birds and Their Flight.” Scientific American, February, pp. 38–47.
  5. Shermer, Michael. 2002. In Darwin’s Shadow: The Life and Science of Alfred Russel Wallace. New York: Oxford University Press.

Alfred Russel Wallace was a Hyper-Evolutionist, not an Intelligent Design Creationist

A couple weeks ago, I participated in an online debate at Evolution News & Views with Center for Science & Culture fellow Michael Flannery on the question: “If he were alive today, would evolutionary theory’s co-discoverer, Alfred Russel Wallace, be an intelligent design advocate?” The following is my opening statement in the debate. A link to Flannery’s reply can be found near the end of this page.

The doubly dangerous game of Whiggish what-if? history is on the table in this debate, a game that inexorably invokes hindsight bias, along the lines of “Was Thomas Jefferson a racist because he had slaves?” Adjudicating historical belief and behavior with modern judicial scales is a fool’s errand that carries but one virtue—enlightenment about the past for the correction of current misunderstandings. Thus I shall endeavor to enlighten modern thinkers on the perils of misjudging Alfred Russel Wallace as an Intelligent Design creationist, and at the same time reveal the fundamental flaw both in his evolutionary theory and in this latest incarnation of creationism.

Wallace’s scientific heresy was first delivered in the April 1869 issue of The Quarterly Review, in which he outlined what he saw as the failure of natural selection to explain the enlarged human brain (compared to that of apes), as well as the organs of speech, the hand, and the external form of the body:

In the brain of the lowest savages and, as far as we know, of the prehistoric races, we have an organ…little inferior in size and complexity to that of the highest types…. But the mental requirements of the lowest savages, such as the Australians or the Andaman Islanders, are very little above those of many animals. How then was an organ developed far beyond the needs of its possessor? Natural Selection could only have endowed the savage with a brain a little superior to that of an ape, whereas he actually possesses one but very little inferior to that of the average members of our learned societies.

(Please note the language that, were we to judge the man solely by his descriptors for indigenous peoples, would lead us to label Wallace a racist even though he was in his own time what we would today call a progressive liberal.)

Since natural selection was the only law of nature Wallace knew of to explain the development of these structures, and since he determined that it could not adequately do so, he concluded that “an Overruling Intelligence has watched over the action of those laws, so directing variations and so determining their accumulation, as finally to produce an organization sufficiently perfect to admit of, and even to aid in, the indefinite advancement of our mental and moral nature.”

Natural selection is not prescient—it does not select for needs in the future. Nature did not know we would one day need a big brain in order to contemplate the heavens or compute complex mathematical problems; she merely selected amongst our ancestors those who were best able to survive and leave behind offspring. But since we are capable of such sublime and lofty mental functions, Wallace deduced, clearly natural selection could not have been the originator of a brain big enough to handle them. Thus the need to invoke an “Overruling Intelligence” for this apparent gap in the theory.

Why did Wallace retreat from his own theory of natural selection when it came to the human mind? The answer, in a word, is hyperselectionism (or adaptationism), in which the current adaptive purpose of a structure or function must be explained by natural selection applied to the past. Birds presently use wings to fly, so if we cannot conceive of how natural selection could incrementally select for fractional wings that were fully functional at each partial stage (called “the problem of incipient stages”), then some other force must have been at work. Darwin answered this criticism by demonstrating how present structures can serve a purpose different from the one for which they were originally selected. Partial wings, for example, were not poorly designed flying structures but well-designed thermoregulators. Stephen Jay Gould calls this process “exaptation” (ex-adaptation) and uses the panda’s thumb as his type specimen: it is not a poorly designed thumb but a radial sesamoid (wrist) bone modified by natural selection for stripping leaves off bamboo shoots.

Wallace’s hyperselectionism and adaptationism were outlined more formally in an 1870 paper, “The Limits of Natural Selection as Applied to Man,” in which he admitted up front the danger of proffering a force that is beyond those known to science: “I must confess that this theory has the disadvantage of requiring the intervention of some distinct individual intelligence…. It therefore implies that the great laws which govern the material universe were insufficient for this production, unless we consider…that the controlling action of such higher intelligences is a necessary part of those laws….”

After an extensive analysis of brain size differences between humans and non-human primates, Wallace then considers such abstractions as law, government, science, and even such games as chess (a favorite pastime of his), noting that “savages” lack all such advances. Even more, “Any considerable development of these would, in fact, be useless or even hurtful to him, since they would to some extent interfere with the supremacy of those perceptive and animal faculties on which his very existence often depends, in the severe struggle he has to carry on against nature and his fellow-man. Yet the rudiments of all these powers and feelings undoubtedly exist in him, since one or other of them frequently manifest themselves in exceptional cases, or when some special circumstances call them forth.”

Therefore, he concludes, “the general, moral, and intellectual development of the savage is not less removed from that of civilised man than has been shown to be the case in the one department of mathematics; and from the fact that all the moral and intellectual faculties do occasionally manifest themselves, we may fairly conclude that they are always latent, and that the large brain of the savage man is much beyond his actual requirements in the savage state.” Thus, “A brain one-half larger than that of the gorilla would, according to the evidence before us, fully have sufficed for the limited mental development of the savage; and we must therefore admit that the large brain he actually possesses could never have been solely developed by any of those laws of evolution…. The brain of prehistoric and of savage man seems to me to prove the existence of some power distinct from that which has guided the development of the lower animals through their ever-varying forms of being.”

The middle sections of this lengthy paper review additional human features that Wallace could not conceive of being evolved by natural selection: the distribution of body hair, naked skin, feet and hands, the voice box and speech, the ability to sing, artistic notions of form, color, and composition, mathematical reasoning and geometrical spatial abilities, morality and ethical systems, and especially such concepts as space and time, eternity and infinity. “How were all or any of these faculties first developed, when they could have been of no possible use to man in his early stages of barbarism? How could natural selection, or survival of the fittest in the struggle for existence, at all favour the development of mental powers so entirely removed from the material necessities of savage men, and which even now, with our comparatively high civilisation, are, in their farthest developments, in advance of the age, and appear to have relation rather to the future of the race than to its actual status?”

Modern Intelligent Design creationists generally (with few exceptions) believe that the designer is God. Nowhere in this paper does Wallace invoke God as the overarching intelligence. In a footnote in the second edition of the volume in which this paper was published, in fact, Wallace upbraids those who accused him of such speculations:

Some of my critics seem quite to have misunderstood my meaning in this part of the argument. They have accused me of unnecessarily and unphilosophically appealing to “first causes” in order to get over a difficulty—of believing that “our brains are made by God and our lungs by natural selection;” and that, in point of fact, “man is God’s domestic animal.” … Now, in referring to the origin of man, and its possible determining causes, I have used the words “some other power”—“some intelligent power”—“a superior intelligence”—“a controlling intelligence,” and only in reference to the origin of universal forces and laws have I spoken of the will or power of “one Supreme Intelligence.” These are the only expressions I have used in alluding to the power which I believe has acted in the case of man, and they were purposely chosen to show that I reject the hypothesis of “first causes” for any and every special effect in the universe, except in the same sense that the action of man or of any other intelligent being is a first cause. In using such terms I wished to show plainly that I contemplated the possibility that the development of the essentially human portions of man’s structure and intellect may have been determined by the directing influence of some higher intelligent beings, acting through natural and universal laws.

Clearly Wallace’s heresy had nothing to do with God or any other supernatural force, as these “natural and universal laws” could be fully incorporated into the type of empirical science he practiced. It was not spiritualism, but scientism at work in Wallace’s world-view: “These speculations are usually held to be far beyond the bounds of science; but they appear to me to be more legitimate deductions from the facts of science than those which consist in reducing the whole universe…to matter conceived and defined so as to be philosophically inconceivable.”

In Wallace’s science there is no supernatural. There are only natural and as-yet-unexplained phenomena waiting to be incorporated into the natural sciences. That he left no room in his evolutionary theory for exaptations of early structures for later use is no reflection on his ambitions and abilities as a scientist. It was, in fact, one of Wallace’s career goals to be the scientist who brought more of the apparently supernatural into the realm of the natural, and the remainder of his life was devoted to fleshing out the details of a scientism that encompassed the many different issues and controversies that made him a heretic-scientist.

If modern Intelligent Design theorists restricted their purview to natural causes only, they would, perchance, be taken more seriously by the scientific community, which at present (myself included) sees this movement as nothing more than another species of the genus Homo creationopithicus.

Read Flannery’s reply to my opening statement and tune into Skepticblog in a couple weeks for my response to him.


More God, Less Crime or More Guns, Less Crime?

More God, Less Crime (book cover)

During the last week of 2011, I spoke at and attended a wonderful salon in Santa Fe, New Mexico, organized and hosted by Sandy Blakeslee, the brilliant science writer for the New York Times and the author of numerous engaging popular books on neuroscience. Two of the speakers at the salon addressed the topic of the decline of crime, one (Byron Johnson) attributing it to god and the other (John Lott) to guns. Of the two, Lott by far took the day with superior data and better arguments, although for a much wider and deeper analysis of the decline of violence in general I highly recommend Steven Pinker’s The Better Angels of Our Nature: Why Violence Has Declined (Viking, 2011), which I recently reviewed in these pages.

More Guns, Less Crime (book cover)

Byron Johnson is a professor at Baylor University and the founding director of the Baylor Institute for Studies of Religion as well as director of the Program on Prosocial Behavior. Acknowledging that he took the title of his book, More God, Less Crime: Why Faith Matters and How It Could Matter More (Templeton Press, 2011), directly from Lott’s book, More Guns, Less Crime: Understanding Crime and Gun Control Laws (University of Chicago Press, 2010), Johnson mostly recounted his experiences working with prisoners in an attempt to lower recidivism rates by increasing religiosity…of the Christian variety, of course. What few data slides he presented harmed his case more than helped it by being either impossible to read (dark, small type) or countering his claim (one slide showed no difference in post-conversion crime rates). Even his anecdotes seemed to gainsay his thesis, as in the story of one man who, even after converting to Christianity, refused to confess to the rape and murder of a young girl until he met her mother on the day of his execution, at which point he broke down and apologized to her. Additional anecdotes and frank admissions by Johnson only worsened his case, such as that many prisoners convert only in order to impress parole boards, and that many of his fellow Christians (he called them “high octane” evangelicals) were only in the game to tally up conversion scores in an environment ripe for the picking. (I routinely receive letters from prisoners who bemoan the constant evangelizing, not only by Christians but also by Muslims, who likewise see prisons as conversion opportunities. As the Russian comedian Yakov Smirnoff used to joke about performing in the USSR, swapping “captured” for “captive” audiences: “they’re not going anywhere!”)

Johnson seems like a nice enough fellow, and with our current overcrowded prison system letting criminals out early, if he really can lower recidivism rates it’s hard not to acknowledge that this is a good thing for society (assuming he’s having any effect at all, which I presume he must be at least on a case-by-case basis, even if it isn’t statistically distinguishable from other recidivism-reduction methods). Although I would much prefer that people not commit crimes for rational and secular moral reasons (respect for private property, sanctity of life, etc.), I am reminded of an encounter I had with a young Christian man in his early 20s during the Q & A after one of my public lectures. I had just asked the rhetorical question—which I often ask during my talk on the evolution of morality and how to be good without god—“What would you do if there were no God? Would you rape, steal, and murder?” Naturally people agree that they wouldn’t, but in this instance the man said he was pretty sure that if he decided that there were no god he would do just that. I told him that Jesus loves him and has a plan for his life and future. It got a laugh, but everyone in the room realized that not everyone is a rational calculator and moral reasoner. Some people may very well need the shadow of enforcement that comes from believing in an invisible policeman in the sky who, like those pesky red-light video cameras at busy intersections, ensures that even when the cops aren’t around all sins and violations will be settled in due time, even without due process.

As far as I know, Johnson, along with his fellow religious believers who embrace the hypothesis that religion is good for society, has failed to account for a simple and obvious (once you think about it) correlation and comparison: Gregory Paul’s 2005 study published in the Journal of Religion and Society—“Cross-National Correlations of Quantifiable Societal Health with Popular Religiosity and Secularism in the Prosperous Democracies”—which showed an inverse correlation between religiosity (measured by belief in God, biblical literalism, and frequency of prayer and service attendance) and societal health (measured by rates of homicide, suicide, childhood mortality, life expectancy, sexually transmitted diseases, abortion, and teen pregnancy) in 18 developed democracies. “In general, higher rates of belief in and worship of a creator correlate with higher rates of homicide, juvenile and early adult mortality, STD infection rates, teen pregnancy, and abortion in the prosperous democracies,” Paul found. “The United States is almost always the most dysfunctional of the developed democracies, sometimes spectacularly so.” Indeed, the U.S. scores the highest in religiosity and the highest (by far) in homicides, STDs, abortions, and teen pregnancies.

If religion is such a powerful prophylactic against sin, immorality, and crime, then why is the most religious democracy on the planet also the most sinful and crime-ridden? I’m not claiming that religion causes these problems (although Paul does make this claim), only that the claim that it prevents or attenuates them is falsified by the data.

John Lott, by contrast, is a social scientist’s social scientist, a data man to the core. I spent several hours with him the night before at a party, pressing him for the details of his argument that more guns mean less crime. He was unwavering in his conviction—both to me privately and in his public talk (and in his book)—that not one social scientist or criminologist has been able to produce a single example of a city or county that has experienced a consistent decline in crime after a ban on guns was enacted. In fact, in slide after slide and example after example Lott showed that the opposite correlation tends to be the case: gun bans increase crime.

Take Washington, D.C. Before the ban on handguns was implemented in August of 1976, DC ranked 20th in murder rates out of the top 50 cities in America. After the gun ban, DC shot up to either #1 or #2, where year after year it held steady as “the murder capital of the nation,” as it was dubbed by the media. As a control experiment of sorts, after the Supreme Court decision in the Heller case overturned the DC gun ban, murder rates dropped and have continued to fall ever since. According to Lott, whose data is based primarily on crime statistics provided by the FBI, once the gun ban was lifted, homicide rates plummeted 42.1%, sexual assault rates dropped 14.9%, robbery excluding guns dropped 34.3%, robbery with guns plunged 58%, assault with a dangerous weapon excluding guns sank 11%, assault with a dangerous weapon using guns tumbled 35.6%, and total violent crime nosedived 31%, along with total property crimes decreasing a total of 10.7%.

Chicago showed a similar effect, Lott demonstrated. Since the city’s gun ban was implemented in 1982, crime has never again been as low as it was before the ban. Island nations (which serve as good tests, Lott says, because their borders are more tightly controlled against extraneous variables) demonstrate the same effect: homicide rates in Jamaica and Ireland increased after gun bans were imposed. Ditto England and Wales: after a gun ban was imposed in January of 1997, homicide rates slowly climbed and peaked at an average of 28% higher than before the ban. (By dramatic contrast, Lott said that in the London of 1900, when people were free to do whatever they wanted with their guns, there were a grand total of 2 gun-related deaths and 5 armed robberies in a population of many millions—and this was 20 years before gun laws began going into effect in 1920.)

Why do more guns mean less crime? Lott offers a very practical explanation: it is extremely hard to keep criminals from getting and keeping guns. In other words, gun bans are primarily obeyed by non-criminals. Criminals who already have guns do not turn them in, and would-be criminals who want guns have no problem procuring them on the street illegally. Lott cited several studies by criminologists who interviewed criminals in jail and collected data on the amount of time they spend casing a home before burglarizing it. In the U.K., where gun bans are much more prevalent than in the U.S., the criminals reported that they spend very little time casing a joint and that they don’t really care whether someone is home or not, because they know the residents won’t be armed (whereas they, of course, are armed). Their U.S. counterparts, by contrast, reported spending more than double the time casing a home before robbing it, explaining that they were waiting for the residents to leave. Why? They said that they were worried they would be shot.

Why is crime so much higher here in the U.S. than in the U.K. and elsewhere? Lott explained that the remarkably high homicide rates are a geographical anomaly. The U.S. Justice Department reports that about 80% of violent crimes are drug-gang related, and that about 75% of homicides take place in 3% of counties. And even within those counties the murders are concentrated in a tiny portion in which drug gangs are operating. So when we compare murder rates between countries—say, between the U.S. and Canada—we are really comparing crime in one country with crime in a very tiny portion of American cities where gangs proliferate. What would happen if drugs were legalized? Speaking as an economist who understands the basic law of supply and demand, Lott opined that there is no doubt that crime would decrease while drug use would increase. So it’s a trade-off.

I do not know this area well enough to judge the validity of Lott’s thesis. His data and his plausible causal explanations for the correlations strike me as sound, although I know that proponents of gun control have taken him to task over various statistical issues. Still, I would like to see his fundamental challenge met: is there any city or county in the U.S. where crime and murders have consistently decreased after gun control laws were passed and enforced?

Anecdotally, of course, we are horrified at the innocent people gunned down who would be alive were there no guns anywhere in the country. Just days before Lott’s lecture, in fact, there was the story about the U.S. soldier returning home from Iraq who was shot dead on Christmas day in a dispute over a football team. Had there not been guns in that home the worst thing that probably would have happened is a bit of pushing and shoving and shouting, perhaps a roundhouse punch or two thrown, and a couple of bruised egos in the end. But the problem is that the genie is out of the bottle. Millions of guns are already out there, and short of a Stasi-like police state sweep through every home, business, garage, shack, storage unit, cabin, car, and container in every nook and cranny in every state in the union, gun bans will most likely be honored by the people who least need them and ignored by those who do—the criminals.


Paleolithic Politics

Has there ever been a time when the political process has been so partisan and divisive? Yes, actually: one has only to recall the rancorousness of the Bush-Gore or Bush-Kerry campaigns, hearken back to the acrimonious campaigns of Nixon or Johnson, read historical accounts of the political carnage of both pre- and post-Civil War elections, or watch HBO’s John Adams series to relive in full period costuming the bipartite bitterness between the parties of Adams and Jefferson, to realize just how myopic our perspective is.

We can go back even further into our ancestral past to understand why the political process is so tribal. But for the business attire donned in the marbled halls of Congress, we are a scant few steps removed from the bands and tribes of our hunter-gatherer ancestors, and a few more leaps afield from the hominid ancestors roaming together in small bands on the African savannah. There, in those long-gone millennia, were formed the family ties and social bonds that enabled our survival among predators who were faster, stronger, and deadlier than us. Unwavering loyalty to your fellow tribesmen was a signal that they could count on you when needed. Undying friendship with those in your group meant that they would reciprocate when the chips were down. Within-group amity was insurance against the between-group enmity that characterized our ancestral past. As Ben Franklin admonished his fellow revolutionaries, we must all hang together or we will surely hang separately.

In this historical trajectory our group psychology evolved, and along with it a propensity for xenophobia: in-group good, out-group bad. Thus it is that members of the other political party are seen as not just wrong but evil and dangerous. Stray too far from the dogma of your own party and you risk being perceived as an outsider, an Other who may not be trusted. Consistency in your beliefs is a signal to your fellow group members that you are not a wishy-washy, namby-pamby flip-flopper, and that you can be counted on when needed.

This is why, for example, the political beliefs of members of each party are so easy to predict. Without even knowing you, I predict that if you are a liberal you read the New York Times, listen to NPR, watch CNN, hate George W. Bush and loathe Sarah Palin, are pro-choice, anti-gun, adhere to the separation of church and state, are in favor of universal healthcare, vote for measures to redistribute wealth and tax the rich in order to level the playing field, and believe that global warming is real, human caused, and potentially disastrous for civilization if the government doesn’t do something dramatic and soon. By contrast, I predict that if you are a conservative you read the Wall Street Journal, listen to conservative talk radio, watch Fox News, love George W. Bush and venerate Sarah Palin, are pro-life, anti-gun control, believe that America is a Christian nation that should meld church and state, are against universal healthcare, vote against measures to redistribute wealth and tax the rich, and are skeptical of global warming and/or government schemes to dramatically alter our economy in order to save civilization.

Research in cognitive psychology shows, for example, that once we commit to a belief we employ the confirmation bias, in which we look for and find confirming evidence in support of it and ignore or rationalize away any disconfirming evidence. In one experiment, subjects were presented with evidence that contradicted beliefs they held deeply, and with evidence that supported those same beliefs. The results showed that the subjects recognized the validity of the confirming evidence but were skeptical of the value of the disconfirming evidence. The confirmation bias was poignantly on display during the run-up to the 2004 Bush-Kerry presidential election, when subjects had their brains scanned while assessing statements by both Bush and Kerry in which the candidates clearly contradicted themselves. Half of the subjects were self-identified as “strong” Republicans and half “strong” Democrats. Not surprisingly, in their assessments Republican subjects were as critical of Kerry as Democratic subjects were of Bush, yet both let their own preferred candidate off the evaluative hook. The brain scans showed that the part of the brain most associated with reasoning, the dorsolateral prefrontal cortex, was quiet. Most active were the orbital frontal cortex, which is involved in the processing of emotions; the anterior cingulate, which is associated with conflict resolution; and the ventral striatum, which is related to rewards.

In other words, reasoning with facts about the issues is quite secondary to the emotional power of first siding with your party and then employing your reason, intelligence, and education in the service of your political commitment.

Our political parties today evolved out of the Paleolithic parties of the past.


As Far As Her Eyes Can See

A review of Lisa Randall’s Knocking on Heaven’s Door: How Physics and Scientific Thinking Illuminate the Universe and the Modern World (Ecco, 2011).

LISA RANDALL HAS BEEN JUSTLY RECOGNIZED by Time magazine as one of the “100 most influential people in the world” for her work in theoretical particle physics. From her position at Harvard University, she often travels: to the European Laboratory for Particle Physics, CERN, in Switzerland, where her theories are being put to the test in the Large Hadron Collider (LHC); to speaking engagements with professional and public audiences about her work in particular and the awe and wonder of science in general; and to rock formations where her chalked fingers can find ways to defy gravity. On the side, she writes popular books, such as her acclaimed Warped Passages.

In Knocking on Heaven’s Door, Randall picks up the story from where she left off when the LHC was years away from first collision, expanding her horizon from, as she poetically puts it, “what’s so small to you is so large to me” to “what’s so large to you is so small to me.” In other words, the book ranges from the smallest known particles to the entire bubble universe, from 10−35 meters (the Planck length, where quantum gravity rules) to 1027 meters (the entire visible universe, 100 billion light-years across, where dark matter and dark energy dominate), a stunning 62 orders of magnitude. (Randall correctly notes the age of the universe at 13.75 billion years, clarifying her apparently paradoxical figure of 100 billion light-years thusly: “The reason the universe as a whole is bigger than the distance a signal could have traveled given its age is that space itself has expanded.” She unpacks that sentence in the book.)
