Monday, September 22, 2008

Superstition: Was Stevie Wonder Wrong?

Having grown up in a fairly superstition-free household (exception: see my mother’s diet books), I have always experienced a good deal of discomfort when confronted with a real-life, authentic subscriber to the more overt and antiquated superstitions of our time. My first experience with superstition occurred while walking down the sidewalk with my friend Helen in the first grade. I can’t remember the exact circumstances, but I know that she suddenly shouted at me, triumphantly and mockingly, “Stepped on a crack, you’ll break your mother’s back!” When pressed for an explanation linking cause and effect, Helen retreated into a self-satisfied smirk and advised me, arms crossed over her chest, to “wait and see.”

In the end, Helen proved fond of spurning appeals to reason. Our friendship ended abruptly during the summer after first grade, on the swing set behind her house. I had done something wrong and followed up the misdeed with an apology. In response, she told me that “sorry doesn’t feed the bulldog.” Then she ran inside to report me to her mother, who took her side.

The reason I bring up the matter of superstition now, in the first post of a blog supposedly devoted to science: a study published last week by the Royal Society, exploring the evolutionary basis for superstition and superstition-like behavior. Using a mathematical model, the study found that it can indeed benefit an organism to adopt superstitious tendencies, that is, to assign, incorrectly, a link between various causes and effects. The reason why? Something analogous to Pascal’s Wager, the argument for belief in God which holds that the possible benefit outweighs the possible risk (the decision theory behind the Wager is sound, but the argument itself is now understood to rest on a number of problematic assumptions). Carried into the realm of natural selection, the idea is that the cost of holding a superstition (acting on a false causal link) is often lower than the cost of failing to act on a causal link that really does exist; when missing a real connection is expensive enough, selection favors the credulous over the skeptical.
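
To make the trade-off concrete, here is a minimal sketch in Python. The scenario, probabilities, and fitness costs are all invented for illustration; the model in the actual paper is more elaborate than this.

    # Toy expected-cost comparison, with made-up numbers.
    # An organism suspects a causal link (a rustle in the grass means
    # a predator). Acting on a false link costs a little (a wasted
    # sprint); ignoring a real one costs a lot (being eaten).

    def believer_wins(p_link_is_real, cost_missed_link, cost_false_alarm):
        """True if acting on the suspected link has the lower expected cost."""
        expected_cost_skeptic = p_link_is_real * cost_missed_link
        expected_cost_believer = (1 - p_link_is_real) * cost_false_alarm
        return expected_cost_believer < expected_cost_skeptic

    # Even if the rustle means a predator only 1% of the time, fleeing
    # "superstitiously" pays off: 0.99 * 0.5 = 0.495 < 0.01 * 100 = 1.0
    print(believer_wins(0.01, 100.0, 0.5))  # True

The lopsided costs are doing all the work here: when one kind of error is catastrophic and the other merely annoying, the “irrational” policy wins on average.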

Here in New York City, this scientific finding may help explain the wild success enjoyed by our perpetually thriving population of rats. Rats have a number of things going for them in terms of survival: namely, a love of sex coupled with an ironclad reproductive system, and a keen ability to avoid our attempts to kill them. Rats can taste poison at concentrations as low as 0.5 parts per million (think two grains of salt in a pound of peanut butter). Also, and here’s where the superstition comes in, rats pick up social cues about what to eat and what not to eat. If a rat returns to its burrow healthy and reeking of hanger steak from the restaurant down the alley, the other rats in the burrow will learn that the hanger steak is potentially safe. If, on the other hand, an unwell rat returns to the burrow smelling of, let’s say, the same restaurant’s ratatouille, the rats will be likely to consider the ratatouille a potential risk and avoid eating it. These assignments of causality are probably wrong most of the time (the sick rat may never have touched the ratatouille), but they are more likely to benefit the community of rats than to harm it. Thus: superstition as an adaptive behavior.
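
For the skeptical: the peanut-butter analogy roughly checks out. Here is a quick back-of-the-envelope check in Python, where the 0.1 mg mass of a grain of salt is my own ballpark figure, not a number from the book:

    # Sanity check of the 0.5 ppm taste threshold.
    pound_in_mg = 453_592              # one pound, in milligrams
    threshold_ppm = 0.5                # rats' detection threshold for poison
    grain_of_salt_mg = 0.1             # assumed mass of one grain of salt

    detectable_mg = pound_in_mg * threshold_ppm / 1_000_000
    print(detectable_mg)                       # ~0.23 mg of poison
    print(detectable_mg / grain_of_salt_mg)    # ~2.3 grains of salt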

I hate to think that Helen’s persistence in superstition could be, in any way, of evolutionary benefit. I also hate to think that the boy I (and everyone else) knew in high school who refused to wash his jock strap for the entire football season was somehow acting on a successful survival-related impulse. And yet, there’s something comforting in the idea that our own moronic conclusions can help keep us alive. Or is there?

Sources:
Article: Foster, K. R. & Kokko, H., “The evolution of superstitious and superstition-like behaviour,” Proceedings of the Royal Society B.
Also, Jerry Langton’s fun book about rats: Rat: How the World’s Most Notorious Rodent Clawed Its Way to the Top.