The Dark Side of Anecdotes


The power of anecdotes to persuade (see Why Stories Sell and Your Brain on Stories) is established, but there’s a dark side to that power. Quite simply, an effective story can take over our brains to the point where we disregard more valid information: reliable statistics, the opinions of true experts, and so on.

In The Invisible Gorilla, authors Christopher Chabris and Daniel Simons note that measles, a disease that had been virtually eradicated in the United States, has been popping up in outbreaks of increasing frequency. Why is that happening? The biggest cause is that a growing number of families consciously opt out of routine childhood vaccinations. In an environment of universal vaccination, even if a carrier of the disease arrives from another country, an outbreak is all but impossible. If, however, a sizable minority of the population is NOT vaccinated, the illness can spread readily among that group while carriers are not yet exhibiting symptoms.

A key reason for the lack of universal vaccination is that some people believe childhood vaccinations cause autism. While there is a mountain of scientific evidence showing that this is not the case, celebrity Jenny McCarthy became a vocal spokesperson for the supposed vaccine/autism connection. Her efforts included an emotional appearance on The Oprah Winfrey Show. The combination of Oprah's uncritical acceptance of McCarthy's beliefs and the show's enormous audience was a key step in launching the anti-vaccine movement. Despite widespread debunking by researchers, doctors, and infectious disease specialists, many people still believe in the supposed autism/vaccine link.

Multiple Brain Fails

Actually, this episode illustrates several ways our brains fail us. First, our brains want to find patterns in our observations, particularly cause-effect relationships. At an earlier stage of humanity, this served us well. If we ate a new fruit, and became ill, we would avoid it in the future. Even if there was really another reason for the illness, this strategy was likely to be better geared to survival than, say, trying the fruit three times before concluding that it was dangerous. In today’s complex world, relationships are less obvious. If we received an injection and the site of the injection swelled up like a balloon, we would be justified in concluding that the injection was a problem. Conversely, if we received the same injection and our car was stolen, we would dismiss it as coincidence.

What if we received an injection and, a week later, began to develop headaches? At least a few of us might think there was a connection. And if a friend mentioned having headaches, and then confirmed that he had the same shot a month before, we might feel justified in making a connection. The problem with anecdotal evidence is that it is entirely uncontrolled. The injection/headache link might be a total coincidence, or external factors could combine to establish a false link. For example, everyone might get flu shots within a narrow time frame. Any subsequent problem (high pollen counts, unusual weather conditions, etc.) might cause symptoms in multiple flu shot recipients, who could compare notes and blame the shot.

It seems that’s what’s happening with autism. Symptoms of autism generally become evident at an age not long after childhood vaccines are given, and a few people saw cause and effect. A since-discredited medical study fanned the flames and, of course, once the meme began to spread, others interpreted the coincidence as cause and effect too.

The second failure is “confirmation bias.” Once we have an established belief, we tend to accept information that supports that belief and reject whatever does not. Hence, the publication of research showing no link between childhood vaccines and autism didn’t dissuade as many people from the belief as one might expect.

The combination of a powerful anecdote transmitted via a trusted source, our brain’s need to find cause-and-effect relationships, and the difficulty of changing beliefs due to confirmation bias works together to create an environment where sound science can be trumped by misinformation. This is indeed the dark side of effective anecdotes. Use this power wisely.

  1. Régis Kuckaertz says

    The combination of anecdotes + confirmation bias looks a lot like a cognitive dissonance where one almost automatically takes the “wrong” path. An example: lots of web designers get the fact that web design is 95% typography, but most of the time they take the rules of (print) typography for granted and apply them blindly to the web. It is not surprising: we have several centuries of experience during which the rules of typography have evolved and proved to be effective… who would question that?

    It is much easier to think that ‘if it worked for 500 years on paper then it should work on screen’ than to say ‘we have to rethink everything from the ground up’.

    Have you ever heard of a way to counteract confirmation bias?

    (on a side note, you might want to edit your post, the whole ‘Multiple Brains Fail’ section appears twice)

  2. Roger Dooley says

    Thanks, Regis. There was some kind of connection fail when posting, and somehow the text got doubled.


  3. Rick Hardy says

    Roger, excellent post! You’ve nailed an important point. I’d just add that I think a lack of critical thinking, a suspicion of data, and a tendency to believe in conspiracy theories, all of which are implicit in your post, are significant reasons why some believe stories in spite of conflicting evidence. In some ways, it’s understandable: what we’re told about things seems to change from year to year (e.g., dietary studies). Some of this may be due to news coverage of research studies that are about correlation but reported as causation. So, we develop a suspicion of research and data, and can’t distinguish legitimate studies from flawed ones. Of course, this is all a matter of critical thinking and a pop culture environment…

    Good stuff. I enjoy your blog.

  4. kare anderson says

    A helpful follow-up to this valuable post on how a powerful story and attribution bias can lead us astray is the book On Being Certain by Robert Burton. I wrote about it here

  5. Mike Kirkeberg says

    The power of celebrity can be amazing. I used to work with kids, many of whom thought that having ‘the’ pair of shoes worn (and sold) by the basketball player of the moment would ‘up their game’, as they would say. I suppose the placebo effect would make this true, until they missed a few shots.

  6. Mike Kirkeberg says

    By the way, I used to subscribe to your blog and it somehow got lost over time. I just found it again through a tweet from Brian Clark @copyblogger.
