In a recent episode of Slate Money, Felix Salmon was talking about antivaxxers*, and specifically about “Effective Messages in Vaccine Promotion: A Randomized Trial”, a study published in the journal Pediatrics in April 2014.
*I’m really not sure if that’s the right spelling. But I’m going with it.

The Stubbornness of Conviction

In the study, 1,759 parents were randomly assigned to receive one of four promotional messages for vaccination:

  1. One group received information explaining how autism is not linked to the Measles-Mumps-Rubella (MMR) vaccine, including references to studies that show there is no link.
  2. A second group received information on the dangers of the diseases prevented by the MMR vaccine.
  3. A third group received images of children suffering from measles, mumps and rubella.
  4. A fourth group received a “dramatic narrative about an infant who almost died from measles”.

There was also a fifth group that received an unrelated promotional message on the costs and benefits of bird feeding (they were the control group). In case you’re interested, the full script of each of those promotional messages can be found here: Supplementary Information.

The results:

  1. None of those promotional messages increased the likelihood of vaccination.
  2. And more awkwardly: antivaxxer parents were even less likely to have their children vaccinated after reading the informational material.

This is worse than a simple case of confirmation bias where you only accept evidence that confirms what you already believe. This is contradictory evidence backfiring to make the unconvinced even more unconvinced than they were before.
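
To make that concrete: in trial data, a backfire shows up as an interaction between the message a parent received and their prior attitude toward vaccines. Here’s a minimal simulation sketch of that pattern in Python. The five arms and the sample size come from the study; the response model and all of its numbers are made up for illustration, not taken from the study’s data or analysis:

```python
# A toy sketch of a conditional ("backfire") treatment effect.
# Arm names and the 1,759 sample size mirror the study; everything
# else here is invented for illustration.
import random

random.seed(0)

ARMS = ["autism_correction", "disease_risks", "disease_images",
        "scary_narrative", "control"]

def intent_to_vaccinate(prior, arm):
    """Hypothetical response model: intent tracks the parent's prior
    attitude, except that the correction message *lowers* intent for
    parents with unfavourable priors (the backfire)."""
    intent = prior
    if arm == "autism_correction" and prior < 0.3:
        intent -= 0.15  # the backfire, among the least favourable parents
    return max(0.0, min(1.0, intent + random.gauss(0, 0.05)))

# Randomise 1,759 parents (prior attitude in [0, 1]) across the arms,
# then compare mean intent within the least-favourable subgroup.
parents = [(random.random(), random.choice(ARMS)) for _ in range(1759)]

for arm in ARMS:
    subgroup = [intent_to_vaccinate(prior, a) for prior, a in parents
                if a == arm and prior < 0.3]
    print(f"{arm:17s} mean intent (unfavourable priors): "
          f"{sum(subgroup) / len(subgroup):.2f}")
```

(In the actual study, it was the autism-correction message that reduced reported intent to vaccinate among the parents with the least favourable attitudes.)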

And to be clear, I’m not saying that antivaxxers are stupid or misinformed (even if I’m pro-vaccination). But I am saying that, logically, if you’re presented with strong evidence that contradicts your position, then you should at least have some doubt – at least until you find further research that refutes said strong evidence. It’s simply irrational to feel affirmed by strong evidence to the contrary.

I couldn’t really find a name for this type of bias, although it’s obviously widespread. That said, frankly, I don’t think we’re dealing with biased thinking here, because this type of situation sounds a lot like the conviction of religious fanaticism. Which isn’t really “thinking” per se.

What to make of this?

Well, before I get there, I want to talk about another form of bias: the Dunning-Kruger Effect.

“80% of drivers think they’re above average”

In 1999, David Dunning and Justin Kruger from Cornell University conducted a series of studies into the self-assessment of competence. Entertainingly:

“The study was inspired by the case of McArthur Wheeler, a man who robbed two banks after covering his face with lemon juice in the mistaken belief that, because lemon juice is usable as invisible ink, it would prevent his face from being recorded on surveillance cameras.”

The study was titled “Unskilled and Unaware of It: How Difficulties in Recognizing One’s Own Incompetence Lead to Inflated Self-Assessments”, and here’s the abstract:

People tend to hold overly favorable views of their abilities in many social and intellectual domains. The authors suggest that this overestimation occurs, in part, because people who are unskilled in these domains suffer a dual burden: Not only do these people reach erroneous conclusions and make unfortunate choices, but their incompetence robs them of the metacognitive ability to realize it. Across 4 studies, the authors found that participants scoring in the bottom quartile on tests of humor, grammar, and logic grossly overestimated their test performance and ability. Although their test scores put them in the 12th percentile, they estimated themselves to be in the 62nd. Several analyses linked this miscalibration to deficits in metacognitive skill, or the capacity to distinguish accuracy from error. Paradoxically, improving the skills of the participants, and thus increasing their metacognitive competence, helped them recognize the limitations of their abilities.
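
That 12th-versus-62nd percentile gap is the signature of the effect: self-assessments barely track actual skill, and they cluster around a flattering “above average” anchor. Here’s a toy model in that spirit. Only the two bottom-quartile figures come from the abstract; the model and its parameters are my own assumptions, chosen just to reproduce those two numbers:

```python
# A toy model of the miscalibration described in the abstract. Only
# the bottom-quartile figures (actual ~12th percentile, self-assessed
# ~62nd) come from the paper; the model and its parameters are
# assumptions, chosen just to reproduce those two numbers.
import random

random.seed(1)

def perceived_percentile(actual):
    """Self-estimate = a weak signal of actual skill, plus a
    flattering 'above average' anchor, plus noise."""
    return max(0, min(100, 0.2 * actual + 60 + random.gauss(0, 8)))

# One person at each actual percentile, 0..99.
people = [(actual, perceived_percentile(actual)) for actual in range(100)]

bottom_quartile = [(a, s) for a, s in people if a < 25]
avg_actual = sum(a for a, _ in bottom_quartile) / len(bottom_quartile)
avg_seen = sum(s for _, s in bottom_quartile) / len(bottom_quartile)
print(f"bottom quartile: actual percentile ~{avg_actual:.0f}, "
      f"self-assessed ~{avg_seen:.0f}")
```

The point of the small slope (0.2) is that self-assessment barely responds to actual skill, which is exactly why the gap between perceived and actual ability is largest at the bottom.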

To summarise, incompetent people:

  1. don’t know things; and
  2. don’t know enough to know how much they don’t know; and
  3. don’t know enough to know how much more other people know than they do.

Meanwhile, they’ll blissfully carry on thinking that they know everything. And, moreover, that they know a lot more than anyone else.

[Images: “experts” vs “non-experts”]

And as for competent people:

  1. know things; and
  2. know enough to know that they don’t know everything; but
  3. because they tend to assume that what’s easy for them is easy for everyone else, they underestimate how much other people don’t know.

And finally, incompetent people “recognise and acknowledge their lack of skill only after they are exposed to training for that skill”.

Which only goes to confirm:

“Real knowledge is to know the extent of one’s ignorance” ~ Confucius

“The Foole doth thinke he is wise, but the wiseman knowes himselfe to be a Foole” ~ Shakespeare

“Ignorance more frequently begets confidence than does knowledge” ~ Charles Darwin

“One of the painful things about our time is that those who feel certainty are stupid, and those with any imagination and understanding are filled with doubt and indecision” ~ Bertrand Russell

Which is why 80% of drivers can believe that they’re above average: some bad drivers are so bad at driving that they don’t even know what good driving is, let alone that they aren’t close to achieving it.
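
As a quick sanity check on that statistic: if driving skill is distributed roughly symmetrically (an assumption, but a natural one), then only about half of drivers can actually sit above the mean – so an 80% “above average” figure has to come from miscalibrated self-assessment rather than from the maths. A sketch with made-up skill scores:

```python
# Sanity check (my own example, assuming roughly symmetric skill):
# in a symmetric population, only ~half can be above the mean.
import random

random.seed(2)
skills = [random.gauss(50, 10) for _ in range(10_000)]  # made-up skill scores
mean_skill = sum(skills) / len(skills)
share_above = sum(1 for s in skills if s > mean_skill) / len(skills)
print(f"share actually above the mean: {share_above:.0%}")  # ~50%, never ~80%
```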

So bearing all of that in mind, here are two separate points that need to be brought together:

  1. Ignorant people are unaware of their own ignorance, and this can only be fixed with education.
  2. There are some forms of ignorance which not only cannot be fixed with education, but are actually made worse by it.


Just a hypothetical.

Rolling Alpha posts about finance, economics, and sometimes stuff that is only quite loosely related. Follow me on Twitter @RollingAlpha, or like my page on Facebook at www.facebook.com/rollingalpha. Or both.