Preamble: following on from yesterday’s post, today I’m going to give my version of facts and why they’re not really that important. And I’m going to start with a re-telling of a study on the effects of education on anti-vaccination believers.
There is an episode of Slate Money in which Felix Salmon talks about antivaxxers*, and specifically about the study “Effective Messages in Vaccine Promotion: A Randomized Trial”, which was published in the journal Pediatrics in April 2014.
*I’m really not sure if that’s the right spelling. But I’m going with it.
Facts: The Stubbornness of Conviction
In the study, 1,759 parents were exposed to one of four promotional messages for vaccination:
- One group received information explaining how autism is not linked to the Measles-Mumps-Rubella (MMR) vaccine, including references to studies that show there is no link.
- A second group received information on the dangers of the diseases prevented by the MMR vaccine.
- A third group received images of children suffering from measles, mumps and rubella.
- A fourth group received a “dramatic narrative about an infant who almost died from measles”.
There was also a fifth group that received an unrelated promotional message on the costs and benefits of bird feeding (they were the control group). In case you’re interested, the full script of each of those promotional messages can be found here: Supplementary Information.
- None of those promotional messages increased the likelihood of vaccination.
- And more awkwardly: antivaxxer parents reported even lower intent to vaccinate their children after reading the corrective material.
This is worse than a simple case of confirmation bias, where you only accept evidence that confirms what you already believe. This is contradictory evidence backfiring, making the unconvinced even more unconvinced than they were before.
Think of how crazy this is. Logically speaking, if you’re presented with strong evidence that contradicts your position, then you should at least have some doubts. Even if it only drives you to hunt for further research that refutes said strong evidence. It’s simply irrational to feel affirmed by strong evidence to the contrary.
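That “you should at least have some doubts” point is really just Bayes’ rule. Here’s a minimal sketch with made-up numbers (the probabilities are my own assumptions for illustration, not anything from the study): as long as the evidence you’re shown is genuinely more likely when your belief is false than when it’s true, a rational update can only lower your confidence, never raise it.

```python
# A toy Bayes update (illustrative numbers only, not from the study).
# H = your current belief; E = strong evidence against it.
prior = 0.90            # confidence in H before seeing the evidence
p_e_given_h = 0.2       # chance of seeing E if H is true
p_e_given_not_h = 0.8   # chance of seeing E if H is false

# Bayes' rule: P(H|E) = P(E|H)P(H) / [P(E|H)P(H) + P(E|~H)P(~H)]
posterior = (prior * p_e_given_h) / (
    prior * p_e_given_h + (1 - prior) * p_e_given_not_h
)
print(round(posterior, 3))  # prints 0.692 -- confidence falls from 0.90
```

Becoming *more* confident after seeing that evidence, as the antivaxxer parents apparently did, is the opposite of what any consistent updating rule allows.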
Researchers sometimes call this the “backfire effect”, and it’s obviously widespread.
But now, let me also introduce another form of bias: the Dunning-Kruger Effect.
“80% of drivers think they’re above average”
In 1999, David Dunning and Justin Kruger from Cornell University conducted a series of studies into the self-assessment of competence. Entertainingly:
“The study was inspired by the case of McArthur Wheeler, a man who robbed two banks after covering his face with lemon juice in the mistaken belief that, because lemon juice is usable as invisible ink, it would prevent his face from being recorded on surveillance cameras.”
The study was titled “Unskilled and Unaware of It: How Difficulties in Recognizing One’s Own Incompetence Lead to Inflated Self-Assessments”. Here is the abstract:
People tend to hold overly favorable views of their abilities in many social and intellectual domains. The authors suggest that this overestimation occurs, in part, because people who are unskilled in these domains suffer a dual burden: Not only do these people reach erroneous conclusions and make unfortunate choices, but their incompetence robs them of the metacognitive ability to realize it. Across 4 studies, the authors found that participants scoring in the bottom quartile on tests of humor, grammar, and logic grossly overestimated their test performance and ability. Although their test scores put them in the 12th percentile, they estimated themselves to be in the 62nd. Several analyses linked this miscalibration to deficits in metacognitive skill, or the capacity to distinguish accuracy from error. Paradoxically, improving the skills of the participants, and thus increasing their metacognitive competence, helped them recognize the limitations of their abilities.
To summarise all that:
- there are people who don’t know things; and
- worse, they don’t know enough to know how much they don’t know; and
- worse still, that means they simply cannot comprehend how much more other people know than they do.
Meanwhile, they’ll blissfully continue thinking that they know everything. And, moreover, that they know a lot more than anyone else.
But on the other side of the scale:
- Skilled people know things; and
- they know enough to know that they don’t know everything; but
- apparently, they underestimate how much other people don’t know.
And finally, the unknowledgeable “recognise and acknowledge their lack of skill only after they are exposed to training for that skill”.
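The miscalibration pattern described above can be sketched with a toy simulation. The model and every number in it are my own assumptions (not the paper’s data): suppose everyone’s self-estimate is a blend of their true percentile and a flattering anchor. That single assumption reproduces the qualitative pattern, with the bottom quartile overestimating badly and the top quartile underestimating.

```python
import random

random.seed(0)

# Toy model: self-estimate = 30% true percentile + 70% optimistic anchor.
# The 65th-percentile anchor is an assumption chosen for illustration.
N = 1_000
ANCHOR = 65

actual = [random.uniform(0, 100) for _ in range(N)]
estimated = [0.3 * a + 0.7 * ANCHOR for a in actual]

def quartile_means(lo, hi):
    """Mean actual and estimated percentile for scores in [lo, hi)."""
    pairs = [(a, e) for a, e in zip(actual, estimated) if lo <= a < hi]
    return (sum(a for a, _ in pairs) / len(pairs),
            sum(e for _, e in pairs) / len(pairs))

bot_a, bot_e = quartile_means(0, 25)    # bottom quartile
top_a, top_e = quartile_means(75, 100)  # top quartile
print(f"bottom quartile: scored ~{bot_a:.0f}th, guessed ~{bot_e:.0f}th")
print(f"top quartile:    scored ~{top_a:.0f}th, guessed ~{top_e:.0f}th")
```

The toy numbers won’t reproduce the paper’s exact 12th-versus-62nd-percentile gap, but the shape is the same: the worst performers inflate their self-assessment the most, while the best performers sell themselves short.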
Which only goes to confirm:
“Real knowledge is to know the extent of one’s ignorance” ~ Confucius
“The Foole doth thinke he is wise, but the wiseman knowes himselfe to be a Foole” ~ Shakespeare
“Ignorance more frequently begets confidence than does knowledge” ~ Charles Darwin
“One of the painful things about our time is that those who feel certainty are stupid, and those with any imagination and understanding are filled with doubt and indecision” ~ Bertrand Russell
Which is why 80% of drivers can believe that they’re above average.
Most bad drivers are so bad at driving that they don’t even know what good driving is, let alone that they aren’t close to achieving it.
Putting those two biases together
Here is the problem:
- Ignorant people are unaware of their own ignorance, and this can only be fixed with education.
- But there are some forms of ignorance which not only cannot be fixed with education, but are actually made worse by it.
Which leaves us a bit stuck, doesn’t it?
It makes you wonder whether some countries haven’t already hit peak returns from education, and are now reaping negative marginal returns.
In the form of increasing polarization.
Just a demographic thought.
Rolling Alpha posts about finance, economics, and sometimes stuff that is only quite loosely related. Follow me on Twitter @RollingAlpha, or like my page on Facebook at www.facebook.com/rollingalpha. Or both.