The greatest enemy of knowledge is not ignorance; it is the illusion of knowledge.
– Daniel J. Boorstin
Why do researchers give placebos? Why do they have control conditions and double-blind studies? Why do they follow Karl Popper’s lead and try to prove that their pet theories are wrong instead of right? At least part of the answer to these questions has to do with the king of all the biases: The Confirmation Bias.
To understand the confirmation bias let’s start with a story written long before psychologists even knew such a bias existed. We begin with a scene from Mark Twain’s The Adventures of Tom Sawyer.
Tom has just dug up a wooden box that he hid beneath a log and says “What hasn’t come here, come! What’s here, stay here!” He opens the box and, to his disappointment, finds a single marble inside.
Then he tossed the marble away pettishly, and stood cogitating. The truth was, that a superstition of his had failed, here, which he and all his comrades had always looked upon as infallible. If you buried a marble with certain necessary incantations, and left it alone a fortnight, and then opened the place with the incantation he had just used, you would find that all the marbles you had ever lost had gathered themselves together there, meantime, no matter how widely they had been separated. He had many a time heard of this thing succeeding, but never of its failing before. It did not occur to him that he had tried it several times before, himself, but could never find the hiding places afterwards. He puzzled over the matter some time, and finally decided that some witch had interfered and broken the charm. He thought he would satisfy himself on that point; so he searched around till he found a small sandy spot with a little funnel-shaped depression in it. He laid himself down and put his mouth close to this depression and called:
“Doodle-bug, doodle-bug, tell me what I want to know! Doodle-bug, doodle-bug tell me what I want to know!”
The sand began to work, and presently a small black bug appeared for a second and then darted under again in fright.
“He dasn’t tell! So it was a witch that done it. I just knowed it.”
Mark Twain was a keen observer of cognitive biases long before science had named them. Tom is faced with evidence that his magic didn’t work, yet rather than take it as evidence that disconfirms magic, he busily looks for magical evidence to confirm it in another way. That poor doodle-bug had no idea he was being yelled at in the service of a superstition.
The initial research that identified the confirmation bias was published in 1960 by the British psychologist Peter Wason. The study he designed, like most great research, was very simple. A person was given three numbers that followed a rule. For example, “2-4-6.” The person’s job was to figure out what the rule was. Before submitting an answer they could test it an unlimited number of times by giving Wason number sets of their own creation. Wason would say whether those sets followed the rule or not. Once the person had tested enough and was sure they had the right answer, they wrote it down and submitted it to Wason. With unlimited testing it should be simple, right? Not exactly.
Most people got it wrong.
The reason people kept failing at Wason’s simple number game was the same reason Tom yelled at the doodle-bug. They only tested to see if their guess was right, never to see if it was wrong. Given the set 2-4-6, people often guessed that the rule was “increases by two.” But stop a moment and ask yourself: how would you test whether that rule is true? If you thought to test it with sets such as “10-12-14” or “8-10-12,” then you are like most people, who check a rule by submitting number sets that follow it and seeing if they conform. If you did this, Wason would nod and affirm that your test sets do indeed fit the rule. But if you made “increases by two” your answer, you would be wrong. And you would, like almost everyone else who played the game, be very surprised to discover that the rule was actually “each number in the set is larger than the last.”
What Wason discovered is not just a problem people have with math, but a flaw in the very way we think about the nature of evidence. He found that people consistently fail to test their ideas in ways that might show them to be wrong. Instead, people intuitively do the opposite: they look to see if they are right. In the case above, all you would need to do to discover that your guessed rule is wrong is submit the set 1-2-3. Wason would nod and tell you that this set follows the rule, and instantly you would know that “increases by two” cannot be right. But over and over again, people never thought to do this. Wason coined this tendency the “confirmation bias” (Wason, 1960).
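For readers who like to see the logic laid bare, here is a minimal sketch of Wason’s task in Python. The function names are my own labels, not anything from the original study; the point is simply that sets chosen to fit your guess can never distinguish it from the true rule, while one disconfirming set settles the matter.

```python
def actual_rule(nums):
    # Wason's real rule: each number is larger than the last
    return all(a < b for a, b in zip(nums, nums[1:]))

def guessed_rule(nums):
    # The typical guess: each number increases by exactly two
    return all(b - a == 2 for a, b in zip(nums, nums[1:]))

# Confirming tests: sets built to fit the guess also satisfy the
# true rule, so they can never expose the difference.
for triple in [(2, 4, 6), (10, 12, 14), (8, 10, 12)]:
    assert actual_rule(triple) and guessed_rule(triple)

# One disconfirming test: 1-2-3 fits the true rule but violates
# the guess, instantly revealing that "increases by two" is wrong.
assert actual_rule((1, 2, 3)) and not guessed_rule((1, 2, 3))
```

Notice that you could submit confirming triples forever without learning anything new; only the test designed to fail your hypothesis carries information.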
Confirmation bias is the bedrock of all the biases. Once we believe that something is true, even if we arrived at that conclusion incorrectly, with bad evidence, or even just bad guesswork, we then seek ways to confirm what we already believe. If you believe that the COVID vaccine is dangerous then you will look for testimonials and anecdotes that confirm it and ignore the millions of safe vaccinations. If you believe that you can tell the future from your dreams, you will pay close attention to times when it seemed like your dreams tipped you off while ignoring the times your dreams missed the mark. If you believe that the election was stolen then you will search out news sources that say you are right, and dismiss all others as “fake news.” When you step back and look at the big picture, that small glitch in our mind’s software that Wason discovered has huge implications in our modern world.
It could be argued that the scientific method, which focuses on falsifying hypotheses rather than proving them correct, is needed by our species because this particular cognitive glitch is so strong. If we encountered an alien species without the confirmation bias, their scientific method might look different. The confirmation bias influences us in so many ways day to day that we hardly notice it at all, but once we become aware of it we see it in our own thinking all the time.
Confirmation bias has become a much stickier problem as the internet becomes people’s primary source of information. Social media and search algorithms are confirmation bias engines, feeding us information the algorithm guesses we will like. And what we like are bits of information that confirm our beliefs. Social media is a kind of bias magnifier, giving each of us tools to more easily like, look for, and share information that confirms our beliefs. Cable news programs are another steady source of confirmation bias, interpreting events in ways that confirm the beliefs of their loyal audiences. The confirmation bias not only props up the beliefs we already have by cherry-picking the things we pay attention to; it now drives us to select which sources of information we follow. It narrows the scope of what we see and hear so that the world appears to conform to our beliefs.
Given how important this bias is, it is especially helpful for kids to learn about it early. You can teach your kids about biases like these by reading about characters like Tom Sawyer, doing activities together that highlight the confirmation bias and how to avoid it (like science projects), and pointing it out whenever you spot it. By raising kids to look out for the confirmation bias, we can give them a leg up on Tom and that poor doodle-bug. Who knows, they may be able to avoid the traps that social media and cable news set for modern audiences.
Stay tuned for more activities in the weeks ahead to do with your little skeptical kid to teach them about this important bias.
Want to read a fun adventure story with your kids that teaches the cognitive biases? Check out the Beyond Belief series. In book two the hero foils a plot by a sinister pyramid scheme to take over the world! Loaded with critical thinking skills and concepts each book is a fun way to learn and grow!