The Confirmation Bias Effect
We tend to favour evidence that backs up ideas we already believe in.
You know that turbo shout session you had with *insert person* at *insert place* at *insert time*, after which you furiously whipped out the internet to find any article that backed you up? Yep, I’m perpetually there. Opinions make us. Our beliefs, inextricably shaped by those we spend time with, define who we believe we are. They give us meaning, so it’s no wonder we’re so precious about them. It makes sense that we prefer the company of people whose opinions match our own; although an echo chamber of agreement doesn’t always make for the most stimulating debate, it’s likely a good source of comfort to find out that your fellow Facebookers just aren’t into getting piggy with it, Cameron style. (Alright, that was easy. And no one was cool with it.) Porcine profanation aside, birds of a feather flock together, so that makes it easy to feel righteous. Even when actively trying to research against belief, we overlook evidence that challenges our opinion in favour of that which backs it up, and that’s Confirmation Bias.
The Illusory Correlation also comes into play here — we perceive relationships between variables that demonstrate no such thing, especially when it supports our argument. If you find yourself auto-knee-jerking in the face of confrontation — to, say, the suggestion that Morrissey is actually, really just an inexcusable twat — there’s a chance you’re ignoring the facts. (This could be delusion too, but let’s leave it there. Maybe you’ve no strong feelings about him either way.) Confirmation Bias leads us to favour data which backs up prior beliefs, even in the face of contradictory evidence. An offshoot of this that goes even deeper is the proposition of identity-protective cognition — that we cannot trust our own opinions, since our brains are biased towards protecting them. Think gun control, conspiracy theories, and climate change: compelling data seems only to result in heels, not heads, being driven further into the ground, especially when those topics are highly politically charged.
The Barnum Effect
Generalisms become prophetical and profound.
An offshoot of Confirmation Bias, named after the infamous hoaxer P. T. Barnum. Here we have something a little more specific: people will rate descriptions of their personality as highly accurate when they believe those descriptions were tailored to them personally, only the “personality traits” in question are general enough to apply to anyone. It’s the Cosmopolitan quiz equivalent of “yeeesss, I’d love an omelette right about now!” (Alright, maybe that was a stretch, but it’s one of the best Simpsons quotes.) You only hear what you want to hear, and you only read what you want to read between the lines.
Take horoscopes: the premonitions in each piece of ‘wisdom’ aren’t written obliquely by accident. Because people want to believe the advice they’ve received, they’ll comb back through their memories favourably to make it so, only taking into account the bits where the story fits. In the same way that our brains are programmed to recognise faces — which is why people see Jesus on pieces of toast — we are programmed to find favourable evidence for our own beliefs. It is the very definition of wishful thinking, but consider this: even Charlie Brooker won’t lay claim to being a fortune teller. Maybe it was a sixth sense.
The Backfire Effect
Even in the face of hard evidence (especially then), we won’t change our minds.
Remember identity-protective cognition? This is pretty similar. Given that the press presents most information in a 50/50 argument in order to strive towards objectivity (much to the irritation of those concerned with Global Warming), we tend to pick the side we err towards — just as with confirmation bias. And, yes, for a lot of media it’s barely the facade of being objective. This ambiguous way of delivering information means that we’re well trained to think in polar opposites — it’s either right or it’s wrong — which isn’t easy to escape or useful in finding truth. (Let’s ignore the philosophy of what ‘truth’ really is, for now.) It stands that with many debates there are grey areas, so misplaced objectivity is good to bear in mind. TL;DR: Just because a person disagrees with you, don’t assume they’re wrong. Especially when you consider the following.
When confronted with solid evidence that counters strongly held beliefs, the overwhelming reaction is to get defensive. This is known as the “backfire effect”, as studied by researchers at Dartmouth, who found that “citizens are likely to resist or reject arguments and evidence contradicting their opinions” — even when that evidence comes from a single, seemingly authoritative source. For example, if you argue with someone who believes that cannabis cures cancer, and offer data to counter that belief, chances are “big pharma” will get the blame. The classic response of, “well, I’ve not read enough about it” (read: “I am not going to change stance based on what you’ve said”), can be quite a conversation stopper and a means to hold position — the very definition of shutting it down. Sound familiar? You’re probably treading on toes too. If that’s the case, bringing up The Backfire Effect won’t be helpful, but you can at least feel vindicated. It’s either that or sticking your fingers in your ears and shouting “lalala” to graciously make your point.
The Not Invented Here Syndrome
We tend to be more critical of someone else’s ideas than our own.
This goes beyond the shut down to a new level of “why don’t we just do what you wanna do?”. Even though everyone knows the tenet, “two minds are better than one”, when one’s ideas are criticised, it’s a hard pill to swallow. Not to get all sonder-philosophical, it’s easy to get why it feels unreasonable to ignore your own wealth of data — anecdotal or otherwise. Empathetic or not, the main character syndrome is real. Sure, this one’s more philosophy than syndrome, but we’ve all been there: someone rudely foot-mashed over your very legitimate idea to replace currency with interpretive dance styled solely on Wacky Waving Inflatable Arm-Flailing Tubeman, and they did it rather unceremoniously with the remnants of slug on their Crocs.
The reaction is pretty automatic — you assume they are entirely incorrect. Whether you consider it an inherent form of tribalism, like being unwilling to adopt a foreign culture; a matter of pride, jealousy, and ownership; or simply rejection based on a lack of understanding, we’ve all had moments of NIH. “Let’s not reinvent the wheel”, they said. But radial tyres made it over eventually, yo. Although it’s hard not to apply a harsher standard to everyone else’s ideas than to your own, it doesn’t hurt to have a quiet word… with yourself. Humility is pretty rad, and dismissing change for the sake of holding on to your own ideas is not the best way to grow. You’re not Neo (she says to herself). No need to be a Luddite. No need to be conservative. The more vantage points, the better the view.
The Gambler’s Fallacy
We misunderstand the maths.
Oh, you thought you had the odds, but it’s a dicey bet. (Sorry.) We’ve all done it, and it’s on account of our inability — perhaps innate — to understand the maths of chance. We put extra weight on previous outcomes, as if they’ll affect future ones, because it just feels right. Consider this: you’ve just thrown four heads in a row. It feels as though that means you’re less likely to throw another. But each throw still has a 50% chance of either outcome: a past coin toss does not affect the future outcome of a fair coin toss. Why? Because the slim chance of five heads in a row only existed before you started tossing — before your five coin tosses, it was indeed a 1/32 probability. Geddit? Each individual time you throw the coin, it’s still 50/50 heads or tails.
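If you don’t trust the arithmetic, you can check it yourself. Here’s a minimal simulation sketch in Python (the trial count and seed are arbitrary choices): five heads in a row is rare up front, but once four heads have already landed, the fifth toss is still a coin flip.

```python
import random

random.seed(42)

# The up-front probability of five heads in a row: (1/2)^5
print((1 / 2) ** 5)  # 0.03125, i.e. 1/32

# Simulate: among runs that start with four heads,
# how often is the fifth toss also heads?
trials = fifth_heads = 0
for _ in range(1_000_000):
    tosses = [random.random() < 0.5 for _ in range(5)]
    if all(tosses[:4]):          # condition on four heads already thrown
        trials += 1
        fifth_heads += tosses[4]

print(fifth_heads / trials)      # hovers around 0.5, not anywhere near 1/32
```

The conditioning step is the whole point: the 1/32 figure describes all five tosses taken together, while the simulation asks about the fifth toss alone, given a past that has already happened.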
We’re hardwired to see patterns — we find meaning in nothing to evidence our beliefs — so it seems instinctive to assume previous coin tosses must have an effect. Most of us don’t truly understand the concept of random: we even demanded that iTunes shuffle mode be made less random so that it would feel more random. True randomness meant that two songs from the same album could well crop up next to each other, just as the same song could be repeated over and over. What we really wanted was to hear every song on a playlist shuffled and played through once, with maybe the odd repeat thrown in to keep us guessing, and so it was changed. Spotify had the same problem. Seeing patterns helped us become so ruddy clever, but it also built in a few flaws along the way. TL;DR: Gambling is addictive.
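The shuffle complaint is easy to see in a few lines of Python. This is a hedged sketch, not Apple’s or Spotify’s actual algorithm (the track names are made up): picking independently at random allows repeats, whereas the shuffle-through-once behaviour listeners wanted plays each song exactly once per pass.

```python
import random

random.seed(0)
playlist = ["Track A", "Track B", "Track C", "Track D"]

# Truly random: pick independently each time; back-to-back repeats
# are perfectly normal and will happen.
truly_random = [random.choice(playlist) for _ in range(8)]

# What listeners actually wanted: shuffle the playlist, then play it
# straight through, so every song appears exactly once per pass.
one_pass = playlist[:]
random.shuffle(one_pass)

print(truly_random)  # may contain repeats, even consecutive ones
print(one_pass)      # each song exactly once, in a random order
```

The “feels more random” fix is essentially the second version: no repeats within a pass, which is less random in the mathematical sense but matches our pattern-hungry expectations.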
Want more? Cool. Coming next: The Ambiguity Effect, Neglecting Probability, Negativity Bias, and whatever else I can lay my eyes on.