The Outdated OS
Cognitive Biases

Shortcuts

Earlier we saw how our conscious minds are often led astray by our outdated emotional OSs. But emotional triggers aren’t the only outdated part of our OS.

Relative to our OS, our conscious minds aren’t very fast. We can’t multitask, we register things much more slowly, and we take far longer to process information.

Even if we did manage to squeeze out some extra speed, we still wouldn’t have the capacity to process the sheer amount of information we receive every single moment; a problem that is only getting worse in the modern world. There is simply too much to process, and a lot of noise that needs to be filtered out. If we had to consider every piece of information available, we would quickly be overwhelmed and incapable of deciding anything.

To make up for this, our OS evolved a whole set of mental shortcuts.

These shortcuts, called heuristics, are essentially information processing rules, allowing us to focus on particular aspects of a complex situation and ignore the rest.

They help us quickly make meaning of complex situations, connecting dots, filling in gaps and simplifying the information we receive so that we can come to rapid conclusions. Given that we have limited memories, they also help us prioritise what things to remember and what to forget.

As with emotions, these heuristics originally conferred an evolutionary advantage, allowing us to make beneficial decisions and judgements quickly without our slowpoke conscious minds needing to do much heavy lifting.

Unfortunately, just as with our emotional triggers, many of these shortcuts are now out of date and incompatible with the modern world, sometimes resulting in irrational lapses of thinking. This is like numbering all of your outfits and each day just automatically wearing them in sequence. Great shortcut, no conscious selection process needed; you instantly know what you’re wearing each day. It’s a real time saver and makes a lot of sense until you end up at your office presentation wearing speedos and sunglasses.

Depending on our GEBE, these shortcuts can vary from person to person, but many are common to us all; we are all vulnerable to some degree. This is the sweet spot where psychologists, behavioural economists and conmen spend their days figuring out how to influence people to buy products, behave in certain ways, and vote for specific candidates.

When these outdated rules fail to work as intended, they lead to systematic deviations from logic and rationality. These errors in thinking are called cognitive biases, and unfortunately skew a great deal of our thoughts and actions.

It turns out that there are many things that we know to be absolutely, 100%, without a doubt, evidently, indubitably, unquestionably and plainly true, that simply aren’t.

These biases cause us to miss out on important information, or conversely focus on things which aren’t useful at all.

In an overzealous effort to construct meaning from the firehose of reality, we find patterns where no patterns exist, or create meaning and associations where there are none to be logically had.

Our OS is also remarkably bad at math, and is quite terrible at estimating the likelihood of things happening when data is incomplete, to the ecstasy of casinos across the land.

Researchers have identified over a hundred of these cognitive biases to date. Some come into play only in groups, while others operate at the individual level. Some affect decision making, some affect judgement, some affect memory, some affect motivation and some affect focus.

Confirmation bias

You’ve probably heard of some of these biases.

Confirmation bias, for example, is famously known as the tendency to look for and prioritise information that confirms our pre-existing beliefs. Given a sea of information, we immediately pick out and remember the parts that strengthen our argument, and either ignore the rest or interpret it in a biased way. The evidence does not even have to actually support our position; we just interpret it that way. And if we do find information that contradicts our beliefs, we either discard it as invalid or become forensic scientists, picking the information apart to look for any flaws (preferably valid ones, but we’re not too fussy). It feels uncomfortable to consider things that run against our existing beliefs.

In other words, we tend to see and hear what we want to, or expect to.

A 2012 study by Dan Kahan et al. involved showing subjects a video of a political demonstration. Half of the subjects were told that the demonstrators were protesting abortion, and the other half were told that it was a protest against the military’s ‘don’t ask, don’t tell’ policy.

Despite everyone seeing the exact same footage, what subjects saw diverged wildly depending on whether they agreed or disagreed with the demonstrators. Subjects who agreed with the demonstrators’ views tended to find their behaviour perfectly acceptable, whereas those who disagreed saw uncouth barbarians screaming at bystanders and blocking access to buildings.

Not only do we see the world in a biased way, we tend to think that our views are reasoned and rational whereas those who disagree with us are woefully misguided. Our reasoning, and that of those who agree with us, is well thought out, whereas those who disagree with us are simply following the crowd, acting out of self-interest or perhaps falling prey to cognitive biases, poor unwise fools that they are.

If you believe that supernatural creatures exist, and set out on a quest for evidence, it is likely you will find it. Equally, if you want to prove that they don’t exist, it is likely you will find that evidence too. In fact, it is even possible that the evidence you find in both cases is the same, just that your interpretation is different.

Just like the optimistic guy who asks for a date and is told ‘Not till hell freezes over’ but interprets it to mean ‘Sure, when the weather is cooler’, people often hear what they want to hear.

This is one of the reasons why a group of people can be exposed to the same evidence but each come back with strengthened conviction in their wildly different positions.

This also explains why some people can be remarkably overconfident and steadfast in their beliefs no matter what they are told, and why that friend of yours just point-blank refuses to see reason when you explain that Android is superior to iOS.

Take this book, which seeks to cast a light on what it really is to be human, as an example. Some might find the facts discussed here uncomfortable at first glance; they question fundamental beliefs about who and what we are as human beings.

Many people will have strong feelings or opinions about these issues, for example whether they have free will. Confirmation bias will make it much more likely that you harbour extreme resistance to facts that run contrary to your pre-existing beliefs. We are all biased towards things that we want to be true, regardless of whether they actually are.

This is why facts can be surprisingly inadequate when you are trying to change someone’s mind.

Our OS has a built-in resistance to being proven wrong; we instinctively search for reasons that we are right. In fact, the smarter a person is, the greater their ability to creatively twist information and rationalise data so that it fits their opinions.

A straight-on assault on people’s beliefs or positions rarely yields good results and is more likely to provoke them into doubling down, no matter how factually right you may be.

That is why, instead of taking a head on approach to proving how someone is wrong, it is much better to start by proving how they are right. Figure out points that you can agree on and make that your starting point instead. Use your common ground to reframe the issue as one of cooperation rather than contention.

Confirmation bias is an even bigger problem in the world of social media. Left to their own devices, people already seek out information that reinforces their own beliefs while ignoring dissenting viewpoints. Now we are being aided by intelligent algorithms that continuously learn what our preferences are and feed us precisely what we want to see. Two people who live as neighbours may see entirely different newsfeeds and be fed entirely contrary viewpoints of the same issue. Physically they occupy the same space but in cyberspace, they can be worlds apart.

This makes it ever more likely that we are each trapped in our own information bubbles where we see limited viewpoints and information. Without the ability to peruse diverse views, our confirmation biases go into overdrive and we become more and more polarised as a society.

Confirmation bias makes it hard to make optimal decisions since we only pay attention to the evidence we want to see. Instead of making the best choice based on the available evidence, we often just find the best evidence to support the choice we want.

Unfortunately, even though you now know about confirmation bias, and fully intend to be fair and objective, the odds are still against you. The blind spot in your OS is strongly ingrained and works to prevent you from seeing the world as different from the way you assume it is. This makes it quite likely that no matter what you just read, you haven’t learned anything different – it only confirmed what you already knew. And that isn’t your fault.

This was shown way back in 1979 by Charles Lord et al. from Stanford University. Participants with strong pro- or anti-death-penalty views were presented with evidence from both sides of the argument.

As you might by now guess, they found that the nature of the evidence was secondary to the participants’ pre-existing views. Anti-death-penalty people became more anti-death-penalty and pro-death-penalty people became more pro-death-penalty, regardless of whether they were shown pro- or anti-death-penalty evidence. They also rated evidence that supported their beliefs as more methodologically rigorous, and evidence that didn’t as shoddy. Basically, anyone who doesn’t agree with me is an idiot, and those who do must be geniuses.

To counter confirmation bias, the researchers ran the experiment again, this time with clear instructions to the participants to consider themselves fair and impartial judges, and to weigh all the evidence as objectively as possible. Despite this, participants showed exactly the same biases as shown in the first experiment.

It seems that simply wanting to make unbiased decisions is not enough.

In an alternative run of the experiment, participants were asked to consider the evidence, but this time asking themselves whether they would have rated it in the same way had the study produced results favouring the opposite side of the issue. So if presented with pro-death-penalty evidence stating that the death penalty lowered murder rates, participants would need to consider whether they would rate the research the same way if the conclusion had instead been that it did not lower murder rates.

This simple technique overcame the bias. Participants no longer rated studies in accordance with their beliefs, nor did they become more extreme in their views regardless of which evidence they read.

By forcing ourselves to think from a different perspective, we break through our inherent confirmation bias and the automated limitations of our OS.

When considering the validity of information, it is good practice to always consider the opposite perspective. Would we feel the same way about the evidence or issue or person if their conclusion took the opposite position? Are we embracing or disregarding the information before us because it is truly good or bad, or simply because it clashes with our existing beliefs?

It is also good to make an effort to seek out individuals, groups and news sources that make us insecure about our views, to counter our natural tendency to stay within a bubble of people who already agree with us.

The Dunning-Kruger effect

The Dunning-Kruger effect is a famous, but often misunderstood, cognitive bias: the tendency for low-ability individuals to overestimate their ability, and for experts to underestimate theirs. This happens because our OS isn’t very good at assessing relative standards of performance.

In David Dunning’s own words: ‘If you're incompetent, you can't know you're incompetent ... The skills you need to produce a right answer are exactly the skills you need to recognize what a right answer is.’

In a set of studies testing this effect, participants were given various tests and then asked how well they thought they had done. Participants with the lowest test scores tended to overestimate their abilities the most, with the bottom 25% of scorers believing, on average, that they were in the 62nd percentile. Participants with the highest scores, on the other hand, tended to underestimate their abilities relative to other participants.

Incompetent people don’t know enough to know that they are incompetent and hence make generous errors in judgement about themselves. Believing themselves to be experts they become overconfident and oblivious to gaps in their expertise. Even worse, poor performers tend to be more resistant to feedback because they believe they already know everything.

Experts on the other hand go the other way, being much more likely to underestimate their abilities because they assume that tasks that are easy for them are also easy for others. While the incompetents make errors in judgement about themselves, the experts make errors in judgement about other people. Experts also tend to be more aware of the vast complexity of their field and hence are very aware of how much they actually don’t know, which makes them less confident.

We’re all vulnerable to this effect, and as standard with cognitive biases, we’re not usually aware of it.

This effect is increasingly a problem in modern society since we find ourselves surrounded by incredibly confident and assured people with remarkably little competence, whereas those with actual expertise tend to be filled with doubt and indecision.

This state of affairs is compounded by another cognitive bias, the attribute substitution bias, which allows us to make complex, difficult judgements quickly by substituting them with easier, but not necessarily accurate, judgements – in this case by substituting competence with confidence.

It is much harder to judge competence than it is to see confidence, and hence confidence can seem like a useful proxy. If he’s so confident, surely he must be competent?

Unfortunately, with the Dunning-Kruger effect in tow, people are remarkably good at being confident in matters they know nothing about, like a backseat driver who instructs you on the finer points of driving despite not having a license himself. (A 1986 study found that 80% of drivers rate themselves as having above-average driving skills, which is somewhat interesting, mathematically speaking.)

Similarly, people tend to make the assumption that experience equals competence, despite the very obvious evidence otherwise. If simply having done something for a long time meant you were good at it, then every 70-year-old on Earth would be a veritable genius at whatever it is they do, which clearly is not always the case. Doing something badly for 30 years just means that you’ve become really proficient at doing it badly. Despite this, society’s hiring and promotion system hinges on the evaluation of experience, since it is much easier to assess and verify than true competence.

Just because someone has a tennis racquet doesn’t make them Serena Williams, just because someone is old and experienced doesn’t necessarily make them an expert, and just because someone is confident doesn’t mean that they are competent.

Even great success in a given field does not automatically make a person an authority on all matters, and yet it is not uncommon for a businessman who has struck gold to suddenly transform into the wise man of the North, spouting his pearls of wisdom at anyone within earshot. Celebrities, politicians and businessmen regularly dispense their opinions on subjects ranging from relationship advice to cleaning hacks to health tips, despite them generally being no better equipped with expertise in these areas than the average person. Blinded by our biases, many of us lap up this wisdom without realising some of it can be wrong or even harmful.

Even more cruelly, social media makes it easy for random quotes to be misattributed to famous people, lending immediate credibility to nonsensical statements. As Mother Teresa once never said, ‘Don’t be no fool listening to dem lies, innit’.

The combination of Dunning-Kruger and the attribute substitution biases goes some way to explaining why the world is filled with rather a lot of people of mediocre ability in positions of power and responsibility, and rather a lot of experts who are not.

Perhaps Shakespeare said it best: ‘The fool doth think he is wise, but the wise man knows himself to be a fool.’

It is very important to know what we don’t know, as well as to recognise that most of us don’t know all that much about most domains in our lives. It is simply impossible to be an expert in everything, so why should we act like that is the case?

Without the intellectual humility to know what we don’t know, we cannot learn, because you cannot learn something if you think you already know it.

This can be difficult for us to accept because of our old friend confirmation bias; overcoming the Dunning-Kruger effect requires us to realise that we were previously mistaken and to think about how our conclusions might be misguided.

Some of you may reject the very idea that you could be overconfident in any area. ‘I’m not overconfident, I’m just a badass.’ Chances are that’s not quite true.

Learning is a good way to overcome the Dunning-Kruger effect. The more we know about a subject, and the more competent we become, the more able we are to know what we don’t know.

Dunning and Kruger tested this out by having participants take a logic test and estimate how well they did. Participants were then given a training session on logical reasoning and asked once again how they’d done on the previous test. Participants who scored in the bottom 25% consistently lowered their estimate of how well they thought they had done. More knowledge in the subject area had at the same time made them more competent as well as making them more aware of what they lacked in knowledge.

Maintaining an open mind is key here. Always have the intellectual humility to question yourself and think about how you might be misguided. Ask for feedback. Know that your OS is working very hard to convince you that you are right about things you may not be right about. If in doubt, and perhaps even more so if not in doubt, learn as much as you can about a subject, taking opposing views into account, before concluding you are right, especially when stakes are high.

You’re probably bad at probabilities

The gambler’s fallacy is an example of a bias that makes us extremely bad at estimating probabilities. We automatically try to infer future events based on past events, even in situations where the past has absolutely no bearing on the future. We also tend to believe that events will eventually even out in a fair world.

A typical example is coin tossing. We all know that assuming everything is fair and square, there is a fifty-fifty chance of a coin coming up heads or tails on any given toss. It doesn’t matter how many times you toss the coin, it still has a fifty-fifty chance of landing on heads.

But say you have just witnessed the coin landing on tails 10 times in a row. Given the chance to bet that the next toss will land on heads, many people would be very much inclined to think that their chances are very good.

In fact, it’s quite likely you are thinking that right now. 10 times in a row! Surely the odds of the next toss landing on tails again are astronomically small? You will likely feel this way even though you logically know in your head that the odds for the next toss are indeed still 50-50. The coin does not feel embarrassed that it has only been showing its tail and feel the need to suddenly switch sides. Each toss is an individual event that is completely independent of previous events. The past has absolutely nothing to do with the future in this case.

Unfortunately, your OS has trouble distinguishing between the truly unlikely event of getting 11 tails in a row, which is 1 in 2048, and the probability of the next toss coming up heads after the first ten events have already occurred, which is just 1 in 2.
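
Laid out side by side, the two probabilities your OS keeps confusing are simply:

P(11 tails in a row, before any tosses are made) = (1/2)^11 = 1/2048
P(tails on the 11th toss, given 10 tails have already landed) = 1/2

The first multiplies eleven independent fifty-fifty events together; the second concerns a single toss, to which the ten tosses before it contribute nothing.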

In other words, if someone bets you that they are going to toss 11 tails in a row, it’s a good bet to take. But if they simply bet you that the next toss will be either heads or tails, then you have even odds, irrespective of what happened previously, whether there were 11 tails in a row or 1000.

Despite what you have just read, some of you will still feel an overwhelming urge to bet your life savings that tails won’t come up 11 times in a row. It can't possibly happen.

This actually did happen at the Monte Carlo Casino in 1913, when a roulette table came up black 26 times in a row. Gamblers lost millions of francs betting against black, reasoning that surely the next spin couldn’t be black again. Incidentally, the odds of 26 blacks in a row are about 1 in 66.6 million, yet the odds of black (or red) occurring on the 27th spin were still just under 1 in 2.

It is also natural for us to assume that 26 blacks in a row is uniquely unlikely. But probability doesn’t care about colours. The odds of 1 in 66.6 million are exactly the same whether it is 26 blacks in a row, 26 reds in a row, 13 blacks followed by 13 reds, alternating blacks and reds or any other combination. There is nothing special about any of these sequences; the odds of any specific sequence of 26 reds and blacks will always be 1 in 66.6 million.
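
Treating red and black as equally likely and ignoring the green zero for simplicity (which is why quoted figures for these odds vary slightly), the same arithmetic applies at the roulette table:

P(any one specific sequence of 26 colours) = (1/2)^26, roughly 1 in 67 million
P(black on spin 27, given 26 blacks already) = roughly 1/2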

People make this mistake with all sorts of things, including childbirth where they assume that since their first four kids were boys, surely the next one must be a sweet little girl, possibly explaining why you see exhausted looking fathers carting around truckloads of boisterous boys.

Experiments have shown that (you guessed it) merely educating people about this bias is not sufficient in combating it. Despite being taught about probabilities and randomness, people are still innately inclined to believe in this fallacy.

A much easier way to combat the fallacy is to force yourself to completely bypass it. If you know an event to be independent of past events, then simply ignore everything about the past, and always treat each event as a new beginning. Forget everything that happened before, whether it’s the result of coin tosses, children or weather patterns. In this way, you are less likely to be swayed by your outdated OS.

Gut feelings

The halo effect is a bias where we take one positive or negative attribute of a person and let that influence our impressions of everything else about that person. Most commonly this happens with physical attractiveness, but it can also be triggered by other impressions such as a likeable personality.

Under the influence of the halo effect, we see a good-looking person, and automatically assume that they are successful, likeable, intelligent, kind and funny, even though attractiveness is not necessarily correlated with any of these characteristics. Conversely, we might see an unattractive, overweight person, and automatically assume that they are lazy, unsuccessful and greedy, equally without basis.

Celebrities and politicians tend to benefit from this effect greatly, gaining credibility that has little to do with their actual attributes.

This is why celebrities are so useful as product endorsers. There is no logical reason to believe that your favourite movie star is any better informed than anyone else about the benefits of Brand X foot creams, and yet we assume that they have access to secrets beyond mere mortals simply because they are famous and/or attractive. Our positive evaluations of the celebrity can even spread to our perceptions of the product itself.

Teachers tend to see well-behaved students as also being bright and diligent well before they have objectively established these traits, and vice versa. This can affect their expectations for the students and the way that they treat them, resulting in a self-fulfilling prophecy. (This is itself known as the Pygmalion effect, whereby the high expectations of others, whether conscious or unconscious, cause you to perform in a way that is consistent with those expectations. The opposite is also true, called the golem effect, whereby low expectations lead to a decrease in performance. Unwarranted expectations can therefore cause much more harm than just silent judgement.)

At work, managers may latch onto a single positive or negative trait of an employee and base their entire performance appraisal on that trait, irrespective of the employee’s actual ability to perform.

In interviews, attractive candidates are more likely to also be rated as intelligent, competent and qualified.

This even works in favour of criminals. One study showed that jurors are less likely to believe that attractive people are guilty of criminal behaviour.

In 2007, Charles Ballew II and Alexander Todorov published research showing that, based purely on snap impressions of facial appearance, people were able to predict the results of political elections with about 70% accuracy. Participants did not have any prior knowledge of the candidates, and were shown the candidates’ faces only for durations of between 100 milliseconds and 2 seconds. This shows how quickly we use faces to infer people’s personalities and capabilities.

Obviously, when the decisions we are making involve choosing our leaders, mistakes can be extremely costly, and considerably more consequential than choosing the wrong foot cream.

Of course, we don’t just judge faces for attractiveness; we also look at everything from people’s heights and weights to their curviness and waist-to-hip ratios.

In 2018, Ying Hu et al created 140 realistic body models and asked research participants to infer personality traits just by looking at the bodies alone.

Shapely bodies, including muscular men and pear-shaped women, were assumed to have both positive and negative traits, including being enthusiastic and confident as well as irritable and quarrelsome.

Rectangular bodies, with undefined waists and few curves, were assumed to be trustworthy and easy-going.

Heavier people were generally rated less positively, with assumed traits such as carelessness and incompetence.

Such unfounded judgements cause many problems in the modern world. Most people understand that there is no logical connection between body shape and such traits, and yet our OS hands us spontaneous stereotypes that colour our feelings and thoughts without us even realising it. We treat people in unfair ways based on entirely false assumptions.

Much like the other biases, simply being aware of it isn’t enough to overcome it. We need to make an active effort to question ourselves and our judgements.

If you catch yourself putting too much stock in someone’s words simply because they are good looking, famous, rich or old, or disregarding someone else’s opinion because they are not, try to take a step back and measure their words objectively.

Why do we feel this way about this person? What evidence are we basing this judgement on? Would we feel the same way if the person did not have those traits? Would we feel the same way about their opinions if someone else had said them?

Gut feel is often lauded as a good thing, but when your gut is operating on outdated rules, it is far better to rely on logic and reason. Going with your instincts and feelings can and will lead you astray. Listen to your gut, but evaluate what it has to say.

Don’t do what feels right, do what is right.

Hop on the bandwagon

The bandwagon effect is another bias that can cause grave problems.

We like to follow the crowd. The more people start to favour something, the more we do too. Our brains tend to shut down and enter groupthink mode, regardless of evidence or reason.

The effect is a by-product of our inbuilt need to conform and be accepted by other people. We want to be part of a group, and to be included rather than stand out alone. Being socially excluded is a risk that our OS very much wants to avoid and thus it tells us to sit down and not rock the boat.

Now, chances are, like most people you don’t particularly feel that you are a conformist. You have your own opinions and aren’t afraid to disagree when you believe you are right, right?

Yet research once again suggests that people are much more prone to biases than they believe or realise themselves to be.

Solomon Asch et al. conducted a series of experiments in the 1950s that showed how much our opinions can be influenced by others.

Groups of eight people were seated together. Unbeknownst to the actual subject, all of the other participants in the group were actors, instructed to perform the task in a certain way. The group was seated such that the real subject always responded last, after seeing how everyone else had responded.

The group was shown a line segment and asked to choose the line of matching length from another three segments of different lengths. One of the three lines was the same as the first, and the other two were clearly shorter or longer, with so little ambiguity that a 100% correct response rate could be expected. In other words, it was so easy that you would expect even a half-blind person to score full marks.

Each of the participants was then asked to say aloud which line they chose, taking turns according to their seats. The actors were given instructions on how to respond: while they would always answer unanimously, sometimes they would give the correct response, and other times they would all pick an incorrect one on purpose.

In such conditions, 75% of the test subjects were swayed by the actors to give at least one incorrect answer. There was considerable variation though, with only 5% of participants always being swayed by the crowd and 25% consistently standing their own ground. The majority lay somewhere in between.

When interviewed after the test, most subjects reported having experienced a distortion of judgement, concluding that they must be mistaken and that the majority must be right. Others knew that the rest were wrong, but conformed anyway for fear of facing ridicule. In some cases, subjects were so convinced by the power of the crowd that they were apparently not even aware that the actors were giving incorrect answers. They just assumed them to be correct.

Even those who consistently bucked the trend said that they had felt doubt.

Given these statistics, if you had been a subject in the experiment, it is not hard to imagine you might have wavered too.

Our innate need to conform is strong. As seen in these experiments, people were literally willing to ignore reality and give answers that were obviously incorrect.

These results have since been corroborated by a series of other experiments, all with similar conclusions.

The bandwagon effect grows stronger the more people are involved, although as few as four or five provide sufficient social proof to get people to play ball. You don’t even need to see or know the other people. Simply being told that other people prefer something makes it more likely that you will choose to do the same thing. For example, imagine you’re choosing between two flavours of ice cream. If I were to tell you that most people choose the second one, it’s likely you would feel more inclined to do the same. If it’s popular it must be good, right?

Conformity also increases when the task becomes more difficult, as uncertain people automatically look to others to lead the way.

Authority certainly plays a role as well, with conformity increasing when other members of the group are of a higher social status.

The bandwagon effect is responsible for fashion trends, music trends, groupthink in meetings, viral fads, memes, and you buying the latest greatest gadget that you actually have no use for simply because all your friends did. Ultimately, all these things boil down to people following people: how they dress, what they listen to, ideas they espouse, things to care about, things to be offended about and so on. If enough people or the ‘right’ people do it, you’ll want to do it too.

Social media is a prime example of the bandwagon effect at work. Every time someone likes, shares or even views a post, they make it more popular. Knowing that other people like the post, it becomes more attractive, to humans as well as to algorithms.

While these issues are fairly mild as far as cognitive bias problems go, the bandwagon effect can have some catastrophic effects.

The investment world is rife with examples of people who lost their pants and everything in them after following the bandwagon. Simply investing in something because other people are doing it is generally not a sound investment strategy. Even professional investors are not immune to this, piling on enthusiastically into the various dotcom, property, gold and cryptocurrency bubbles that have arisen over the years. Despite there being clear warning signs, people on the bandwagon can always justify their reasoning to themselves. After all, everybody can’t be wrong. Until they are.

The world of alternative medicine is another area where the bandwagon effect can be dangerous. It only takes a few acquaintances or a respected personality to make well intentioned suggestions that cause fatal harm. Popular underground remedies may turn out to harm more than they cure, if only because they are sought out instead of proper treatment. Likewise, medically sound therapies such as vaccinations can be maligned unduly, causing patients to reject appropriate treatment.

However, in terms of sheer reach, politics is perhaps where this bias can do the most damage.

Ever noticed how political candidates try as hard as possible to make it seem that they are leading in the polls? This is because some people follow the crowd and vote for whoever appears to be winning, just like how some people choose which football team to support based on who the current champion is.

This is also why political affiliations tend to be clustered geographically. You are influenced by people around you and are much more likely to vote alongside them than against.

Even more dangerous is the bandwagon effect’s role in stirring up a community to commit crimes, engage in war, or partake in genocide.

Riots are an example of this, where just a few initial troublemakers can seed full-blown chaos. Mob mentality is the bandwagon effect in full force, rousing primitive urges within otherwise normal people to ignore their better senses and loot, riot or worse.

The Holocaust was a masterclass in the dark side of the bandwagon effect, with terrible consequences.

Many Germans were not overly thrilled about Hitler’s rise. Many did not agree with the policies he espoused. But slowly, as he gained followers, the bandwagon effect came into play.

Impassioned speeches, parades, posters, banners, songs and other pop culture gave the impression that the entire nation had bought into Hitler’s ideologies, and were the better for it.

The more people that bought into his ideas (or at least appeared to for the sake of conformity), the more pressure there was for others to conform too. People who disagreed started thinking that maybe they were mistaken after all or that they had better conform for fear of social exclusion. They started joining the bandwagon and suspended their better judgement for the sake of conformity.

The further along it went, the more pressure there was to conform, and the worse the penalty for not conforming. While the bandwagon effect was of course only one of the many complex factors leading to the Holocaust, it certainly didn’t help.

Without such a strong innate need to conform, many movements of this nature would never be able to take off in the first place, since clear headed people would refuse to join at the outset, denying the movement critical mass.

It is impossible to suddenly have an entire nation of people galvanised overnight to commit genocide. We have seen repeatedly in history, whether it is the Holocaust, the Rwandan genocide, the Cambodian genocide or Darfur that it always starts out small and spreads, sometimes over years, sometimes rapidly, but there is almost always room for right minded people to stop it in its tracks early.

Should more people understand their cognitive weaknesses and resist the urge to conform unthinkingly, the bandwagon can be dismantled when such movements are still nascent.

There’s a sucker born every minute

Imagine that you have been summoned to meet the Great and Mighty Wizard of Om at his palace, where he will use his divine powers to reach into your soul and tell you who you really are.

Entering a cavernous hall, you join a crowd of eager people from all walks of life, all waiting to be assessed by the Mighty Om. You quietly wait, until finally it is your turn, and you enter the great one’s chambers.

Despite being indoors, there is a thick layer of mist around, and suitably impressive occult looking objects dot the room. The GMWO approaches you, shrouded in a mysterious cloak, and waves his hands around stylishly, sensing your aura and sniffing your essence. He frowns in concentration as he makes alternating low and high pitched grunts and shudders slightly whenever he receives particularly potent waves of psychic energy from you. After a few moments, an assistant indicates that he has successfully read your mind and you are ushered out.

Using his great powers, the GMWO writes a personalized personality profile for each of the attendees and delivers it to each of them in wonderful and mysterious ways. Yours happens to be delivered to you in this book, which the GMWO in his omnipotent wisdom knew that you would come to read.

It reads as follows:

When you came to visit me, you were uncertain and somewhat skeptical. This is to be expected as you are above average in intelligence and generally an alert person. You are an independent thinker and like to ascertain claims for yourself before blindly believing what someone tells you. At the same time, you are open minded, and accepting of new ideas when they make sense.
Despite the image you show to the outside world, you can be insecure inside. You do have a tendency to worry too much at times, but most of the time you keep it under control. You are often your worst critic, seriously doubting whether you have done the right thing or are making the right decisions. Sometimes, you feel quite down, but generally you try to be positive. You are a good person although earlier in life you have had to struggle with yourself to control your temper and unhealthy impulses.
Although you enjoy meeting people that you know well, social situations can sometimes cause anxiety for you. You often feel people get the wrong impression of you and don’t understand you. You have thus found that it is often better to be less forthcoming in revealing yourself to others. At the same time, you have a need for other people to like and admire you, the pursuit of which can be a source of further anxiety. You tend to fear rejection and you worry that this has caused you to shy away from experiences and opportunities in life. When you are comfortable you can be extroverted and sociable, while other times you can be guarded and wary. If somebody breaks your trust, you find it hard to let go of the deep sense of injustice that you feel.
Your parents did what they believed to be their best for you but often interfered too much. As a result of their influence, you have developed some personality traits that you wish you did not have.
You find it hard to save money. Security is one of your major goals in life.
You enjoy and appreciate art, painting, music and movies, but unfortunately will never be successful as a professional in these fields. Likewise, you feel you should be doing more athletically but find it hard to commit. You do have creative abilities that you have not been able to explore fully and that bothers you.
In general, you have a tremendous amount of untapped potential that you have not been able to make the most of. You have a lot of great ideas that you just never have the time or motivation to follow through on.

On a scale of 0 to 5, how accurate was this profile for you?

The Forer Effect is named after Professor Bertram Forer, who formally described this bias in 1948. It is more popularly known as the Barnum effect after P.T. Barnum, due to his famous phrase ‘There’s a sucker born every minute’ (which he quite possibly never said).

Dr Forer conducted the original experiment by providing students with individual personality surveys and telling students that he would analyze the surveys and give them back personal feedback in the form of a brief personality profile, similar to the profile prepared for you by the GMWO.

Students then evaluated the feedback quality on how accurate they found it, with the average score being 4.26 out of 5. The twist was that every single student received the same general personality profile, regardless of what they had filled in for their survey and yet most people still found it strangely accurate for themselves. This experiment has since been repeated many times in various forms, always with the same results.

The Forer Effect causes people to believe that generic descriptions apply specifically to themselves. Since everyone is the hero in their own stories, and people are much more alike than they think they are, it is quite natural that people identify with descriptions that aren’t actually tailored for them. This is especially true when it comes to positive statements, since people are generally more than happy to be complimented and believe positive statements about themselves.

This bias is used to great effect by astrologers, fortune tellers, psychics, personality quizzes and your daily horoscope, allowing them to trot out universal descriptions that many individuals find shockingly specific and accurate, often leading them to believe that something supernatural or spiritual is involved. Indeed, some psychics employ these techniques so convincingly and intuitively that they start believing in their own powers themselves. Under controlled conditions, however, it has been conclusively shown that these effects can be attributed completely to the Forer Effect.

In modern times, companies often use this effect as well, providing ‘personalized’ products and services to their customers such as newsfeeds, video and movie recommendations and music playlists that in fact can be quite generic apart from some obvious filtering. By employing these methods, companies make customers feel special, and that their unique tastes are being catered to, whether or not this is actually the case.

As always, simply being aware of cognitive bias effects does not necessarily protect us from their illusions. Keeping the effect in mind is a good starting point, though, when evaluating any such statements that you come across. Certainly, understanding and recognizing these techniques at play when dealing with fortune tellers or corporations can help reduce their sway over you. At the same time, it should be noted that the Forer Effect is just one of many techniques used by such professionals, though it generally gives them a solid base to start from.

There is, however, a very nice takeaway from the fact that this bias exists, which is that we really aren’t all that different after all. If oddly specific statements can be so generally applicable to the majority of people, it shows us how universal the human experience actually is. Generally speaking, we all have the same worries, the same anxieties, the same hang-ups, the same hopes and the same core humanity. We are all much more similar than we are different, and there is something quite wonderful about that.

Remembering what never happened

People’s memories are affected by cognitive biases too. Generally, our OS is designed to pick out a few items from an experience to represent the whole. We discard specifics to form generalities and even unintentionally edit memories after the fact.

These pieces of memory are then broken up and sent to different parts of the brain for storage. A single memory can involve millions of neurons distributed across the entire brain. When we eventually need them, memories are reconstructed from these fragments rather than retrieved as a whole, somewhat like reassembling your bedside table from pieces that you kept in storage rather than just bringing out the entire table. Naturally, these pieced-together reconstructions can be flawed. There are often a few screws missing, and occasionally you may find that you’ve included parts which have no business being in the memory/table at all.

These false or biased memories then go on to feed other cognitive biases in a cycle of increasingly flawed thought processes. If you can’t trust your memory, how can you trust conclusions that you make based on your memory?

Elizabeth Loftus of the University of California at Irvine is a specialist in false memories. This is an important area of study because falsely remembering events can have devastating effects, especially in areas such as criminal justice, where there have been hundreds of cases of innocent people being wrongfully convicted based on falsely remembered eyewitness testimony.

The problem is that we are incredibly suggestible, and since our memories are reconstructed from the fragments we have at hand, any suggestions presented to us, however innocently (or not), may result in a distortion of the facts.

Experiment after experiment has shown that people routinely create and recall false memories for themselves, and that false memories can also be purposely implanted by others. Just asking questions using different words can cause people to remember events differently.

For example, participants shown a simulated accident were asked how fast the cars were going when they hit each other. Other participants, shown the same accident, were asked how fast the cars were going when they smashed into each other. The ‘smashed’ group were not only much more likely to remember the cars as going faster than the ‘hit’ group, but were also more likely to report remembering broken glass at the scene of the accident (there wasn’t any). The scene reconstructed in their memories was more violent, simply because the word ‘smashed’ suggested it to be so.

People generally aren’t able to identify their own false memories. By definition, if you remember something, you remember it. The best clue you might have is if you feel uncertain about the memory (although confidence in the memory is no guarantee that the memory is correct, nor is the level of detail present). Generally speaking, if a memory is in question, the only way to be sure is to have it be corroborated by evidence or other people.

There are many other ways in which our memories can play tricks on us.

Duration neglect is an interesting example of a memory bias, where we don’t factor in how long an event takes when we consider it.

When it comes to memories of pain, for example, it seems that we don’t really care how long our agony lasted. The only things we retain are the worst moments and the final moments of pain.
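
Kahneman and his colleagues call this the peak-end rule. As a rough rule of thumb rather than an exact law, the remembered unpleasantness of an experience tracks something like:

remembered pain ≈ (pain at the worst moment + pain at the final moment) / 2

with the actual duration of the experience barely figuring into the memory at all.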

In an experiment done by Redelmeier, Katz and Kahneman, patients undergoing colonoscopies were randomly split into two sets. Half the patients were given a normal colonoscopy, whereas the other half had the same procedure, except that at the end of it the surgeons left the tip of the colonoscope in the patient’s rectum for an extra interval.

Since they weren’t doing anything other than leaving the scope in, the discomfort experienced in these final moments was considerably less than the rest of the procedure, despite drawing out the overall experience.

Now all things being equal, a reasonable person should logically want to minimise the duration of the procedure and their discomfort. You probably wouldn’t be too happy if you found out that your surgeon had left something stuck up your butt and gone off for a coffee.

And yet because of the duration neglect bias, what patients remember are simply the final moments of their experience rather than the total duration.

Compared to the standard group, patients who underwent the extended procedure reported the entire experience to be less unpleasant overall. This in turn made them more likely to return for repeat colonoscopies.

Similar experiments showed the same thing; by ensuring that the final moments were relatively pleasant, people’s memories of pain could be considerably lessened. It is not what actually happened that matters; it is what the brain perceives and remembers that matters.

Plenty of biases to go around

Even things that are so common to us as to seem a routine part of daily life are technically cognitive biases. We make these mistakes so often that we just accept them as normal. They include:

Procrastination, whereby we quite unreasonably decide to favour momentary present pleasure instead of greater benefits in the future.

Authority bias, whereby we automatically follow the lead of authority figures, even if they have no particular expertise in the area in question. Our mindless deference to authority extends even to those who don’t actually have any real authority, only the appearance of authority, for example someone wearing a uniform or even just someone who carries themselves with authority.

Stereotyping, whereby we automatically assume a category of person to have certain characteristics despite knowing nothing about them as individuals. This can have very dark consequences in the form of prejudice and discrimination towards certain groups of people. While stereotypes can have a grain of truth in them on average, it is completely illogical to then apply those average assumptions to individuals across the board, and even more ludicrous to act on such unfounded assumptions by behaving in prejudicial or discriminatory ways.

Status quo bias, whereby we prefer things to stay the way that they are. Even though the current situation may be far from ideal (or straight up bad), we tend to take that as a baseline, and any change from that is perceived emotionally as a loss. Whether it’s changing your health plan, your utility provider, your insurance coverage, your medical treatments or the colour of your car, there is a strong tendency for us to stick with the status quo, even if the alternative might prove more beneficial.

Survivorship bias, where we severely misjudge situations because we only focus on the successes, i.e. the survivors. Since history is written by (or about) the winners, people that did not survive (figuratively or literally) tend to lack visibility and we hear much more about the glorious successes.

This gives people overly optimistic beliefs, since they only see success after success. This also leads to the often false belief that the people involved in successes are exceptionally talented and better than those that did not succeed, completely negating the many uncontrollable factors that go into most successes and failures.

This is why after seeing countless news items about college dropouts going on to found multi-billion dollar companies, it appears a perfectly reasonable thing to want to go and do, despite the fact that in reality there are countless more failures than successes, and a tremendous amount of luck and timing are involved in being one of the rare successes. The same thing applies to aspiring actors or singers, who see lots of hugely successful megastars but none of the equally talented ones who never made it.

The scarcity effect, where something becomes more valuable if it has the appearance of being scarce or unattainable. People want more of the things they can have less of, regardless of the merit of the actual object or service in question.

The placebo effect, whereby simply believing that something has an effect causes it to have that effect. A common example of this is seen in medicine, when patients recover from an ailment because they believe they have received a fancy drug when in reality they have just taken a sugar pill.

The more impressive looking the placebo, the better the effect. Placebos in capsule form tend to work better than placebos in pill form, placebo injections beat out placebo capsules, and impressive looking placebo medical devices and machinery beat out everything else. Naturally, more expensive and nicely packaged placebos are more effective than cheaper ones, and more placebos are better than fewer.

Even though none of the placebos have any actual medicinal value, they can make people feel better. And even though all of the placebos are equally useless, somehow different forms of uselessness can help more or less. Unquestionably, the effect of placebos lies purely in our own perception of how effective the treatment is; the brain does the rest, releasing substances like opioids and dopamine and triggering the immune system, which really do make us feel better.

One strange phenomenon researchers have been noticing is that the placebo effect seems to be getting stronger over time. A study on clinical trials of antidepressants from the Journal of Affective Medicine found that the placebo effect was twice as strong in 2005 as it was in 1985.

Its evil twin, the nocebo effect, is the opposite. An example of this is when a patient is given a sugar pill, told the drug has terrible side effects and then promptly begins to experience those very symptoms. Just imagining that something is happening is enough to set up self-fulfilling expectations in the brain, regardless of reality.

This presents an ethical dilemma for doctors and nurses. If you provide full disclosure and inform patients about every potential risk and side effect they might encounter from a given treatment, patients may well go on to experience each and every one of those effects just because you told them so. On the other hand if you don’t tell them about these risks and effects, you’re violating informed consent laws and looking at the wrong end of a malpractice suit.

Always think twice

By now you get the idea – cognitive biases are plentiful, each of them presenting their own obstacles to clear thought.

While it seems to us that we see the world as it truly is, we really see the world as we wish to, or rather how our outdated OS wishes to.

This of course is a problem, because we rely on the ability to make rational judgements in every aspect of our lives. Our social institutions operate on the basis that their members are capable of making unbiased, rational decisions.

When we fail to do so (and we fail to do so every single day and twice on Sunday), bad things can happen.

The good news is that our irrationality is somewhat predictable, since our biases tend to follow simple rules.

The conclusions we reach via these biases are not by any means set in stone. It is not that we are incapable of reaching better conclusions; it is simply that these are the defaults our OS presents us with. Once we understand this, and know what the flawed rules of our OS are, we can take an extra moment to ensure that we aren’t being blindly led down the garden path by our outdated OS.

This does require consistent effort. The very failure to recognise your own cognitive biases is in fact a bias in itself, called the blind spot bias. While we can quite happily recognise biases in other people, we typically have a hard time identifying them in ourselves. So if you’ve been reading this, nodding your head and thinking what idiots everyone else is, you’ve just demonstrated blind spot bias. Welcome to the club.

Often, multiple biases will be at work together, resulting in a compounding effect that makes us completely blind to objectivity. You’ve probably met some people who have demonstrated this to great effect, and seem completely incapable of objective thought.

Recognising that these automated rules of thought exist, and understanding how they work is the first step towards freedom from them (or as close as it gets).

The bad news is that this is not enough. There is a body of evidence showing that since cognitive biases are ingrained in us and arise unconsciously, simply being aware of their existence does not necessarily make them easier to detect, let alone mitigate them. Active efforts must be made.

How many psychologists does it take to change a lightbulb?

Just one, but the lightbulb must want to change.

Various experiments have shown that training can effectively debias decision makers over the long term.

Try to make an active effort to notice these biases in yourself. Every time you make a snap judgement, take a pause to examine your gut reaction and ponder whether it is rational. Why do we feel this way about this person, object or situation? Is it true?

Even if you do think you have a good reason to feel that way, question whether your evidence is sound. Are you evaluating your evidence objectively? What if your evidence pointed the other way? Are there other plausible reasons for an event rather than our initial assumptions?

If you don’t have a good reason for thinking the way you do, or you recognise your reasoning as matching one or more cognitive biases, you will know to take a step back and rethink things.

Since we are by nature self-centred beings, an effective strategy when making decisions that can affect other people is to take the perspective of people who will experience the consequences of your decision. This works even if ‘other people’ is just yourself in the future. Participants in an experiment who were shown aged images of themselves upon retirement were more likely to choose to save money for the future than to receive it immediately.

Once you get into the practice of identifying cognitive biases, you will start seeing them everywhere, and begin to correct biases in your thinking and make more effective choices and evaluations. While completely overcoming these biases may ultimately be impossible, we can certainly benefit by lessening their frequency and impact.