Thinking is hard work. The average adult brain consumes roughly 100 times the energy a typical smartphone uses each day.
So it should come as no surprise that, when faced with taxing decisions, we often unknowingly default to the brain's version of low-power mode: heuristics.
Heuristics are essentially mental shortcuts that allow us to use things such as past experience to process large volumes of information and make decisions quickly. We're usually using heuristics when we make decisions based on 'intuition'. For example, if we see a menacing figure in a hoody on the street at night, we may cross the road.
But low-power mode is a double-edged sword. Heuristics are generally helpful in saving us time and effort. But they also leave the door open to cognitive biases - mental shortcuts that actually lead us down the wrong path.
Being aware of what prompts these biases can not only help you avoid making bad decisions, but also help you understand how other people think, so you can negotiate outcomes where everyone feels like a winner. Here are a few common triggers of cognitive biases.
Say you get into a lift and all the people inside are facing the rear. Do you do the same, or face the doors, as usual? Overwhelmingly, people follow the crowd. This scenario is actually the basis of a famous Candid Camera episode that demonstrated the bandwagon effect in action. In short, we all have a powerful urge to do what others are doing, even if we don't know why they are doing it.
The bandwagon effect is frequently leveraged in business, from customer testimonials to 'best-seller' labels.
Asking someone to do you a favour makes them like you more. Yes, you read that correctly - they do you the favour.
This counter-intuitive little quirk is named after American founding father Benjamin Franklin, who popularised the power move back in the 1700s. Franklin revealed he defused tensions with a rival politician by asking to borrow a particularly rare book from the man's personal library. After reading the book, Franklin returned it with an effusive note of thanks and said the man, who had been an adversary, "ever after manifested a readiness to serve me on all occasions".
The effect was replicated by psychologists Jon Jecker and David Landy in a 1969 experiment where people were paid to participate in a spurious study. Half were then asked by a researcher to return the cash because the university was short on funds. Those asked to give back the money rated the researcher more likeable than those who had been allowed to keep it.
Experts speculate this works through a kind of self-justification: the brain understands we do favours for people we like, so it concludes we must like the people we have helped.
Further studies suggest that to have maximum effect, the favour should be social not transactional (i.e. asking for advice rather than money), and it should occur soon after meeting someone.
According to the theory of the Backfire Effect, presenting someone with facts that contradict a deeply held belief will not convince them they are wrong. Instead, as the name suggests, it can actually entrench that belief more deeply.
One of the more famous studies found voters presented with negative information about a favoured candidate will often not only dismiss this information, but become even more supportive of the candidate as a result.
Understanding this cognitive bias is crucial for handling business negotiations, where people often arrive with preconceived ideas and opinions. Humans are hardwired to dig in their heels.
Rather than correcting people outright, gently prompt them with open-ended but targeted questions. This allows them to identify potential issues themselves, which makes them more open to changing their mind.
Choosing between two options can be hard. Strangely, introducing a third can make it easier. That's the decoy effect in action.
The decoy - a tactic deployed by everyone from magicians to sportspeople - is essentially an object of distraction that allows the real goal to be achieved more easily. Decoys can be effective in sales and negotiations, but use them with care: if the decoy is too obvious, you risk losing trust.
In sales, the decoy is pretty straightforward. Consumers are often faced with a choice between two options: one at a high price packed with features and one at a lower price with fewer bells and whistles. It can be difficult to weigh up comparative value. However, when a third option is introduced - perhaps higher priced again, but with no extra features - it makes the original high-priced option seem like better value.
In negotiations, the decoy can be used slightly differently, inflating the importance of a minor issue to leverage a better deal.
People are hugely influenced by numerical reference points even if these numbers are completely irrelevant. Startling findings from several studies indicate that if people see a random high number before making a decision, they will skew to the high-end of possibilities.
In one experiment, judges were asked to roll a die before reviewing case notes and assigning jail sentences they felt fit the crimes described. Some judges rolled a die that always came up on three, while another group were given one that always rolled six. The group that rolled sixes consistently suggested sentences that were higher than the group who rolled threes.
You see it in action more explicitly in negotiations with low-ball offers that aim to anchor a payment to the lower end. Researchers conclude it can be incredibly difficult to guard against the subliminal influence of anchoring numbers.