Do you know someone who always makes the same mistakes and can’t explain why? Do you always feel confident when making a decision?
This may be because of biases.
Biases make it difficult for people to exchange accurate information or derive truths. They lead us to avoid information that may be unwelcome or uncomfortable, rather than investigating the information that could lead us to a more accurate outcome. Biases can also cause us to see patterns or connections between ideas that aren’t necessarily there. I discovered behavioural psychology while studying finance, and since it affects us all, I will try to describe some of the most common behavioural biases we are subject to every day without even realizing it.
The baseball player example
A step back: making a decision requires estimating an event (acquiring information, processing it, and issuing a probabilistic judgment). Individuals formulate these estimates based on heuristics: simple operational rules used to solve complex problems in an easy way.
Consider a baseball player who, when a ball is hit, is unable to perform all the complicated calculations needed to work out where the ball will fall. The scientifically correct method would require solving a system of differential equations in his head, but that does not leave him paralysed. He follows a simple heuristic rule: when the ball is up, he stares at it and begins to run, adjusting his speed so as to keep the angle of view of the ball constant. He ignores all the other information needed to compute the trajectory, such as the ball’s initial velocity, distance and angle, and focuses on a single cue. Faced with many situations in life, people normally behave just like the baseball player, because they lack the time, information and methods to carry out a scientifically correct calculation. Like the baseball player, people run for the ball while adjusting their speed to keep the angle of gaze constant, but they don’t always manage to catch it.
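The gaze heuristic can be sketched in a few lines of code. This simulation is purely illustrative (the speeds, the starting position and the rate-holding variant of the rule are my own assumptions, not part of the article): the fielder never solves the trajectory equations; at each instant he simply moves to the spot where the tangent of the gaze angle keeps rising at the constant rate measured on first sight, and that single cue brings him to the landing point.

```python
G = 9.81  # gravity, m/s^2

def ball(t, vx, vy):
    """Projectile position of the ball at time t (no air resistance)."""
    return vx * t, vy * t - 0.5 * G * t * t

def gaze_heuristic_catch(vx=12.0, vy=18.0, start_x=40.0, dt=0.01):
    """Fielder follows the gaze heuristic instead of computing the
    trajectory. All parameter values are illustrative assumptions."""
    # First observation fixes the rate of rise the fielder tries to hold.
    t = dt
    bx, by = ball(t, vx, vy)
    rate = (by / (start_x - bx)) / t   # tan(gaze angle) per second
    fielder_x = start_x
    while True:
        t += dt
        bx, by = ball(t, vx, vy)
        if by <= 0:                    # ball has landed
            return bx, fielder_x
        # Move to the point where tan(gaze angle) grows at the fixed rate;
        # no velocity, distance or angle of the ball is ever computed.
        fielder_x = bx + by / (rate * t)

landing, fielder = gaze_heuristic_catch()
print(f"ball lands at {landing:.2f} m, fielder ends at {fielder:.2f} m")
```

In the simulation the fielder ends up within a few centimetres of the landing point while only ever tracking one quantity, which is the whole appeal of a heuristic: cheap, approximate, and usually good enough.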
There are many heuristics; for simplicity, we will discuss the three main ones identified by Daniel Kahneman and Amos Tversky (1974):
• Representativeness: probability judgments tend to be based on stereotypes and familiar situations. Representativeness is a source of objective errors, since stereotypes are usually poorly informative.
• Availability: estimates are based on how easily instances come to mind. Highly unlikely events, such as winning the lottery, are often overestimated because someone always wins, making the event more available in our minds.
• Anchoring: which can be quantitative or qualitative, occurs when estimates are heavily influenced by a few pieces of information deemed “salient”. For example, if I asked you: how tall is an African elephant on average, 6 or 10 meters? What would your answer be?
The correct answer is 3.2 meters, but since I gave you higher reference numbers, you likely “anchored” to them even though they are far greater than reality. This is called quantitative anchoring.
Once it is clear that we use heuristics to make decisions, we need to consider how we react when we make mistakes. Observing our behaviour makes it clear how difficult it is to learn from the mistakes we make.
Confirmation bias is a mistake our mind makes whenever we receive data that either confirms or contradicts our beliefs. In the first case, the information is let in; in the second, it “goes in one ear and out the other”. We tend to focus our attention on information that supports our views and preconceptions, and to underestimate information that contradicts them.
Conservatism is strongly linked to confirmation bias: we tend to keep our judgments and initial assumptions unchanged.
Hindsight bias is often known as the “I knew it all along” effect: the way we remember our evaluation changes once we become aware of its outcome. The basic idea is that any feedback or exact information received after conducting the first analysis changes the knowledge base underlying the original judgment, causing an inclination towards the new information.
The endowment effect consists of a discrepancy between the value attributed to an asset when you already own it and the valuation you give the same asset when it is to be purchased. Experimental evidence shows that individuals tend to value an asset more when it is already part of their endowment, neglecting the opportunity cost (equal to the cost that would have to be incurred to buy it).
Cognitive dissonance is the mental conflict that individuals experience when faced with evidence that one of their beliefs is wrong. The simplest case in which dissonance occurs is after a decision has been made between multiple alternatives, for example the purchase of a car. This leads us to seek further confirmation and to minimize and distort the dissonant elements.
Self-serving bias is the assumption that good things happen to us when we have done all the right things, but bad things happen to us because of circumstances outside our control or the actions of other people. This bias results in a tendency to blame outside circumstances for bad situations rather than taking personal responsibility.
Status quo bias refers to the preference for keeping things in their current state, while regarding any type of change as a loss. This bias results in difficulty processing or accepting change.
In the end
The combined effect of the different biases, which number far more than those mentioned above, and the tendency to use decisional shortcuts, is the so-called overconfidence.
This is also captured by the Dunning–Kruger effect, a cognitive bias described by social psychologists David Dunning and Justin Kruger, according to which people with low ability at a task overestimate their ability.
How to survive overconfidence and biases?
Don’t worry… while you are reading this article you are taking a step forward. Identifying the biases you experience and perpetuate in your everyday interactions is the first step to understanding how our mental processes work, which can help us make better and more informed decisions. Critical thinking is the enemy of bias, so challenge your own beliefs and try a blind approach!
This blog contribution was made by Eleonora Papini.
Eleonora is originally from Italy and is passionate about human psychology, sustainable development and international cooperation. She works as a Project Implementation Officer on a European project about urban sustainable development solutions aimed at promoting young and female entrepreneurship.
In 2021 she also became a Data Analyst for the LMF Network and a content creator for its blog.
What is LMFnetwork?
The LMFnetwork is a global social enterprise (not for profit) focused on empowering, enabling & educating womxn and marginalised groups into tech, entrepreneurship & digital. We specialise in designing and delivering accessible programmes and supporting a global community. We’ve gone from a brunch club to a social good brand based on what the community wanted. We are a real community run by real people.
How to Identify Cognitive Bias: 12 Examples of Cognitive Bias. Written by MasterClass. https://www.masterclass.com/articles/how-to-identify-cognitive-bias#12-examples-of-cognitive-bias
Heuristics. Written by Psychology Today. https://www.psychologytoday.com/us/basics/heuristics
Cognitive Biases. Written by The Decision Lab. https://thedecisionlab.com/biases/