Can we really be free of biases and prejudices? The truth about what steers our perceptions and ideas.
Psychological research linking knowledge of human cognition to topics such as organisations, leadership, and people processes is on the rise. Among other things, this has revealed a series of implications for decision-making in an organisational context. As questions such as “How can we ensure diversity? How is fairness in the recruitment process guaranteed? And how do we escape our own preconceptions in decision-making?” become more salient, the call for objectivity and neutrality in the decision-making process grows equally louder.
At Hudson Nordic, we believe that to create lasting change, you need to understand the fundamentals. If we are to talk about objectivity, evidence-based strategies, and rational decision making, we need first to establish why this is even an issue. Why is objectivity so hard to crack?
The answer lies deep within the realm of human cognition, which is why we are starting this journey towards objectivity at a fundamental level: the human perceptive system.
Perceiving the World
Every single second, we humans are bombarded with millions of sensory stimuli from the world around us. The most striking stimuli are the visual and auditory ones (things we see or hear), but sensory stimuli can also be tactile, olfactory, and gustatory (things we touch, smell, or taste).
However, through millions of years, our brains did not evolve to become master perceivers of everything in the external environment. Rather, our brains evolved in alignment with one criterion: to keep us alive. One of the ways this is done is by constantly filtering out information. If this sounds counterproductive, consider what would happen if our brains were forced to take in all available stimuli: there would be no capacity left to engage with the surroundings and, ultimately, survive. Instead, our brains take in the most urgent information and discard the rest, i.e. whatever is deemed irrelevant to the organism’s immediate survival.
In cognitive psychology, this filtering mechanism is referred to as perceptive selection. Perceptive selection is a useful mechanism to have, but it also comes with challenges. In 1999, this was amusingly demonstrated by the psychologists Simons and Chabris (whose experiment later earned them an Ig Nobel Prize), and we recommend checking it out. Just follow this link: https://www.youtube.com/watch?v=vJG698U2Mvo
The experiment shows that when we are asked to solve a specific task, we tend to zoom in on the task and disregard most of what is going on around us. The conclusion is clear: often, we simply do not notice what is right in front of us.
As we have seen, our brains are constantly trying to simplify the world by filtering out incoming information. Based on this, it should be clear that we are inherently limited in our ability to see the world “objectively”. If something can be right in front of us without our seeing it (as in Simons and Chabris’ study), how many other things are we failing to notice? And how might this selective attention make us similarly vulnerable to overlooking crucial information when scanning organisational structures and making decisions?
The brain does not stop at filtering out information and leaving us blind to seemingly obvious elements in our visual field. It also tampers with the information that actually makes it through the perceptive selection process.
Our brain can be compared to a laptop that, when a file is too big or would take too long to upload, lets us know and asks whether we would like to compress it instead. This is a good analogy for what goes on inside our brains as they tackle the sensory overload of the outside world. While the brain has an almost unlimited amount of storage space (the long-term memory), the pathway to it is constrained by the limits of our cognitive capacity.
When the brain is met with information overload, however, it does not ask whether the information should be compressed; being a brain and not a computer, the cognitive system makes an executive decision and compresses the incoming information by default.
The information is not compressed at random, though. It is made to fit our pre-existing ideas about the world, retrieved from previous experiences. These ideas, called schemata, lay the foundation for how we interpret new objects, events, and other people. Put a bit more radically: most of the information reaching our cognition is made to fit inside a box that already has a name and an emotion attached to it; it is automatically shaped in accordance with our own personal set of schemata.
Schemata are, in themselves, neither good nor bad. In fact, they are there to serve us. They guide us through life by making the world seem simpler than it is. Among other things, schemata are the reason that we can assign objects we have never encountered before to broader categories; the reason that we know how to act when shopping somewhere for the first time; and the reason that we know what constitutes appropriate behaviour when meeting new people. By allowing us to draw inferences about novel situations from past experiences and interactions, schemata make the world seem familiar and predictable.
However, like perceptive selection, schemata come at a price: they facilitate the occurrence of so-called cognitive biases. Cognitive biases are systematic, often unconscious distortions in our notions about the world that arise when schemata are relied upon too heavily. In other words, incoming information is not perceived in its pure form; rather, it is interpreted through the lens of an established set of beliefs.
In recruitment situations, for example, the halo bias is very common; it describes the tendency to let one positive characteristic overshadow all other characteristics in the evaluation of a person. Another common bias in this context is the confirmation bias: the tendency to seek out, and only trust, information that confirms our existing beliefs. These types of biases are explained in more detail in upcoming articles in this series.
For now, it should be enough to note that making decisions based on biased ideas about someone, instead of objectively assessing all available facts, is undesirable. First and foremost for the individual who is subject to the bias-based decisions, but also for the organisation, which might miss out on crucial talent or new market opportunities due to unchecked decision-making.
Leadership and Objectivity in Decision-making
Here at Hudson Nordic, we know that merely being aware of our cognitive shortcomings will not make us immune to them. However, we also know that with awareness comes responsibility: the responsibility to act in accordance with our best beliefs and to apply the most relevant solutions to organisational challenges. Thus, we argue that for anyone striving to be fair and objective in their decision-making, recognising our own cognitive limitations is a crucial first step.
It may be immediately satisfying to have our surroundings affirm our existing beliefs, and because we construct our own reality, this is easily achieved. But it is not the way to challenge the status quo. Quite the opposite: how can we hope to adapt our core competencies to an increasingly unpredictable market, attract the right talent before our competitors do, or identify high potentials if we simply ride along on a wave of automatic thought patterns? How can we hope to be ahead of the corporate game tomorrow if we do not seriously consider the intrinsic limitations of our own judgement? If we want to be responsible leaders, we should be ready to accept that much will remain invisible to us, or will be adjusted to fit our existing beliefs, and we should act accordingly.
Cognitive Psychology in Organisations is an article series published exclusively on the Hudson Inspirations platform. Keep checking in for the next parts of the series, where different types of cognitive biases will be explained and linked to the concept of heuristics in decision-making.
Further reading on this topic
The Invisible Gorilla: And Other Ways Our Intuitions Deceive Us. Chabris & Simons (2010)
Thinking, Fast and Slow. Daniel Kahneman (2011)
Never Go with Your Gut: How Pioneering Leaders Make the Best Decisions and Avoid Business Disasters (Avoid Terrible Advice, Cognitive Biases, and Poor Decisions). Gleb Tsipursky (2019)