Last month I wrote about how not to create a safe space which explored the challenges and opportunities of inclusivity and diversity. This got me wondering, why are humans so susceptible — personally and socially — to prejudice, superstition and irrational beliefs about the world, ourselves, different genders, cultures and just about everything else?
The simple and blunt answer is that we are experts at perceptual distortion, inaccurate judgment and illogical interpretation, thanks to the limitations and evolutionary quirks of our lovely brain architecture. We are subjective, not objective, creatures who get through the day by relying on unconscious mental processes and auto-pilot patterns of thinking.
Hold on a second — I like to think that I’m pretty aware of how and why I decide and behave the way I do! But if you ask a cognitive psychologist — they would say that we are actually unaware of the vast majority of events going on inside our brains that influence our decisions and tendencies. We all have biases operating that are subliminal, surreptitious, and — just to make matters more complicated — quite often in conflict with what our conscious rhetoric, values and beliefs are.
This predicament makes me think of a monologue by “the Writer” character in Andrei Tarkovsky’s famous 1979 Soviet sci-fi film Stalker. He philosophises, “How do I know that I don’t actually want what I want? Or that I actually want what I don’t want? My conscience wants vegetarianism to win over the world. And my subconscious is yearning for a juicy piece of meat. But what do I want?”
This is how someone who would never consciously intend to act in a racist or sexist way may still have race and gender biases affect their behaviour, below their conscious awareness. Unconscious social biases “sneak in through the backdoor of our conscience, our good-personhood, and our highest rational convictions, and lodge themselves between us and the world, between our imperfect humanity and our aspirational selves, between who we believe we are and how we behave” writes Maria Popova.
NPR science correspondent Shankar Vedantam, host of the Hidden Brain podcast, explains that these unconscious biases have always caused problems, but “multiple factors have made them especially dangerous today. Globalization and technology, and the intersecting fault lines of religious extremism, economic upheaval, demographic change, and mass migration have amplified the effects of hidden biases. Our mental errors once affected only ourselves and those in our vicinity. Today, they affect people in distant lands and generations yet unborn.”
So, as Shankar Vedantam writes in his book The Hidden Brain, if we all have biases operating surreptitiously “beneath the rim of awareness resulting in everything from financial errors based on misjudging risk to voter manipulation to protracted conflicts between people, nations, and groups” is there anything we can do about them?
The good news is that there is a wealth of ideas and techniques currently being tested, researched and applied to mitigate cognitive bias, unconscious bias and false beliefs.
A famous technique used to untangle assumptions and get a more accurate picture of cause and effect is the 5 Whys, which Toyota pioneered back in the 1970s. It has its limitations, of course, but it is still very useful.
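As a rough illustration (the scenario, function name and cause chain below are my own invention, not Toyota’s formal process), the technique amounts to walking a chain of cause-and-effect questions until you hit a root cause or run out of answers:

```python
# A minimal sketch of the "5 Whys" root-cause technique.
# The example data is invented purely for illustration.

def five_whys(problem, cause_of, depth=5):
    """Follow a chain of 'why?' answers up to `depth` times.

    `cause_of` maps each statement to its immediate cause.
    Returns the chain from the surface problem down to the
    deepest cause found.
    """
    chain = [problem]
    current = problem
    for _ in range(depth):
        cause = cause_of.get(current)
        if cause is None:  # no deeper cause recorded: stop early
            break
        chain.append(cause)
        current = cause
    return chain

causes = {
    "The delivery was late": "The van broke down",
    "The van broke down": "The engine overheated",
    "The engine overheated": "The coolant was never topped up",
    "The coolant was never topped up": "There is no maintenance schedule",
}

for step in five_whys("The delivery was late", causes):
    print(step)
```

The point of the exercise is the last line of output: the root cause (“no maintenance schedule”) suggests a very different fix than the surface problem (“the delivery was late”) would.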
Another popular business approach to improving inclusivity and enabling diversity in the workplace is structural and social debiasing. Earlier this month I had the pleasure of working with Dr. Jennifer Whelan from Psynapse Psychometrics — a consultancy that does unconscious cognition assessment, advisory and training. I enjoy their monthly newsletter, with useful information bites on things like disrupting habits and the cognitive stumbling blocks we are all prone to.
Daniel Kahneman, the 2002 Nobel Prize winning economic scientist and author of Thinking, Fast and Slow, believes that one way to combat cognitive bias and our irrational propensity to jump to faulty conclusions is to slow down and use logic and reasoning to think our way to better conclusions. He also suggests that sometimes we need to take decision making out of our hands entirely, and instead use metrics, statistics and analysis.
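To make the second suggestion concrete, here is a toy sketch (the traits, weights and ratings are invented for this example, not taken from Kahneman’s book) of replacing gut-feel comparison of job candidates with a fixed, pre-agreed scoring formula:

```python
# A toy illustration of taking a decision "out of our hands":
# rate each candidate on pre-agreed traits, then let a fixed
# formula combine the ratings, instead of an overall gut call.
# All traits, weights and numbers are invented for the example.

def score_candidate(ratings, weights):
    """Weighted sum of trait ratings (each rated 0-10)."""
    return sum(ratings[trait] * weight for trait, weight in weights.items())

# Weights are agreed on *before* seeing any candidates.
weights = {"technical": 0.4, "communication": 0.3, "reliability": 0.3}

alice = score_candidate(
    {"technical": 8, "communication": 6, "reliability": 9}, weights
)
bob = score_candidate(
    {"technical": 6, "communication": 9, "reliability": 7}, weights
)

print(f"Alice: {alice:.1f}, Bob: {bob:.1f}")
```

The design point is that the weights are fixed in advance, so a charming interview or an irrelevant first impression cannot quietly reshuffle the criteria after the fact.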
Self-awareness is another essential approach. Despite the hardwired shortcuts in our brains, we do have the ability to become aware of cognitive biases, which is the first step if we are to learn to fix them. Once you learn about the cognitive biases that exist, it is easier to see them in yourself and others. But a word of caution: even when we recognize the impact that bias has, most of us still think we are less biased than other people, a self-deception called the bias blind spot.
Getting external feedback from people outside of your social group is another useful technique. Despite the fact that I really don’t like seeing people’s emotional labour used to sell beer, the socially aware Heineken commercial “Worlds Apart” is actually a good example of how mature, responsible one-on-one dialogue can help to shift unconscious and cultural bias.
That said, it is really important to remember that challenging all these things can be very confronting. For example, when teaching people about unconscious bias, or talking about diversity efforts more broadly, it is very common for high-status or majority-group members to become defensive.
Similarly, when you are disentangling false beliefs, cognitive dissonance is a common experience (a term for the mental stress experienced when you simultaneously hold two conflicting ideas). Our brains jump to find ways we can reduce the discomfort, which explains why — if you believe something and are presented with evidence to the contrary, often you will cling more tightly to your original belief.
Sometimes when we encounter information that suggests something we perceive or believe is actually incorrect — it’s not a big deal to change our minds based on the evidence. But other times our distorted beliefs can seem impossible to shift.
Julie Beck explains this phenomenon in her piece on why facts alone can’t fight false beliefs: “there are facts, and there are beliefs, and there are things you want so badly to believe that they become as facts to you… If the thing you might be wrong about is a belief that’s deeply tied to your identity or worldview — the guru you’ve dedicated your life to is accused of some terrible things, the cigarettes you’re addicted to can kill you — well, then people become logical Simone Bileses, doing all the mental gymnastics it takes to remain convinced that they’re right.”
“This doubling down in the face of conflicting evidence is a way of reducing the discomfort of dissonance, and is part of a set of behaviors known as “motivated reasoning.” Motivated reasoning is how people convince themselves or remain convinced of what they want to believe — they seek out agreeable information and learn it more easily; and they avoid, ignore, devalue, forget, or argue against information that contradicts their beliefs.”
It can be a slow process shifting our biases and beliefs, but staying humble is a useful approach. None of us are immune. Shankar Vedantam says it well, “good people are not those who lack flaws, the brave are not those who feel no fear, and the generous are not those who never feel selfish. Extraordinary people are not extraordinary because they are invulnerable to unconscious biases. They are extraordinary because they choose to do something about it.”