The trouble with unconscious bias

Fine-tuned balancing scales are a beautiful thing. We can find out with great accuracy which of two objects is heavier. We’d like our mind to have just such a device when we have to evaluate the merit of different individuals. Then we could make fair decisions about which grant proposals should be funded and to whom prizes should be awarded. But, alas, we have no objective scales to measure differences in merit. Instead we have clunky subjective scales that operate with distinct biases.

Image: Scales of justice (Nicholas Boudreau, Wikimedia Commons): https://commons.wikimedia.org/wiki/File:Scales_of_justice_by_Nicholas_Boudreau.jpg

I have written about unconscious bias before, for example here, linked to a short animation, and here. In a more lighthearted vein I also developed a cartoon dialogue with the BBC 100Women programme. I still stand behind what I said in 2015, but I also feel that reminders and refreshers are important. Gender bias is still grimly hanging on. I have been trying to keep track of the continuing empirical work on bias, and I am convinced that there is more work to be done. Hence the present post, which is paired with another post on my current ideas on quotas and the possibility of using a lottery element to achieve truly unbiased selection.

Are we really unable to select for excellence on meritocratic grounds – pure achievement, as opposed to wealth or position? We like to believe we can, and we have set up thousands of selection panels and prize juries on the basis of this belief. But doubts are creeping in. There seems to be a way for race and gender prejudice to tip the scales behind our backs. Prejudice seems ubiquitous, and it starts very early in childhood. We like to think it is merely an unwanted effect of our environment, something that can be reversed by training. In fact, it is located deep down in our mind/brain and not easily erased.

Why are our mental scales so prone to bias? Perhaps as far as our ancient social brain is concerned, accuracy isn’t everything. When we interact with other people there are more pressing concerns than discriminating subtle differences in achievement. No one knows how many bits of information have to be sifted to make quick decisions about other people, whether they belong to our group and whether they will cooperate or compete. All we know is that there are shortcuts that make it possible to constantly align with other social agents, if only to be able to fly in the flock.

Getting along with other people is vitally important to us. We automatically assess them with brute self-interest, mostly hidden from our conscious Self. This is why the mental scales are biased and why they let down selection panels. This makes us uneasy because our considered assessment of individuals is often completely at odds with our crude automatic assessment. The unconscious part of our social brain is trigger-happy in recognising in-groups and out-groups. Yet our feelings of belonging are fluid and we can belong to many different in-groups. It makes no sense to avoid strangers because their appearance evokes a threat response in our ancient amygdala, when in fact they might become useful allies. We have learned that we benefit more from cooperation than from conflict, and that oddballs can make outstanding discoveries. We have learned that men can be nurturing and that women can be competitive.

All this learning does not erase the ancient shortcuts that the social brain has acquired over eons of evolution. However, we can agree that the work of selection panels improves if we process information about candidates in a slow and deliberate fashion, using the conscious part of our brain. When we allocate grants or awards, we need to make accurate judgements about the merits of different individuals, and this means we need to becalm our highly inflammable social brain.

Here I want to mention just three shortcuts that guide information processing in the unconscious social mind/brain. First, there is the so-called ‘availability heuristic’. This results in a preference for the familiar, for example for a candidate who is similar to previous successful candidates. Second, there is a desire for affiliation, which means we implicitly favour those who are like us and/or those who belong to a more socially dominant group. Third, we all believe that we are less biased than other people and have better arguments. We also believe that we are less subject to conflicts of interest than others. For example, 61% of doctors thought pharmaceutical industry promotions did not affect their prescribing; only 16% believed this to be true for other doctors.

Knowing about biases does not neutralise them. They operate below consciousness, and it is very unlikely that we can delete our unconscious shortcuts. It might even be dangerous to do so, with unwanted side effects. But there is one way to ‘put them in their place’, in a manner of speaking. We can monitor and challenge each other, because we see bias more easily in others than in ourselves. This is a bit uncomfortable: we cannot see the beam in our own eye while we can see ‘the mote in our brother’s eye’. Of course we should not blame each other. It’s human to have unconscious biases.

What can we do to counteract bias? Intensive training programmes are not the answer (see Footnote). However, it is useful to be aware of unconscious biases, and selection panels need to be reminded of them as they make their difficult decisions. Slowing down the decision-making process allows the conscious part of our brain to reflect and to query the reasons for our rash intuitive judgements. This is best done in groups, where we can discuss different reasons. We can never be entirely unbiased, because this is how the brain works: strong prior beliefs shape our perception and experience. Once we admit that subjective factors play into our judgement, we can be more sceptical of our feelings. We can’t help that our feelings are subtly biased against minority candidates, precisely because there are so few of them. Being few means they fall outside the norm, always an awkward place to be.

Diversity helps us make better decisions (more on this in the next post). We now know that a diversity of viewpoints helps us to avoid getting stuck in a rut. By listening to others’ points of view we can counteract the fact that we tend to be more critical of others’ theories, and uncritical of our own.

Footnote: Unintended consequences of conscious efforts to counteract bias. Brown et al. (2011) argued that people feel licensed to act on bad motives if they feel they have the moral high ground. Affirming one’s egalitarian or pro-social values and virtues subsequently facilitates prejudiced or self-serving behaviour, an effect referred to as “moral credentialing.” In one study, people who had ‘credentialed themselves’ were more likely to cheat in a maths test, especially if they could easily rationalise this behaviour. In another study, Monin and Miller (2001) showed that people are more willing to express prejudice when their past behaviour has established their credentials as morally superior, non-prejudiced individuals. Here, people were first given the opportunity to disagree with a blatantly sexist statement. Later they were more willing to favour a man over a woman for a stereotypically male job. Other studies have confirmed this rebound effect.