A guy looking skeptically at the reader
(featured image: Johnny Worthington CC BY)

I don’t care what you think

How we think matters more than what we think and who we are

From the moment we’re born, we rely on the judgement of others to understand the world. Our parents, our siblings, our friends, our teachers, our colleagues, our boss, our political leaders, the media, social media, and so on — they all help us figure out how things hang together. There are two main reasons for this: others know facts we don’t know, and they may also be better than we are at transforming those facts into a judgement.

But not everyone’s judgement is equally reliable. Their own factual knowledge may be inaccurate, or their competence in interpreting the facts may be imperfect. We learn to understand this, and we learn to trust (and indeed to mistrust) some people more than others. But on what basis do we form those insights?

It is hard to directly establish the quality of someone’s judgement. Who they are often influences how we rate what they say.

If we know something about a person, we may use that information to evaluate how much we should trust their judgement. If their judgement has proved reliable in the past, then perhaps we can trust it here too. But the fact that someone’s advice on how to get rid of moss in our lawn turned out spot on (they are a keen gardener) does not mean their viewpoint on migration or climate change is reliable too. And the fact that someone is an eminent epidemiologist does not make their judgement on whether or not schools should be open during a raging pandemic particularly dependable: deep epidemiological insight is necessary, but not sufficient, to make that call.

A mechanic with crossed arms, looking into the camera
“Nope, I’m pretty sure you don’t have a brain tumour.” (image: Aaron Norcott via Unsplash)

We may also take other people’s identity into account when evaluating their trustworthiness. And that is not necessarily a bad thing to do: it is quite reasonable to consider a judgement that our persistent headaches are not caused by a brain tumour as more reliable if it comes from a neurologist than if it comes from our car mechanic, or from a random bloke in the pub. But a popular singer or a celebrated actor is not necessarily more qualified than we are to judge whether the prime minister or the president of our country is doing a good job, or whether privatizing the health service is a good thing.

If what matters about a person’s identity is whether or not they belong to your ingroup, such a shortcut is even more dubious. Unless they have specific competence on, say, the efficacy of face masks, the judgement of your close colleague, your cousin, your best friend, the chairman of the local chapter of the political party you belong to, or indeed the PM or president of your country is no more trustworthy than mine.

The credibility of someone’s judgement on a subject is not correlated with their competence in an unrelated area, nor with how handsome they look, how well they’re dressed, how they speak, or whether we feel any affinity with them. If we let our evaluation of their judgement be influenced by these things, we are under the spell of the halo effect or its opposite, the horn effect (where we dismiss someone’s judgement on climate change because they are not good-looking, have different political views, or are a terrible cook).

You will probably have heard of many of the cognitive biases that can cloud our (and others’) judgement — Wikipedia has a list of over 200 of them. But there is one that is not on the list, and that is arguably more pernicious than all the others: the myside bias — the tendency to see and evaluate things from a perspective that favours our existing beliefs and opinions.

The concept has been extensively studied by Keith Stanovich, a psychologist at the University of Toronto. He defines it as “a subclass of confirmation bias” (Wikipedia refers to it as an alternative name for confirmation bias). However, I am inclined to see it as a higher-level tendency, to which confirmation bias, selection bias, hindsight bias, motivated reasoning and so on all contribute.

Myside bias affects not only how we interpret other people’s judgement and behaviour, but also how we form our own judgement. We think someone’s judgement is more reliable if this person is someone we agree with, or indeed someone we simply like for whatever reason. We tend to rate what someone says based on who they are, irrespective of their qualifications (this is in effect a misplaced appeal to authority).

It doesn’t even have to be a person; an ideology can play that role too. In a classic set of studies from 2003, Geoffrey Cohen, a psychologist then at Yale University and now at Stanford, investigated how people’s party affiliation influenced their support for particular policies. He confirmed that liberal individuals prefer more generous welfare policies, and conservative individuals prefer more stringent ones. But if a generous policy was presented as being supported by 95% of House Republicans and only 10% of House Democrats, conservative participants would agree with it, even though it was a policy they would otherwise have opposed. The same was true for liberal participants if a stringent policy was presented as supported by a vast majority of Democrats and only a slim fraction of Republicans. The unsurprising title of Cohen’s paper? “Party over policy”.

Myside bias is not inherently detrimental: Bayesian reasoning requires us to start from our priors (our initial beliefs) and update them based on new evidence. As long as our priors are evidence-based, and not aspirations or things we want to be true, we are OK. But when our priors are unsupported beliefs or preferences, myside bias is very definitely a problem.
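To make that updating concrete, here is a minimal sketch of Bayes’ rule in Python (the function name, the scenario and the numbers are purely illustrative, not taken from Stanovich or anyone else):

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return the updated (posterior) probability of a belief after one piece of evidence."""
    # P(belief | evidence) = P(evidence | belief) * P(belief) / P(evidence)
    p_evidence = prior * p_evidence_if_true + (1 - prior) * p_evidence_if_false
    return prior * p_evidence_if_true / p_evidence

# Hypothetical numbers: we start 30% confident that a policy works (the prior),
# and a positive study result would be likely if it works (80%) but unlikely
# if it doesn't (20%). The study comes out positive.
posterior = bayes_update(prior=0.3, p_evidence_if_true=0.8, p_evidence_if_false=0.2)
print(round(posterior, 2))  # 0.63: the evidence, not our preference, moves the belief
```

The prior may come from anywhere; what moves it is the strength of the evidence, not how much we would like the conclusion to be true.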

Reasoning can be distorted by myside bias. If we are motivated by our convictions, and by what we want to be true, our conclusions are unreliable — and that applies to the people on whose judgement we depend, just as much as to ourselves.

Two dogs fighting over a piece of meat
Whose side are you on? It shouldn’t matter when you’re reasoning (image: Jerome Olivier CC BY)

This can be countered by treating reasoning like algebra. We can take the specifics out of how we analyse and process facts to come to a conclusion, and replace particular individuals, events, groups and consequences with parameters a, b and so on. What if it was someone who belongs to our ingroup who said x rather than an outgroup member? What if y happened as a result of an action we condemn, rather than one we approve of? It should make no difference.

Sound reasoning, like algebra, does not depend on the nature of the specifics. A straight line described by y = ax + b will intersect the X-axis where x = -b/a and the Y-axis where y = b, whatever the values of a and b. A policy is commendable or objectionable because of what it does, whoever supports it. We should be critical of the evidence we evaluate, whether or not it supports what we want to be true. The way a political leader behaves is respectable (or not), regardless of our political convictions.
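Spelled out, that claim is a one-step derivation (the only assumption being that a is not zero, so the line is not horizontal):

\[
y = ax + b:\qquad y = 0 \;\Rightarrow\; x = -\frac{b}{a}, \qquad x = 0 \;\Rightarrow\; y = b .
\]

The intercepts follow from the form of the equation, not from which particular numbers a and b happen to be; the same should hold when the placeholders stand for people, parties or policies.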

This doesn’t mean our preferences, our desires, our aspirations and our beliefs don’t matter. But it does mean that none of these should influence how we think.

And that is why I don’t care what you think. We may agree or disagree, and that is perfectly fine. In fact, it’s more than fine: if we disagree, my own thinking is challenged and I may be inspired to reconsider what I think and indeed learn something and change my mind… at least if I can see how you came to your view. Like on an exam, I need to see not just your solution, but your workings.

Because while I don’t care about what you think, I care very much about how you think.

Originally published at http://koenfucius.wordpress.com on October 2, 2020.

Thanks for reading this article — I hope you enjoyed it. Please do share it far and wide — there are handy Twitter and Facebook buttons nearby, and you can click here to share it via LinkedIn, or simply copy and paste this link. See all my other articles featuring observations of human behaviour (I publish one every Friday) here. Thank you!
