Human Nature #5: Consistency

Peter Bevelin, in his book Seeking Wisdom, argues:

Once we’ve made a commitment – a promise, a choice, taken a stand, invested time, money or effort – we want to remain consistent. We want to feel that we’ve made the right decision. And the more we have invested in our behavior the harder it is to change.

For most people, there is something inherently uncomfortable about being wrong. We don’t want to look weak or lose face, so we cling to any information that supports our position. When we can’t find any, we rationalize our mistakes away. But being proven wrong doesn’t have to be uncomfortable. It just means we are a little wiser today than we were yesterday. Mistakes are something to be learned from, not ignored. Keynes put it best when he said:

When somebody persuades me that I am wrong, I change my mind. What do you do?

This post will explore our bias towards consistency and its many forms.

Confirmation Bias

Warren Buffett once said:

What the human being is best at doing is interpreting all new information so that their prior conclusions remain intact.

The term “confirmation bias” was coined by English psychologist Peter Wason. Wason ran an experiment in which he asked participants to identify a rule that applied to lists of three numbers. They were given a starting list, (2, 4, 6), and then asked to come up with their own lists. If a new list conformed to the rule, they were told so. The rule was simply “any ascending sequence,” but the participants came up with far more complicated rules. To test those rules, they would offer only lists that confirmed their hypotheses; they never offered a list that could disprove one. Wason interpreted this result as showing a preference for confirmation over falsification.
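The asymmetry between confirming and falsifying tests is easy to make concrete. Here is a minimal sketch of the task in Python; the true rule is the one from the experiment, while the participant’s “increase by 2” hypothesis is an invented example of the kind of overly specific guess Wason observed:

```python
# A minimal sketch of Wason's task. true_rule is the experimenter's actual
# rule; participant_hypothesis is an invented example of an overly
# specific guess, not one recorded from the study.

def true_rule(triple):
    """The experimenter's rule: any strictly ascending sequence."""
    a, b, c = triple
    return a < b < c

def participant_hypothesis(triple):
    """A hypothetical overly specific guess: numbers that increase by 2."""
    a, b, c = triple
    return b - a == 2 and c - b == 2

# Confirming tests: lists chosen to fit the hypothesis. Both rules answer
# "yes", so no number of these tests can tell the two rules apart.
for triple in [(2, 4, 6), (10, 12, 14), (1, 3, 5)]:
    print(triple, true_rule(triple), participant_hypothesis(triple))
    # -> True True for every confirming test

# A falsifying test: fits the true rule but breaks the hypothesis.
# This is the kind of probe participants almost never tried.
print((1, 2, 50), true_rule((1, 2, 50)), participant_hypothesis((1, 2, 50)))
# -> True False: one disconfirming probe exposes the overly narrow rule
```

Every confirming test returns “yes” under both rules, so no amount of confirmation can reveal the mistake; a single falsifying probe does.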

Today, confirmation bias refers to our tendency to favor information that confirms our preexisting beliefs. For example, a Republican might seek out information only from right-leaning sources, and a Democrat only from left-leaning ones. Both sides tend to put more weight on information that supports what they already believe.

Confirmation bias can lead to increased polarization, a preference for early information, and the persistence of long-discredited beliefs. We would do better by being self-critical and willing to change our minds as the evidence warrants. When we find information that contradicts our existing beliefs, we should look at it with unbiased eyes.

Sunk cost fallacy

Why do we tend to hang on to an unhappy relationship, a losing investment, an unjustifiable war, or a failed strategy? Peter Bevelin puts it this way:

The more time, money, effort, or pain we invest, the more we feel the need to continue, and the more highly we value something – whether or not it is right.

Today, this concept is known as the sunk cost fallacy. A sunk cost is a cost that has already been incurred and cannot be recovered. For example, say we bought a $100 concert ticket but no longer want to go; we feel our time would be better spent doing something else. Still, we remember the $100 and decide it would be a waste not to go. Or imagine we did go, but the concert turned out to be horrible. We sit through the whole thing anyway, since we paid good money to be there.
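To see why this is a fallacy, it helps to write the two framings down. Below is a toy version of the concert decision in Python; the enjoyment values are invented placeholders, chosen only to show how counting the sunk cost distorts the comparison:

```python
# A toy framing of the concert decision. The enjoyment values below are
# invented placeholders; only the structure of the comparison matters.

ticket_price = 100       # already paid, non-refundable: a sunk cost
enjoy_concert = 20       # value of sitting through a show we dread
enjoy_alternative = 60   # value of spending the evening another way

# Fallacious framing: skipping "wastes" the ticket, so the $100 gets
# charged against the alternative and tips the choice toward going.
fallacy_says_go = enjoy_concert > enjoy_alternative - ticket_price

# Correct framing: the $100 is gone in either branch, so it cancels out
# and the comparison is simply between the two evenings.
rational_says_go = enjoy_concert > enjoy_alternative

print(fallacy_says_go, rational_says_go)  # True False
```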

In both cases, the money is already lost forever. The real choice is between doing something you want to do and doing something you don’t want to do. One of the most powerful examples of sunk cost thinking was the Vietnam War. Psychologist Allan Teger puts it this way:

The longer the war continued, the more difficult it was to justify the additional investments in terms of the value of possible victory. On the other hand, the longer the war continued, the more difficult it became to write off the tremendous losses without having anything to show for them.

This way of thinking has serious pitfalls, and we would be best served by avoiding it. Warren Buffett once said:

The most important thing to do when you find yourself in a hole is to stop digging.

Making a mistake in the past doesn’t mean we need to continue that mistake in the present. Instead, we should ask where we want to be going forward, and accept that any invested time or money is gone forever either way.

The man with a hammer

To the man with a hammer, everything looks like a nail. We tend to see the world through the lens of what we know. If we spend years getting a PhD in finance, learning advanced mathematical theories, we will want to use those theories. And the more time we put into learning them, the more we become evangelists for them. John Kenneth Galbraith once said:

Economists are most economical about ideas. They make the ones they learned in graduate school last a lifetime.

Max Planck put it this way:

A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die and a new generation grows up that is familiar with it.

It is hard to kill ideas that people are heavily invested in.

The soldier or the scout

Julia Galef has a great TED Talk on this subject (linked in the references below).

Galef argues that there are two mindsets people fall into: the soldier and the scout. The soldier’s job is to attack or defend; the scout’s job is to observe the environment as accurately as possible. Making good decisions, she argues, is mostly about which mindset we are in.

Galef gives the example of the Dreyfus affair, which began in 1894, when Captain Alfred Dreyfus, a young Jewish officer in the French artillery, was convicted of sending military secrets to the Germans. The evidence against Dreyfus was nonexistent, but the investigators simply argued that he was such a good spy that he had hidden it all. They noted that he had studied foreign languages in school, which obviously meant he would become a spy later in life. Despite having no real evidence, the investigators truly believed they had a strong case against Dreyfus.

Galef argues that the investigators were in a soldier mindset. Dreyfus was Jewish at a time when antisemitism was rampant in the army, and the investigators felt the need to defend ideas that supported their antisemitism. This is known as motivated reasoning.

Dreyfus would later be exonerated by evidence uncovered by Georges Picquart. High-ranking officials attempted to suppress this evidence and even had Picquart imprisoned. Eventually the truth came out, and the whole affair became a universal symbol of injustice. Picquart was also a noted antisemite, so why was he able to uncover the truth? Perhaps his motivation to find the truth trumped his other biases. The drive to find the truth, even when it is inconvenient, is the hallmark of the scout mindset.

Galef closes with the following:

We need to learn how to feel proud, instead of ashamed, when we notice that we might have been wrong about something. We need to learn how to feel intrigued, instead of defensive, when we encounter some information that contradicts our beliefs. So, what do you most yearn for? To defend your own beliefs, or to see the world as clearly as you possibly can?

As it relates to investing

We often avoid selling a losing position because we feel the need to prove to the world that we weren’t wrong. We rationalize why the market is doing what it is doing. The sheer act of owning something makes us more defensive of it.

This leads us to ignore disconfirming evidence and to be overconfident. To counteract this, we should always attempt to understand the argument against our thesis. We should always invert. And we must be able to cut ties with an idea as soon as we are proven wrong. Charlie Munger puts it this way:

I never allow myself to have an opinion on anything that I don’t know the other side’s argument better than they do.

Conclusion

Our discomfort with being wrong causes us to reason in ways that reduce that discomfort. If we want to make the best decisions possible, we need to come to terms with this bias. We need to learn to embrace being wrong. We need to actively seek out evidence that challenges our beliefs. Joe Klaas put it best:

The truth will set you free, but first it will piss you off.

References:
https://www.amazon.com/Seeking-Wisdom-Darwin-Munger-3rd/dp/1578644283
https://en.wikipedia.org/wiki/Confirmation_bias
https://en.wikipedia.org/wiki/Sunk_costs
https://en.wikipedia.org/wiki/Dreyfus_affair
http://www.ted.com/talks/julia_galef_why_you_think_you_re_right_even_if_you_re_wrong
