If you turned on the news at any time in 2016, you likely saw some tragedy unfolding. Whether it was a terrorist attack in France or a boat of refugees capsizing in the Mediterranean, these images were powerful and they remain fresh in our memories. We are to be forgiven, then, if we start to see the world through a less optimistic lens.
On top of that, the amount of tragedy that goes unreported is even greater. But the opposite is also true: there are plenty of positive and uplifting stories that never reach our attention. Reporting that 7 billion people didn't die today from war or terrorism makes no sense as news. Stalin was right (a rarity) when he said:
A single death is a tragedy; a million deaths is a statistic.
The way we focus our attention, or the way our attention is focused for us, creates the reality that we live in. In his book Seeking Wisdom, Peter Bevelin explained how:
We react to stimuli that we personally encounter or that grabs our attention. We react more strongly to the concrete and specific than to the abstract. We overweigh personal experience over vicarious. We see only what we have names for. We tend to focus only on the present information rather than what information may potentially be missing.
This post will explore how attentional biases shape our world and our judgements.
Availability heuristic

We judge the probability of future events by the ease with which instances of them come to mind. For example, if our neighbor's house gets broken into, we might judge that our house is more likely to be broken into. If we read about a successful start-up, we might judge that start-ups in general are more likely to succeed.
In each case, we are using an availability heuristic to make judgements. The availability heuristic is a mental shortcut that relies on immediate examples that come to a given person’s mind when evaluating a topic.
Tversky and Kahneman tested this idea in their famous "K" experiment. They asked participants: if a random word is taken from an English text, is it more likely that the word starts with a K, or that K is the third letter in the word? Participants overestimated the number of words that begin with the letter K and underestimated the number of words that have K as the third letter (in fact, a typical English text contains roughly twice as many of the latter). Tversky and Kahneman concluded that participants answer these kinds of questions by comparing how easily they can recall instances of the two categories. It is much easier to think of a word that starts with K than to think of a word that has K as the third letter.
Inattentional blindness

Inattentional blindness is a psychological lack of attention that is not associated with any vision defects. Simply put, we tend to be blind to the unexpected. This concept is the foundation of most magic tricks: when our attention is drawn in one direction by a magician, we become blind to what is going on elsewhere in our field of vision.
Many experiments have substantiated this claim. One study showed that experienced pilots, trying to land a plane in a flight simulator, could be so focused on their flight instruments that they completely missed a second plane blocking the runway. The most famous demonstration is the Invisible Gorilla test, which went viral a few years back.
Survivorship bias and omission blindness
We tend not to see what isn’t reported or readily available. Missing information just doesn’t draw our attention. For example, we see the person that won the lottery, but we don’t see the millions of people who lost the lottery. We see the successful new restaurant, but we don’t see the thousands of failed restaurants. We see the one outcome of our strategy, but we don’t see the other outcomes that are equally possible.
One of my favorite survivorship bias stories features Abraham Wald, a World War II statistician. Wald was trying to minimize aircraft losses by studying the damage done to planes that returned from missions. Researchers at the Center for Naval Analyses had concluded that armor should be added to the areas that showed the most damage. Wald, however, pointed out that the data suffered from survivorship bias: it showed only the kinds of damage an aircraft could sustain and still make it home. He concluded that armor should be applied to the areas that showed the least damage, since hits there were presumably the ones bringing planes down.
In relation to investing
Management has an incentive to misdirect our attention. Enron is the classic case study: it constantly reported great numbers that diverted attention away from troubling warning signs.
The media has an incentive to package things as either the end of the world or the beginning of a golden age. Most of this is noise and should be ignored. Instead, we should focus our attention on what the media is omitting or getting wrong. This is where value can be found.
Oftentimes in investing, we back-test how a certain strategy or investment would have played out. For example, we might look at how the current top 10 hedge funds have performed over the past 10 years. By looking only at the funds still around today, we are building survivorship bias into our data. Instead, we should take the top 10 hedge funds from 10 years ago and see how they performed; we might find that half of them have disappeared altogether.
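The survivor-only distortion is easy to see numerically. The simulation below is a minimal sketch, not a model of real fund returns: the 5% mean return, 20% volatility, and the rule that a fund shuts down after losing half its value are all assumptions chosen purely for illustration.

```python
import random

def simulate_survivorship(n_funds=1000, years=10, seed=42):
    """Compare the average return of surviving funds vs. the full cohort.

    Illustrative assumptions: each fund earns an independent Gaussian
    return each year (mean 5%, stdev 20%) and closes permanently if its
    cumulative value falls below half its starting capital.
    """
    rng = random.Random(seed)
    all_returns = []       # final return of every fund that ever existed
    survivor_returns = []  # only funds still open after `years`
    for _ in range(n_funds):
        wealth = 1.0
        alive = True
        for _ in range(years):
            wealth *= 1.0 + rng.gauss(0.05, 0.20)
            if wealth < 0.5:   # assumed closure rule: fund shuts at -50%
                alive = False
                break
        all_returns.append(wealth - 1.0)
        if alive:
            survivor_returns.append(wealth - 1.0)
    mean = lambda xs: sum(xs) / len(xs)
    return mean(survivor_returns), mean(all_returns)

survivors, everyone = simulate_survivorship()
print(f"survivors only: {survivors:+.1%}, full cohort: {everyone:+.1%}")
```

Because the averaging step silently drops every fund that blew up, the survivor-only figure comes out higher than the full-cohort figure — the same flattering distortion we get by back-testing only the funds that still exist today.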
Most importantly, we have to remember how flawed our attention systems are. When we realize that we can be completely blind to a dancing gorilla, it should add some humility to our judgements. Overconfidence has ruined more investors than humility ever will.
Where we place our attention shapes the reality that we live in. Seeing one thing can make us blind to another. Seeing one vivid example can cause us to forget all statistical reasoning. As investors, we need to be aware of these effects and seek to reduce them. Instead of only considering the information that’s presented to us, we should always consider missing information and alternate explanations. What is omitted can often be extremely relevant.