Quick thinking leads to bad behavior

Alex Bryan, Morningstar’s director of passive strategies for North America, shares key insights from Daniel Kahneman’s ‘Thinking, Fast and Slow’

Alex Bryan 1 March, 2019 | 6:00PM

Humans are not wired to be good investors. To survive in the wild, we developed a tendency to rely on mental shortcuts to process information quickly, which is helpful for avoiding danger and taking advantage of fleeting opportunities. But this can be a hindrance to the unnatural act of investing.

Psychologist and Nobel laureate Daniel Kahneman explores how people process information, and the biases that mental shortcuts can create, in his book, “Thinking, Fast and Slow.” While the book isn’t specifically about investing, it explains why we often do dumb things as investors, and it is well worth a read. Here, I’ll share some of the book’s key insights and how we can learn from them to become better investors.

A Two-Track Mind

The book’s title refers to two distinct thought-processing systems that everyone has, according to Kahneman: System 1 is the fast track that operates automatically and guides most of what we do. It is intuitive and processes information quickly, with little or no voluntary control, allowing for fast decision-making. The first answer that pops into your head when posed with a difficult question, such as how much money you should save for retirement, is probably from System 1.

System 2 is the slow track. It manages complex problem-solving and other mental tasks that require concentrated effort. This system is deliberate, cautious, and…lazy. Most of the time, we rely on System 1 and engage System 2 only when something surprising occurs or when we make a conscious effort. This division of labor allows us to make the most of our limited attention and mental resources, and it usually works well.

But the shortcuts that System 1 takes can lead to poor investment decisions. Among these, it suppresses doubt, often substituting an easier question for the one that was asked. Also, it evaluates claims not by the quality or quantity of evidence but by the coherence of a story it can create to fit the evidence. To top it off, this system has little understanding of statistics or logic. However, it’s hard to recognize those shortcomings in the moment or even to be aware that System 1 is guiding our actions.

Shortcuts That Can Lead to Trouble

Confirmation Bias: The world is full of trade-offs, nuance, and complexity. But that’s not conducive to quick thinking. So System 1 paints a picture of the world that is simpler than reality, suppressing doubt. Doubt requires greater effort than belief because conflicting information must be reconciled. We are biased toward adopting views that are consistent with our prior beliefs and seeking out evidence that supports those beliefs--hence, confirmation bias. This is not only because System 2 is lazy, but also because we have evolved a preference for the comfortable and familiar. In the wild, familiarity signals that something is safe to approach; those who aren’t cautious of the unknown are more susceptible to danger and less likely to survive.

Yet confirmation bias can lead to bad investment decisions because it can breed overconfidence. For example, it may cause bullish investors to overlook or underweight good reasons not to buy. It may cause others to hold on to bad investments longer than they should, anchoring to their original investment thesis and to valuations that look increasingly attractive as prices fall, while ignoring signs that the fundamentals have deteriorated.

To overcome this potential bias and reduce blind spots, it’s helpful to seek out information that conflicts with what you already believe. If you’re convinced that active management is futile, check out what active shops have to say on the subject. If you’re about to pull money out of stocks because you’re worried about a recession, consider reading what a more bullish commentator has to say first. There’s almost always a good case that could be made for and against any investment, and for every buyer, there’s a willing seller. Before trading, ask yourself, ‘What does the other person know that I don’t?’ This practice can facilitate a richer understanding of your investments and may reduce the likelihood of overlooking important information.

Question Substitution: It’s amazing how many tough questions we can answer with little hesitation. Let’s return to the question about how much you should save for retirement. To seriously answer that, it’s necessary to estimate future expenses in retirement, expected returns on your investments, life expectancy, and future income flows. That’s all hard System 2 work that few undertake. Instead, most answer an easier question like, ‘How much can I save for retirement?’ This type of substitution is one of the ways that System 1 enables quick decision-making. It often yields a good-enough answer for the original question--but not always. More troubling, we aren’t always aware of this substitution when it takes place.
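To make that System 2 work concrete, here is a minimal sketch, in Python, of the annuity arithmetic involved. Every input--spending, returns, horizons--is an illustrative assumption, not a recommendation; a serious plan would also account for inflation, taxes, other income sources, and uncertain lifespans.

```python
# A minimal sketch of the System 2 arithmetic behind "How much should I save?"
# All inputs below are illustrative assumptions, not recommendations.

def required_nest_egg(annual_spending, years_in_retirement, real_return):
    """Present value of level retirement spending (an ordinary annuity)."""
    r = real_return
    if r == 0:
        return annual_spending * years_in_retirement
    return annual_spending * (1 - (1 + r) ** -years_in_retirement) / r

def required_annual_saving(nest_egg, years_to_retirement, real_return):
    """Level annual contribution that compounds to the target nest egg."""
    r = real_return
    if r == 0:
        return nest_egg / years_to_retirement
    return nest_egg * r / ((1 + r) ** years_to_retirement - 1)

# Hypothetical case: $50,000 a year for 30 years of retirement, 25 years
# to save, 3% real return in retirement, 4% real return while saving.
target = required_nest_egg(50_000, 30, 0.03)
saving = required_annual_saving(target, 25, 0.04)
print(f"Target nest egg: ${target:,.0f}")         # roughly $980,000
print(f"Required annual saving: ${saving:,.0f}")  # roughly $23,500
```

Even this toy version makes the point: the honest answer takes deliberate effort and several uncertain estimates, which is exactly why System 1 swaps in an easier question.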

Investors commonly substitute easy questions (How has this fund manager performed?) for hard ones (Is this manager skilled?). Performance is the result of luck, skill, and risk-taking. The best-performing managers often owe a lot of their success to luck, which is fickle. Past performance alone (over periods longer than a year) is a terrible predictor of future performance. At best, it’s a very noisy proxy for skill. Because the quality of that signal is low, we shouldn’t put much stock in it.
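One way to see how weak that signal is: a toy Monte Carlo simulation--a sketch, with a made-up volatility figure--in which every hypothetical manager has zero skill. Rank them on trailing three-year returns, and the past top decile looks impressive, yet it does no better than the pack going forward.

```python
# Toy Monte Carlo: why past performance is a noisy proxy for skill.
# Every manager here has identical (zero) skill; dispersion is pure luck.
import random

random.seed(42)
N_MANAGERS, YEARS = 1000, 3
VOL = 0.15  # assumed annual volatility of excess returns (illustrative)

def excess_return():
    return random.gauss(0.0, VOL)

# Rank managers by trailing 3-year, luck-only performance.
past = [sum(excess_return() for _ in range(YEARS)) for _ in range(N_MANAGERS)]
top_decile = sorted(range(N_MANAGERS), key=lambda i: past[i])[-N_MANAGERS // 10:]

# Their future 3-year performance is drawn from the same luck distribution.
future = [sum(excess_return() for _ in range(YEARS)) for _ in range(N_MANAGERS)]
avg_top = sum(future[i] for i in top_decile) / len(top_decile)
avg_all = sum(future) / N_MANAGERS

print(f"Future excess return, past top decile: {avg_top:+.2%}")
print(f"Future excess return, all managers:    {avg_all:+.2%}")
# Both hover near zero: the chart-topping past returns predicted nothing.
```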

To avoid falling prey to this type of subconscious substitution when the stakes are high, ask yourself, ‘What do I need to know to answer the question that was asked?’ and ‘How would I know if I’m wrong?’ If it’s hard to tell, proceed with caution.

“What You See Is All There Is”: This is a phrase Kahneman frequently uses in his book to describe System 1’s tendency to focus solely on the information at hand, failing to account for relevant information that might be missing. It attempts to construct a story with that information to make sense of it and judges its validity based on the coherence of that story rather than on the quality of the supporting evidence.

For example, consider a poor-performing mutual fund that replaces its manager with a star who has a long record of success running a different portfolio at the firm. Three years later, it is among the best-performing funds in its category. That story seems to support the obvious conclusion that the new manager turned things around. But there’s not enough information to know that.

System 1 doesn’t stop to consider alternatives that weren’t presented. For instance, the performance turnaround may not have come from any changes to the portfolio. Rather, after a stretch of underperformance, the holdings the original manager selected may have been priced to deliver better returns going forward. Luck could have also played a role. However, the stories that System 1 creates leave little room for chance, often searching for causal relationships where there are none.

We can do better by asking what other information might be relevant, asking how reliable the source is, and accounting for the role of chance. The best way to do that is to start with the base rates of the group the specific case belongs to and adjust from there based on the dependability of the information available about the case.
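As a rough illustration of that base-rate logic, the sketch below applies Bayes’ rule with invented numbers: start from an assumed base rate of genuinely skilled managers, then adjust for how dependable a strong track record is as evidence.

```python
# A sketch of base-rate reasoning via Bayes' rule. The numbers are
# invented for illustration, not estimates of real manager skill.

def posterior(base_rate, true_positive, false_positive):
    """P(skilled | strong record), given how often skilled and
    unskilled managers each produce a strong record."""
    p_record = base_rate * true_positive + (1 - base_rate) * false_positive
    return base_rate * true_positive / p_record

# Assume 10% of managers are skilled, a skilled manager posts a strong
# three-year record 60% of the time, and an unskilled one 30% of the time.
print(f"{posterior(0.10, 0.60, 0.30):.0%}")  # ~18%: better than the 10%
                                             # base rate, but still likely luck
```

Under these made-up assumptions, even impressive evidence only nudges the odds; the base rate does most of the work, which is exactly why Kahneman tells us to start there.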

Availability Bias: This is the tendency to judge the likelihood of an event based on how easy it is to think of examples. This bias is an extension of System 1’s focus on the information at hand and its eagerness to answer an easier question. It suggests that perceptions of risk are influenced by recent experiences with losses. After a downturn, the market feels riskier than it does after a long rally, when memories of past losses are more distant. This may even cause the market to sell off more than it should during downturns, as perceptions of risk change more than statistical measures of risk do. While it’s possible to fight this bias by becoming better informed about the relevant statistics, it’s difficult to overcome. Examples that resonate with us, particularly personal experiences, ring louder than abstract statistics. The best we can do is to recognize that innate bias and educate ourselves.

A version of this article was published in the December 2018 issue of Morningstar ETFInvestor.
