Decision Making on Freeways and in Parking Lots
Many drivers here in California adhere to the common freeway speed limit of 65 miles per hour, while some do not (I’ll take the 5th). In the vast majority of cases, racing to your destination at these faster speeds makes perfect sense. However, driving 65 mph through the shopping mall parking lot could get you killed, so slower driving is preferred in this instance. Ultimately, the specific environment and situation will dictate the rational and prudent driving speed. Decision making works in much the same way, and Daniel Kahneman, a Nobel Prize winner, has encapsulated his decades of research in psychology and economics in his most recent book, Thinking, Fast and Slow.
Many of Kahneman’s big ideas are analyzed through the lenses of “System 1” and “System 2” – the fast and slow decision-making processes persistently used by our brains. System 1 thinking is our intuition in the fast lane, continually making judgments in real time. Our System 1 hunches are often correct, but inherent biases and periodic errors in this speedy process can cause us to miss an off-ramp or even suffer a conclusion collision. System 2, on the other hand, is the slower, methodical decision-making process in our brains that keeps our hasty System 1 process in check. Although little mental energy is exerted by using System 1, a great deal of cerebral horsepower is required to use System 2.
Summarizing 512 pages of Kahneman’s book in a single article is challenging; nevertheless, I will do my best to cover some of the interesting highlights and anecdotes. The book reviews a multitude of Kahneman’s research, but a key goal is to help individuals identify errors in judgment and biases, in order to reduce the prevalence of mental mistakes in the future.
Over Kahneman’s 50+ year academic career, he has uncovered an endless string of flaws in the human thought process. To bring those mistakes to life, he uses several thought experiments to illustrate them. Here are a few:
Buying Baseball: We’ll start off with a simple Kahneman problem. If a baseball bat and a ball cost a total of $1.10, and the bat costs $1 more than the ball, then how much does the ball cost? The answer is $0.10, right? WRONG! Intuition and the rash System 1 force most people to answer $0.10 for the ball, but after going through the math it becomes clear that this gut answer is wrong. If the ball is $0.10 and the bat is $1 more, then that would mean the bat costs $1.10, making the total $1.20…WRONG! This is clearly a System 2 problem, which requires the brain to see that a $0.05 ball plus a $1.05 bat equals $1.10…CORRECT!
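For readers who like to see the arithmetic spelled out, here is a minimal sketch of the bat-and-ball algebra. If x is the price of the ball, the bat costs x + $1.00 and together they total $1.10, so 2x + $1.00 = $1.10 and x = $0.05:

```python
# Verify the bat-and-ball arithmetic using exact fractions
# (avoids floating-point rounding on dollar amounts).
from fractions import Fraction

total = Fraction(110, 100)   # bat + ball = $1.10
difference = Fraction(1)     # bat costs $1.00 more than the ball

ball = (total - difference) / 2   # solve 2x + 1.00 = 1.10
bat = ball + difference

assert ball + bat == total
assert bat - ball == difference
print(f"ball = ${float(ball):.2f}, bat = ${float(bat):.2f}")
# ball = $0.05, bat = $1.05
```

The intuitive $0.10 answer fails the second check: a $0.10 ball forces a $1.10 bat, for a $1.20 total.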
The Invisible Gorilla: As Kahneman points out, humans can be blind to the obvious and blind to our blindness. To make this point he references an experiment and book, The Invisible Gorilla, created by Christopher Chabris and Daniel Simons. In the experiment, three players wearing white outfits pass a basketball around at the same time that a group of players wearing black outfits pass around a separate basketball. The anomaly in the experiment occurs when someone in a full-sized gorilla outfit goes prancing through the scene for nine full seconds. To the surprise of many, about half of the experiment observers do not see the gorilla. In addition, the gorilla-blind observers deny the existence of the large, furry animal when confronted with recorded evidence.
Green & Red Dice: In this thought experiment, Kahneman describes a group presented with a regular six-sided die with four green sides (G) and two red sides (R), meaning the probability of the die landing on green (G) is much higher than the probability of landing on red (R). To make the experiment more interesting, the group is offered a cash prize for picking the highest-probability scenario out of the following three sequences: 1) R-G-R-R-R; 2) G-R-G-R-R-R; and 3) G-R-R-R-R-R. Most participants pick sequence #2 because it has the most greens (G) in it, but if one looks more closely, sequence #2 is identical to sequence #1 except that it adds one extra green (G) roll. Therefore, the highest-probability answer is sequence #1, because sequence #2 tacks on an additional roll that may or may not land on green (G).
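The probabilities behind the dice puzzle are easy to check directly. The sketch below assumes independent rolls of a die with four green faces (P = 2/3) and two red faces (P = 1/3), and multiplies the per-roll probabilities for each sequence:

```python
# Compare the probabilities of the three dice sequences from the puzzle.
from fractions import Fraction
from math import prod

p = {"G": Fraction(4, 6), "R": Fraction(2, 6)}  # P(green) = 2/3, P(red) = 1/3

sequences = {1: "RGRRR", 2: "GRGRRR", 3: "GRRRRR"}

for label, seq in sequences.items():
    # Independent rolls: multiply the probability of each face in the sequence.
    prob = prod(p[face] for face in seq)
    print(f"sequence #{label}: {prob} (~{float(prob):.5f})")
```

Sequence #2 is sequence #1 times an extra factor of 2/3, so it is strictly less likely, and sequence #3 trails both. Any added roll, however probable, can only shrink a sequence's overall probability.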
While the previous experiments described some notable human decision-making flaws, here are some more human flaws:
Anchoring Effect: Was Gandhi 114 when he died, or was Gandhi 35 when he died? Asking the first question first will skew respondents’ answers toward a higher age, because their answers become anchored to the number “114”. Similarly, the price a homebuyer will pay for a house is influenced, or anchored, by the asking price. Another word some use for anchoring is “suggestion”: if a subliminal suggestion is planted, people’s responses can become anchored to that idea.
Overconfidence: We encounter overconfidence in several forms, especially in what Kahneman calls the “Illusion of Pundits” – the confidence that comes with 20-20 hindsight in our 24/7 media world. Or as Kahneman puts it, “The illusion that we understand the past fosters overconfidence in our ability to predict the future.” Driving is another example of overconfidence – very few people believe they are poor drivers. In fact, a well-known study shows that “90% of drivers believe they are better than average,” a belief that defies the laws of mathematics.
Risk Aversion: In his book, Kahneman also references risk aversion studies by Matthew Rabin and Richard Thaler. What the researchers discovered is that people appear to be irrational in the way they respond to certain risk scenarios. For example, people will turn down the following gambles:
A 50% chance to lose $100 and a 50% chance to win $200;
A 50% chance to lose $200 and a 50% chance to win $20,000.
Rational math indicates these are smart bets to take; however, most people decline the gamble because humans, on average, weigh losses roughly twice as heavily as gains (see also the Pleasure/Pain Principle). To better predict human behavior, the real emotional costs of disappointment and regret need to be accounted for.
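To see how loss aversion changes the arithmetic, here is a minimal sketch that contrasts each gamble's raw expected value with a loss-averse valuation. The 2.0 loss-aversion coefficient reflects the rough "losses weigh twice as much as gains" rule of thumb above; it is an illustrative assumption, not a figure quoted from the book:

```python
# Compare raw expected value with a loss-averse valuation of a gamble.

def expected_value(p_loss, loss, p_win, win):
    """Raw expected value: probability-weighted gain minus loss."""
    return p_win * win - p_loss * loss

def loss_averse_value(p_loss, loss, p_win, win, lam=2.0):
    """Same calculation, but losses are weighted by a factor lam (~2)."""
    return p_win * win - lam * p_loss * loss

gambles = [(0.5, 100, 0.5, 200), (0.5, 200, 0.5, 20_000)]
for g in gambles:
    print(f"EV = {expected_value(*g):+,.0f}, "
          f"loss-averse value = {loss_averse_value(*g):+,.0f}")
```

For the first gamble, doubling the weight on the $100 loss drags the subjective value from +$50 down to zero, which helps explain why people decline it. Notably, the second gamble stays overwhelmingly attractive even after doubling the loss, which is exactly why turning it down looks irrational.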
Truth Illusions: A reliable way to make people believe falsehoods is through repetition – more exposure breeds more liking. Beyond normal conversation, these repetitive truth illusions can be witnessed in propaganda and advertising. Minimizing cognitive strain also reinforces a point: bold, colored, high-contrast text is more convincing, and simple language is more credible than complex language.
Narrative Fallacies: We humans have an innate desire to explain the causation of an event through skill or stupidity – even when randomness is the best explanation. People try to make sense of the world, even though many outcomes have no straightforward explanation. Oftentimes, a statistical phenomenon like “regression to the mean” can explain the results (i.e., outliers revert directionally toward averages). The “Sports Illustrated Jinx,” or the claim that a heralded cover-story athlete will subsequently be cursed with bad performance, is used as a case in point. In reality there is no jinx or curse; fickle luck simply runs out and athletic performance reverts to the norm.
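Regression to the mean is easy to demonstrate with a toy simulation. The sketch below models performance as skill plus luck (all parameters are illustrative assumptions, not data from the book): the season-one standouts owe part of their results to lucky draws, so their season-two average falls back toward their underlying skill.

```python
# Toy simulation of "regression to the mean":
# observed performance = stable skill + random luck, drawn fresh each season.
import random

random.seed(42)
N = 10_000
skill = [random.gauss(0, 1) for _ in range(N)]
season1 = [s + random.gauss(0, 1) for s in skill]   # skill + season-1 luck
season2 = [s + random.gauss(0, 1) for s in skill]   # same skill, new luck

# The season-one top 1% are our "magazine cover" athletes.
top = sorted(range(N), key=lambda i: season1[i], reverse=True)[: N // 100]
avg1 = sum(season1[i] for i in top) / len(top)
avg2 = sum(season2[i] for i in top) / len(top)
print(f"top performers: season 1 avg = {avg1:.2f}, season 2 avg = {avg2:.2f}")
```

The season-two average for the standouts is reliably lower than their season-one average, with no jinx in the code at all – the luck component simply re-randomizes while skill stays put.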
Kahneman on Stocks
Many of the principles in Kahneman’s book can be applied to the world of stocks and investing too. According to Kahneman, the investing industry has been built on an “illusion of skill,” or the belief that one person has better information than another. To make his point, Kahneman references research by Terry Odean, a finance professor at UC Berkeley, who studied the records of 10,000 brokerage accounts of individual investors spanning a seven-year period and covering almost 163,000 trades. The net result showed dramatic underperformance by the individual traders and confirmed that the stocks sold by the traders consistently did better than the stocks purchased. “Taking a shower and doing nothing” would have been better than the value-destroying trading activity. In fact, the most active traders did much worse than those who traded the least. For professional managers the conclusions are not much different. “For a large majority of fund managers, the selection of stocks is more like rolling dice than like playing poker. Typically at least two out of every three mutual funds underperform the overall market in any given year,” says Kahneman. I don’t disagree, but I do believe that, like .300 hitters in baseball, there are a few managers who can consistently outperform.
There are a lot of lessons to be learned from Daniel Kahneman’s book Thinking, Fast and Slow, and I apply many of his conclusions to my investment practice at Sidoxia. We all race through decisions every day, but as Kahneman repeatedly points out, familiarizing ourselves with these common mental pitfalls, and regularly engaging our more methodical and accurate System 2 thought process, leads to better decisions – not only in our regular lives, but also in our investing lives. It’s perfectly OK to race down the mental freeway at 65 mph (or faster), but don’t forget to slow down occasionally, in order to avoid mental collisions.
Wade W. Slome, CFA, CFP®
Plan. Invest. Prosper.
DISCLOSURE: Sidoxia Capital Management (SCM) and some of its clients hold positions in certain exchange traded funds (ETFs), but at the time of publishing SCM had no direct position in any security referenced in this article. No information accessed through the Investing Caffeine (IC) website constitutes investment, financial, legal, tax or other advice nor is to be relied on in making an investment or other decision. Please read disclosure language on IC Contact page.