Thinking, Fast and Slow by Daniel Kahneman

The issues addressed by Nobel Laureate Daniel Kahneman are both complex and fundamental to the human mind. He invites you to think about thinking by considering how your mind frequently confuses itself, distorts data, and misleads you. His style is clear, his reasoning is strong, and his honesty is refreshing. Kahneman uses stories from his own life to demonstrate confused thinking. As a result, Thinking, Fast and Slow is a slow read, but ultimately a rewarding one.

Your “Two Systems” and What They Mean

When you need to make sense of something, you think about it. Kahneman describes this process with a model in which people use two cognitive systems.

The first is “System 1”: fast, automatic processing. It reads emotions and handles instinctive abilities such as driving a familiar route or adding two and two. When you understand a simple sentence, complete the phrase “bread and…,” or turn to see where a noise is coming from, System 1 is doing the thinking. System 1 swiftly and automatically supplies linked meanings (including stereotypes).

When you’re focusing on specific details, like figuring out how to fill in your income tax forms, you use “System 2.” System 2 requires conscious effort, as when you perform math, attempt a new activity, or search for a specific person in a crowd. System 2 thinking is slower, yet it is required for methodical thought processes such as formal logic.

People tend to identify with deliberate System 2 and to distrust automatic System 1, but reality is far more complicated. When it comes to thinking, these cognitive processes engage in a “division of labor,” and they interact continually. You spend most of your life in System 1’s world, where quick processing is remarkably efficient. You can be working on a task in System 2, grow tired or distracted, and slip back into System 1 without realizing it. If you’ve ever been fooled by an optical illusion, you’ve seen what happens when the two systems work against each other.

Duality and Collaboration

The work you put in determines which system you adopt and how you think. If you’re doing something simple, like walking down a familiar path, you’re employing System 1. When you increase your pace to a speed walk, System 2 activates to maintain the effort. Try to solve an arithmetic problem at the same time, and you’re likely to stop walking entirely; your brain can’t handle the added strain. According to recent laboratory investigations, intense System 2 concentration lowers the body’s glucose levels. When your System 2 is overworked, you are more likely to stereotype, yield to pressure, or analyze matters superficially.

System 1 prefers the obvious solution: if a seemingly correct answer comes to mind quickly when you encounter a challenge, System 1 will default to it and hold to it, even if further information proves it wrong. System 1 also engages in “associative activation”: given a pair of words or an image, your mind connects the dots and builds a story out of those snippets of information. If you have just heard the word “eat,” you are more likely to complete the word fragment SO_P as “soup” than as “soap.”

Appeal to people’s System 1 preference for simple, memorable information: if you want your reports to persuade, use a clear, strong typeface, and make your company’s name easy to remember. These tendencies are expressions of System 1’s wider role, which is to construct and preserve your worldview. System 1 appreciates consistency: the image of a burning car sticks in your imagination, and if you later witness a second car on fire in the same spot, System 1 will file it away as “the place where cars catch fire.”

Making Meaning, Making Mistakes

System 1 wants the universe to be interconnected and meaningful, so if you hold two separate facts, it will assume they are connected. Its goal is to generate cause-and-effect explanations. Similarly, when you notice a piece of data, your System 1 assumes you have the entire story. This “what you see is all there is” (WYSIATI) tendency has a strong influence on your judgments. If all you have to go on is someone’s appearance, System 1 will fill in the blanks; this is known as the “halo effect.” If an athlete is attractive, for example, you’ll presume he or she is likewise talented.

System 1 is also in charge of “anchoring”: you unconsciously tie your judgment on a topic to information you have just encountered, even when the two have nothing to do with one another. For example, citing the number 10 and then asking how many African countries are members of the UN will yield lower estimates than mentioning 65 and asking the same question. Worse, System 2 may magnify your errors by supplying reasons for the answers and solutions you generate. System 2 does not contradict what System 1 provides; rather, it acts as an “endorser” of how System 1 attempts to categorize your reality.

Your ability to judge is limited by your natural tendency to focus on the content of a message rather than on how reliable or representative it is. People build their worries and future plans around vivid examples. Media coverage of dramatic but rare events such as accidents and disasters, as opposed to dull but prevalent threats such as strokes and asthma, establishes those dramatic events as anchors, so individuals make massively wrong estimates of where the real risks to their health lie.

People also make mistakes when they fail to notice “regression to the mean.” Extreme results tend to move back toward the average over time, but individuals invent and apply “causal interpretations” to what are essentially random fluctuations. For example, if a baseball player has a superb first year but falters in his second season, sports fans will blame the slump on a variety of factors, but in truth the player was probably just more fortunate in his first outings than in later ones.
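To see why no causal story is needed, consider a minimal simulation, a sketch in Python with made-up numbers rather than anything from the book: every player has identical underlying skill, seasons differ only by luck, and yet the year-one stars still slip back toward the average in year two.

```python
import random

random.seed(1)

# Hypothetical model: every player has the same true skill; seasons differ only by luck.
SKILL = 0.260            # identical "true" batting average for everyone
NOISE = 0.030            # random luck added to each season's result
PLAYERS = 10_000

year1 = [SKILL + random.gauss(0, NOISE) for _ in range(PLAYERS)]
year2 = [SKILL + random.gauss(0, NOISE) for _ in range(PLAYERS)]

# Select the "rookie stars": roughly the top 5% of year-one performers.
cutoff = sorted(year1, reverse=True)[PLAYERS // 20]
stars = [i for i in range(PLAYERS) if year1[i] >= cutoff]

def mean(xs):
    return sum(xs) / len(xs)

print(f"Stars in year 1:        {mean([year1[i] for i in stars]):.3f}")  # well above .260
print(f"Same players in year 2: {mean([year2[i] for i in stars]):.3f}")  # back near .260
```

Because year-two luck is independent of year-one luck, the selected group’s average falls back toward .260 with no slump, curse, or loss of nerve required to explain it.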

Distorted Reality and Optimism

The “narrative fallacy,” the mind’s preference for the simple, tangible, and coherent over the theoretical, conflicting, and ambiguous, is at work here. People draw meaning from stories that highlight individual attributes such as goodness and competence while overlooking the influence of chance and statistical forces. You “concentrate on a few striking occurrences that occurred rather than the innumerable events that did not occur.” Through “hindsight bias,” you distort reality by realigning your memory of events to fit new information. And when you recount stories about events in which you were involved, you tend to be too optimistic and predisposed to overvaluing your abilities compared with those of others.

This intense, continuous optimism benefits the economy in a variety of ways, because entrepreneurs and inventors keep starting new enterprises despite the huge odds against them. Despite understanding that only around a third of businesses survive to their fifth anniversary, more than 80% of American entrepreneurs believe they have a good chance of beating those odds; fully a third “claimed their likelihood of failing was zero.”

Experts and Risk

System 1 shapes how confidently people regard their own “intuition and validity,” which means that not all experts provide excellent advice. Expertise requires individual skill along with “feedback, and practice.” For example, firefighters’ repeated practice assessing the hazards posed by various types of fires, as well as their experience fighting those flames, gives them an outstanding ability to read a situation intuitively and spot critical patterns. An anesthesiologist, similarly, relies on regular, instantaneous medical feedback to keep a patient safe throughout an operation.

However, don’t put too much faith in expert judgment in fields where challenges vary widely, luck influences outcomes, and there is a long gap between action and feedback. Those who forecast stock prices and political elections, for example, are likely to fall into this category. Because System 1 lulls experts into complacency with “fast answers to tough topics,” their intuition may be faulty, yet System 2 fails to detect the inconsistency.

When it comes to decisions about risk and value, you’re especially prone to hazy thinking. Most people are “loss averse”: the pain of losing $100 outweighs the pleasure of winning $150, so they refuse a bet that offers an even chance of either outcome. Professional financial traders, however, have a less emotional, less System 1-driven reaction to losses. Individuals are also subject to the “endowment effect”: when something belongs to you, even if only for a short time, you tend to overestimate its value in comparison with the value of items you don’t own. Homeowners are prime examples of the endowment effect, as they frequently overvalue their properties.
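As a rough illustration, not something spelled out in the summary above, here is a sketch in Python using the value-function form and parameter estimates commonly associated with Kahneman and Tversky’s prospect theory (curvature around 0.88, loss-aversion coefficient around 2.25); it shows why a 50/50 bet to lose $100 or win $150 gets turned down even though its expected dollar value is positive.

```python
# Illustrative prospect-theory-style value function; the parameters are
# commonly cited estimates, not figures taken from this summary.
ALPHA = 0.88     # diminishing sensitivity to gains and losses
LAMBDA = 2.25    # loss aversion: losses loom roughly 2.25x larger than gains

def value(x: float) -> float:
    """Subjective value of gaining (x > 0) or losing (x < 0) x dollars."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** ALPHA)

# A 50/50 coin flip: lose $100 or win $150 (expected dollar value: +$25).
flip_value = 0.5 * value(-100) + 0.5 * value(150)
print(f"Pleasure of winning $150:  {value(150):.1f}")   # about +82
print(f"Pain of losing $100:       {value(-100):.1f}")  # about -129
print(f"Subjective value of flip:  {flip_value:.1f}")   # negative, so the bet is refused
```

The loss looms roughly twice as large as the somewhat bigger gain, so the flip’s subjective value comes out negative and the loss-averse chooser declines it.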

Combine all of this with the fact that people either underestimate the likelihood of rare events or give them too much weight when making decisions, and you have the roots of the modern insurance industry. The way a risk is phrased also influences how you evaluate it. For example, if you learn that a life-saving vaccine carries “a 0.001 percent risk of lifelong disability,” your reaction is very different than if you learn the same treatment leaves one out of 100,000 people permanently disabled, even though the two statements are identical. When all of these inclinations are considered, it is difficult to believe any economic theory predicated on the assumption that individuals are rational actors.
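A quick arithmetic check, a sketch rather than anything from the book, confirms that the two framings describe exactly the same risk:

```python
from fractions import Fraction

# "A 0.001 percent risk" versus "one person in 100,000": two frames, one number.
risk_as_percent = Fraction("0.001") / 100     # 0.001% expressed as a probability
risk_as_frequency = Fraction(1, 100_000)      # one case per 100,000 people

print(risk_as_percent)                        # 1/100000
print(risk_as_frequency)                      # 1/100000
print(risk_as_percent == risk_as_frequency)   # True: only the wording differs
```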

“Two Selves,” One Mind

Two selves struggle over the quality of your experiences, just as two systems interact in your mind. The “experiencing self” is the part of you that lives your life; the “remembering self” evaluates your experiences, takes lessons from them, and “makes decisions” about the future. Happiness is not cumulative for the remembering self, and the final stages of any event weigh heavily in your memory of its quality. When researchers asked people to rate the life of someone who lived happily until the age of 65 versus someone who lived happily until 65 but was only somewhat pleased for five more years, the subjects judged the first, shorter life as more desirable.
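This is what Kahneman calls the “peak-end rule”: the remembering self scores an episode largely by its worst moment and its final moment, not by its total. A minimal sketch in Python, with hypothetical second-by-second pain ratings loosely modeled on the cold-water experiments the book describes, shows how the episode containing more total discomfort can still leave the better memory:

```python
# Hypothetical second-by-second pain ratings (0-10): the long trial repeats the
# short one, then adds a milder 30 seconds at the end.
short_trial = [8] * 60                   # 60 seconds rated 8/10
long_trial = [8] * 60 + [5] * 30         # the same 60 seconds, plus 30 milder seconds

def peak_end_score(ratings):
    """Crude remembering-self score: average of the worst moment and the last moment."""
    return (max(ratings) + ratings[-1]) / 2

print(sum(short_trial), sum(long_trial))                        # 480 vs 630: more total pain in the long trial
print(peak_end_score(short_trial), peak_end_score(long_trial))  # 8.0 vs 6.5: the long trial is remembered as less bad
```

For the experiencing self, the long trial is strictly worse; for the remembering self, it ends better and so is judged better, which is why, in the experiments the book recounts, most subjects chose to repeat the longer trial.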

The review of your life story by your remembering self is one component in determining whether you are happy: you rate your life against the standards and goals you set for yourself. The assessments of your experiencing self in the present moment supply the other half of the picture. Because the two selves track distinct aspects of reality, their verdicts can disagree. Work benefits and prestige, which influence “overall job satisfaction,” do not influence people’s daily moods at work. Instead, factors such as chatting with coworkers and being free of “time pressure” contribute more to moment-to-moment enjoyment.

What you pay attention to has a significant impact on your mood. “Active forms of leisure,” such as physical activity or spending time with good friends, satisfy you far more than “passive forms of leisure,” such as watching television. You can’t alter your profession or your personality, but you can change what you focus on and how you spend your time. Focus shapes your self-evaluations: “Nothing in life is as important as you believe it to be when you think about it.”

Your two selves are inextricably linked to your two mental systems: System 2 built your remembering self, but System 1 gave you the ability to judge experiences based on their last moments and to prefer “long pleasures and short sufferings.” The link between your selves has philosophical and policy ramifications. Depending on whether you regard the perspective of the remembering self or the experiencing self as primary, you would make different decisions about which social, health, and economic concerns to address and how to solve them.

In general, understanding how these mental processes function can help you see that the completely rational individuals assumed by standard economic theory are fictitious and that real people need help making better financial and life decisions. Understanding how your mind works can help you press for policies that address those concerns. The reverse is also true: because your mind does not always function optimally, rules should protect people from those who would “deliberately exploit their deficiencies.” And because individuals find it hard to detect flaws in their own System 1 processing, an organization can operate with greater systematic rationality than its individual members can.
