Thinking, Fast and Slow by Daniel Kahneman

Nanachka

After 3 months of reading this book and 3 days of writing this review, I’m proud to say this is the hardest book I’ve ever read and the longest review I’ve written 😂

Isn’t it cool to be able to understand how people think, how biased we all are when it comes to decision-making, and how words influence our emotions?

On my way to Gyor

Two systems

The main characters of the book:

  • System 1 operates automatically and quickly, with little or no effort and no sense of voluntary control. For example: orienting to the source of a sudden sound, answering 2 + 2, and understanding simple sentences.
  • System 2 allocates attention to effortful mental activities. For example: filling out a form, computing 14 × 54, and maintaining a faster walking speed than is natural to you.

Both systems are active whenever we are awake. System 1 runs automatically while System 2 idles in a comfortable low-effort mode. When System 1 runs into difficulty, it calls on System 2 to support more detailed processing that may solve the problem.

When you confront a hard problem that your System 2 should take control of, your pupil dilates, and your heart rate increases. Highly intelligent individuals need less effort to solve the same problem, as indicated by both pupil size and brain activity.

“I won’t try to solve this while driving. This is a pupil-dilating task. It requires mental effort!”

Flow is a state of effortless concentration so deep that people lose their sense of time, of themselves, of their problems. When you are in flow, you don’t struggle to maintain your focus, which frees resources for the task at hand.

A word evokes memories, which evoke emotions, which in turn evoke facial expressions and other reactions. You think with your body, not only with your brain.

You know far less about yourself than you feel you do.

Psychologists think of ideas as nodes in a vast network called associative memory, in which each idea is linked to many others.

If you have recently heard the word EAT, you are more likely to complete SO_P as SOUP than as SOAP. It is a priming effect.

If you were primed to think of old age, you would act old and walk more slowly. The effects of priming can reach into every corner of our lives.

Simple, common gestures can also unconsciously influence our thoughts and feelings. If you nod your head when hearing a message, you tend to accept the message you heard. If you shake your head while hearing the same message, you will tend to reject it. There is no awareness, just a habitual connection. Act calm and kind regardless of how you feel.

“I made myself smile and I’m actually feeling better!”

Money-primed people become more independent and selfish. Living in a culture that surrounds us with reminders of money may shape our behavior and attitudes. System 1 provides the impressions that often turn into your beliefs and is the source of the impulses that often become your choices and actions. System 1 is the origin of many of the systematic errors in your intuitions.

When you are in a state of cognitive ease, you are probably in a good mood, trust your intuition, and feel that the current situation is comfortably familiar. When you feel strained, you are more suspicious, feel less comfortable, make fewer errors, and are less intuitive.

Causes and consequences of cognitive ease

You experience greater cognitive ease in perceiving a word you have seen recently, and that ease gives you an impression of familiarity. When you are taking a test, choose the answer that feels familiar.

Cognitive strain is experienced when the effortful operations of System 2 are engaged, and it makes you more likely to reject the intuitive answer suggested by System 1. That is why your performance on a problem printed in a hard-to-read font can be better than on the same problem in a clear font.

Companies with easily pronounceable names do better because they feel familiar, which means people are in cognitive ease.

System 1 is gullible and biased to believe; System 2 is in charge of doubting and unbelieving, but System 2 is sometimes busy and often lazy. People are more likely to be influenced by empty persuasive messages, such as commercials, when they are tired and depleted.

Halo effect — The tendency to like (or dislike) everything about a person — including things you have not observed. It exaggerates the idea that good people do only good things and bad people are all bad.

The sequence in which we observe characteristics of a person matters because the halo effect increases the weight of first impressions. An open discussion gives too much weight to the opinions of those who speak early and assertively, causing others to line up behind them.

WYSIATI — What you see is all there is. We don’t wait for more information; we build a story from the information available to us, and if it is a good story, we believe it. System 1 is designed to jump to conclusions from little evidence — and it is not designed to know the size of its jumps. The world makes sense to us because of our almost unlimited ability to ignore our ignorance.

Heuristics and biases

System 1 automatically identifies causal connections between events.

We are pattern seekers, believers in a coherent world. We reject the idea that much of what we see in life is random.

The anchoring effect is a cognitive bias that describes the common human tendency to rely too heavily on the first piece of information offered (the “anchor”) when making decisions. If you consider how much you should pay for a house, you will be influenced by the asking price. Any number that you are asked to consider as a possible solution to an estimation problem will induce an anchoring effect. You can’t imagine how you would have thought if the anchor had been different, so you should activate your System 2 to combat the effect.

System 1 understands a sentence by first trying to believe it, which makes us prone to hold strongly onto whatever we already believe.

The world in our heads is not a precise replica of reality; our expectations about the frequency of events are distorted by the prevalence and emotional intensity of the messages to which we are exposed. Estimates of causes of death are warped by media coverage. For example: Death by accident was judged to be more than 300 times more likely than death by diabetes, but the true ratio is 1:4. The danger is exaggerated as the media compete for attention-grabbing headlines. People who try to dampen the increasing fear attract little attention. Fear is painful, and policymakers should protect the public from fear, not only from real dangers.

Base rates matter. Intuitive impressions of the diagnosticity of evidence are often exaggerated. When the evidence is weak, one should stick with the base rates.

“This start-up looks as if it could not fail, but the base rate of success in the industry is extremely low. How do we know this case is different? ”
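To see why base rates dominate weak evidence, here is a tiny Bayes’ rule sketch in Python. The 10% base rate and the likelihood ratio are my own illustrative assumptions, not figures from the book; the point is only that mildly favorable evidence barely moves a low base rate.

```python
# A tiny Bayes' rule sketch with made-up numbers: evidence that merely
# "looks promising" barely moves a very low base rate of success.

def posterior(base_rate: float, likelihood_ratio: float) -> float:
    """P(success | evidence) from prior odds and a likelihood ratio."""
    prior_odds = base_rate / (1 - base_rate)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

base_rate = 0.10        # hypothetical: ~10% of start-ups in the industry succeed
weak_evidence_lr = 2.0  # "this one looks great" is only mildly diagnostic

print(f"Posterior: {posterior(base_rate, weak_evidence_lr):.0%}")  # ~18%, still far from "can't fail"
```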

The conjunction fallacy is a reasoning error in which a conjunction of two possible events seems more likely than one of the events alone. Adding detail to a scenario makes it more persuasive, but less likely to come true.

Conjunction Fallacy
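To make the logic concrete, here is a minimal Python sketch using the book’s famous Linda example with probabilities I invented: whatever numbers you pick, the conjunction can never beat either of its parts.

```python
# The conjunction fallacy in numbers (probabilities are invented, for illustration only):
# for any two events, P(A and B) <= P(A), so the more detailed story is never more likely.

p_bank_teller = 0.05                 # P(Linda is a bank teller)
p_feminist_given_teller = 0.30       # P(Linda is a feminist | she is a bank teller)

p_teller_and_feminist = p_bank_teller * p_feminist_given_teller

print(f"P(bank teller)              = {p_bank_teller:.3f}")
print(f"P(bank teller AND feminist) = {p_teller_and_feminist:.3f}")  # always smaller
```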

Less is more. Logically, adding a positively valued item to a set can only increase its value, yet in our judgment, adding a cheap gift to an expensive product makes the whole deal feel less attractive.

An interesting experiment: participants sat in individual booths and spoke over an intercom, and one of them appeared to have a seizure and asked for help. Only four of the fifteen participants responded immediately to the appeal for help. Six never left their booth, and five others came out only after the “seizure victim” apparently choked. The experiment shows that individuals feel relieved of responsibility when they know that others have heard the same request for help. Most of us think of ourselves as decent people who would rush to help in such a situation, but we actually expect others to take on the unpleasantness of dealing with a seizure.

Regression to the mean describes how variables much higher or lower than the mean become closer to the mean when measured a second time. Poor performance is typically followed by improvement and good performance by deterioration, without any help from either praise or punishment.

above-average score on day 1 = above-average talent + lucky on day 1
below-average score on day 1 = below-average talent + unlucky on day 1

  • The golfer who did well on day 1 is likely to be successful on day 2 as well, but less so than on the first day, because the unusual luck he probably enjoyed on day 1 is unlikely to hold.
  • The golfer who did poorly on day 1 will probably still be below average on day 2, but will improve, because his bad luck is not likely to continue (see the small simulation sketch right after this list).
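Here is a minimal Python sketch of the “score = talent + luck” idea. The numbers are my own toy assumptions, not data from the book: the players who top the leaderboard on day 1 were, on average, lucky that day, so as a group they drift back toward the mean on day 2, with no praise or punishment involved.

```python
# Toy simulation of regression to the mean: score = stable talent + fresh luck each day.
# (Higher score = better here, just to keep the sketch simple.)
import random

random.seed(0)
n = 10_000
talent = [random.gauss(70, 3) for _ in range(n)]      # stable ability
day1 = [t + random.gauss(0, 3) for t in talent]       # ability + day-1 luck
day2 = [t + random.gauss(0, 3) for t in talent]       # ability + fresh day-2 luck

# Pick the top 5% of day-1 performers.
top = sorted(range(n), key=lambda i: day1[i], reverse=True)[: n // 20]

avg = lambda xs: sum(xs) / len(xs)
print(f"Overall mean:               {avg(day1):.1f}")
print(f"Top group's day-1 average:  {avg([day1[i] for i in top]):.1f}")
print(f"Same group's day-2 average: {avg([day2[i] for i in top]):.1f}")  # closer to the overall mean
```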

Extreme predictions and a willingness to predict rare events from weak evidence are both manifestations of System 1. Your intuitions will deliver predictions that are too extreme and you will be inclined to put too much faith in them. Try to activate your System 2 to act against your intuitions.

“Our intuitive prediction is very favorable, but it is probably too high. Let’s take into account the strength of our evidence and regress the prediction toward the mean.”

Overconfidence

We tend to exaggerate our ability to forecast the future, which fosters optimistic overconfidence. Our tendency to construct and believe coherent narratives of the past makes it difficult to accept the limits of our forecasting ability. Errors of prediction are inevitable because the world is unpredictable.

Everything makes sense in hindsight.

People often take on risky projects because they are overly optimistic about the odds they face. The inside view attempts to make predictions based on an understanding of the details of a problem, and the outside view looks at similar past situations and predicts based on those outcomes.

The planning fallacy describes plans and forecasts that:

  • are unrealistically close to best-case scenarios
  • could be improved by consulting the statistics of similar cases

“She’s assuming the best-case scenario, but there are too many different ways for the plan to fail, and she cannot foresee them all.”

Choices

Which one do you choose for each problem?

Problem 1: Get $900 for sure OR 90% chance to get $1,000

Problem 2: Lose $900 for sure OR 90% chance to lose $1,000

You probably chose the sure thing in problem 1 and the gamble in problem 2. People become risk-seeking when all the options are bad.
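A quick bit of arithmetic (mine, not the book’s) shows what makes this flip interesting: in both problems the sure thing and the gamble have exactly the same expected value, so the switch from risk aversion to risk seeking is driven purely by whether the outcomes are framed as gains or losses.

```python
# Expected values of the gambles in Problems 1 and 2.
# Both match their sure-thing counterparts, so expected value alone
# cannot explain why most people take the sure $900 but gamble on the -$1,000.

ev_gamble_gain = 0.90 * 1_000      # 900.0  == the sure gain of $900
ev_gamble_loss = 0.90 * -1_000     # -900.0 == the sure loss of $900

print(ev_gamble_gain, ev_gamble_loss)
```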

Risk aversion is the tendency to avoid risk — The investor who chooses the preservation of capital over the potential for a higher return.

Loss aversion. The pain of losing is more powerful than the pleasure of gaining. We are driven more strongly to avoid losses than to achieve gains.

Decision-makers tend to prefer the sure thing over the gamble (they are risk averse) when the outcomes are good. They tend to reject the sure thing and accept the gamble (they are risk-seeking) when both outcomes are negative.

The brain is designed to give priority to bad news. Bad words attract attention faster than happy words. The success of a relationship depends more on avoiding the negative than on seeking the positive.

Availability cascade. An extremely vivid image of death and damage, constantly reinforced by media attention and frequent conversations, becomes highly accessible. This is how terrorism works and why it is so effective. System 2 may know that the probability is low, but this knowledge doesn’t eliminate the self-generated discomfort and the wish to avoid it.

People overestimate the probabilities of unlikely events and overweight unlikely events in their decisions.

People expect to have stronger emotional reactions to an outcome that is produced by action than to the same outcome when it is produced by inaction. This has been verified in the context of gambling: people expect to be happier if they gamble and win than if they refrain from gambling and get the same amount.

We spend much of our day anticipating, and trying to avoid, the emotional pains we inflict on ourselves.

Regret and hindsight bias tend to come together, so try to guard against hindsight when making decisions with long-term consequences.

Emotional framing. Consider the statements “The one-month survival rate is 90%” and “There is 10% mortality in the first month.” Even though the two sentences describe the same situation, you will feel the former is better and the latter worse. System 1 is hugely influenced by emotional words: mortality is bad, survival is good, and 90% surviving sounds encouraging whereas 10% mortality is frightening.

Two selves

In the book’s example, two patients reported their level of pain every 60 seconds during a colonoscopy. Patient A’s procedure was short and ended at a moment of intense pain; patient B’s lasted much longer, but the pain near the end was mild.

Which patient suffered more?

Of course, patient B had the worse time, since his procedure lasted far longer. When the procedure was over, all participants were asked to rate “the total amount of pain” they experienced. Here are the two findings:

  • Peak-end rule: The rating was predicted by the average level of pain reported at the worst moment of the experience and at its end.
  • Duration neglect: The duration of the procedure had no effect whatsoever on the ratings of total pain.

The worst rating was the same for both patients, but the last rating before the end of the procedure was 7 for patient A and only 1 for patient B. The peak-end average was therefore 7.5 for patient A and only 4.5 for patient B. Patient A retained a much worse memory of the episode than patient B.
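The arithmetic above fits in one tiny function. Here is a sketch using the ratings quoted above (the shared peak of 8 follows from the 7.5 and 4.5 averages): remembered pain never looks at how long the episode lasted.

```python
# Peak-end rule as a formula: remembered pain ~ average of the worst moment and the end.
# Duration never appears in the formula, which is exactly the duration neglect described above.

def remembered_pain(peak: float, end: float) -> float:
    return (peak + end) / 2

print(remembered_pain(peak=8, end=7))  # Patient A: 7.5 -> the worse memory
print(remembered_pain(peak=8, end=1))  # Patient B: 4.5
```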

The experiencing self is the one that answers the question: “Does it hurt now?” The remembering self answers the question: “How was it, on the whole?” The experiencing self does not have a voice. The remembering self is sometimes wrong, but it is the one that keeps score and governs what we learn from living and it is the one that makes decisions.

We want pain to be brief and pleasure to last. But our memory, a function of System 1, has evolved to represent the most intense moment of an episode of pain or pleasure (the peak) and the feelings when the episode was at its end. A memory that neglects duration will not serve our preference for long pleasure and short pains.

The neglect of duration combined with the peak-end rule causes a bias that favors a short period of intense joy over a long period of moderate happiness.

“You are thinking of your failed marriage entirely from the perspective of the remembering self. The fact that it ended badly does not mean it was all bad.”

Conclusion

The author, Daniel Kahneman, winner of the Nobel Memorial Prize in Economic Sciences, carried out a great amount of research and many experiments throughout his life with his partner Amos Tversky, who died before the release of this book. It is very interesting to understand how and why we make the choices we make. Whether you’re a psychologist or an ordinary person, we are all influenced by the halo effect, our false intuitions, our lazy System 2, and the many other effects explained in the book. By knowing these effects, we can change (or at least try to change) the way we think, and ignore the media that always highlight bad and rare events.

I left out some important concepts such as the mental shotgun, intensity matching, and the endowment effect, so if you are interested, check them out too. If you read this far without skipping, I’m proud of you, and thank you so much!

~2023.08.26 bored nana in a hospital room
