Thinking, Fast and Slow, by Daniel Kahneman

From LessWrong

This is a summary of Daniel Kahneman's Thinking, Fast and Slow by LessWrong user Gleb_Tsiupursky. It contains his extensive notes on the book, along with his assessment of it and of its usefulness to him. Feel free to optimize the article based on your own notes as well.

Daniel Kahneman, Thinking, Fast and Slow (New York: Farrar, Straus and Giroux, 2011)

Thesis:

The author explains the two systems that drive the way we think. System 1 is fast, intuitive, and emotional; it operates automatically and quickly, with little or no effort and no sense of voluntary control. System 2 is slower, more deliberative, and more logical. It allocates attention to the effortful mental activities that demand it.

The operations of System 2 are often associated with the subjective experience of agency, choice, and concentration. When we think of ourselves, we identify with System 2, the conscious, reasoning self that has beliefs, makes choices, and decides what to think about and what to do. Yet this is hardly the case. System 1 runs automatically and System 2 is normally in a comfortable low-effort mode, in which only a fraction of its capacity is engaged. System 1 continuously generates suggestions for System 2: impressions, intuitions, intentions, and feelings. If endorsed by System 2, impressions and intuitions turn into beliefs, and impulses turn into voluntary actions. When all goes smoothly, which is most of the time, System 2 adopts the suggestions of System 1 with little or no modification.

When System 1 runs into difficulty, it calls on System 2 to support more detailed and specific processing that may solve the problem of the moment. System 2 is also activated when an event is detected that violates the model of the world that System 1 maintains. System 2 is also credited with the continuous monitoring and control of your own behavior. System 2 is mobilized to increased effort when it detects an error about to be made. System 2 has some ability to change the way System 1 works, by programming the normally automatic functions of attention and memory. The division of labor between System 1 and System 2 is highly efficient: it minimizes effort and optimizes performance. The arrangement works well most of the time because System 1 is generally very good at what it does. System 1 has biases, however, systematic errors that it is prone to make in specified circumstances. One of the tasks of System 2 is to overcome the impulses of System 1. In other words, System 2 is in charge of self-control. In summary, most of what you (your System 2) think and do originates in your System 1, but System 2 takes over when things get difficult, and it normally has the last word.

Because System 1 operates automatically and cannot be turned off at will, errors of intuitive thought are often difficult to prevent. Biases cannot always be avoided, because System 2 may have no clue to the error. Even when cues to likely errors are available, errors can be prevented only by the enhanced monitoring and effortful activity of System 2. As a way to live your life, however, continuous vigilance is not necessarily good, and it is certainly impractical. The best we can do is a compromise: learn to recognize situations in which mistakes are likely and try harder to avoid significant mistakes when the stakes are high.

Brief Summary

In this work the author, a recipient of the Nobel Prize in Economic Sciences for his seminal work in psychology that challenged the rational model of judgment and decision making, has brought together his many years of research and thinking in one book. The author aims to introduce into everyday conversations a better understanding of the nature of and the systematic errors in our judgment, choice, and behavior. The author explains the two systems that drive the way we think. System 1 is fast, intuitive, and emotional; it operates automatically and quickly, with little or no effort and no sense of voluntary control. System 2 is slower, more deliberative, and more logical. It allocates attention to the effortful mental activities that demand it. The operations of System 2 are often associated with the subjective experience of agency, choice, and concentration. When we think of ourselves, we identify with System 2, the conscious, reasoning self that has beliefs, makes choices, and decides what to think about and what to do. Although System 2 believes itself to be where the action is, the automatic System 1 is the hero of the book. This puts our general impression of human beings as rational actors, prevalent in economics and, more broadly, in public discourse, at odds with reality.

System 1 runs automatically and System 2 is normally in a comfortable low-effort mode, in which only a fraction of its capacity is engaged. System 1 is an associative, meaning-making machine, which offers a tacit interpretation of what happens to you and around you, linking the present with the recent past and with expectations about the near future. It contains the model of the world that instantly evaluates events as normal or surprising, and provides the source of your rapid intuitive judgments. System 1 continuously generates suggestions for System 2: impressions, intuitions, intentions, and feelings. These are often endorsed by System 2, and as a result impressions and intuitions turn into beliefs, and impulses turn into voluntary actions. When all goes smoothly, which is most of the time, System 2 adopts the suggestions of System 1 with little or no modification. You generally believe your impressions and act on your desires, and that is fine—usually. When System 1 runs into difficulty, it calls on System 2 to support more detailed and specific processing that may solve the problem of the moment. System 2 is mobilized when a question arises for which System 1 does not offer an answer. You can also feel a surge of conscious attention whenever you are surprised. System 2 is activated when an event is detected that violates the model of the world that System 1 maintains. System 2 is also credited with the continuous monitoring of your own behavior—the control that keeps you polite when you are angry, and alert when you are driving at night. System 2 is mobilized to increased effort when it detects an error about to be made. In summary, most of what you (your System 2) think and do originates in your System 1, but System 2 takes over when things get difficult, and it normally has the last word. System 2 has some ability to change the way System 1 works, by programming the normally automatic functions of attention and memory.

The division of labor between System 1 and System 2 is highly efficient: it minimizes effort and optimizes performance. The arrangement works well most of the time because System 1 is generally very good at what it does: its models of familiar situations are accurate, its short-term predictions are usually accurate as well, and its initial reactions to challenges are swift and generally appropriate. System 1 has biases, however, systematic errors that it is prone to make in specified circumstances. System 1 uses simplifying heuristics, rules of thumb, to make difficult judgments, which causes predictable biases in its predictions. System 1 relies to a great extent on emotions, so these play a surprisingly important role in our judgments, choices, and decisions, an example of what is called the affect heuristic, where judgments and decisions are guided directly by feelings of like and dislike, with little deliberation or reasoning. An essential element of System 1 intuitive heuristics is that when faced with a difficult question, we often answer an easier one instead, usually without noticing the substitution, and sometimes end up with a mistaken answer to the question posed. System 1 easily thinks associatively, metaphorically, and causally, but it has little understanding of logic and statistics, and frequently makes mistakes in judgment about these. System 1 encourages us to be overconfident in what we know, and fails to acknowledge the full extent of our ignorance and uncertainty about the world, as well as the role of chance events. One further limitation of System 1 is that it cannot be turned off. One of the tasks of System 2 is to overcome the impulses and habitual patterns of System 1. In other words, System 2 is in charge of self-control.

The defining feature of System 2 is that its operations are effortful, and one of its main characteristics is laziness, a reluctance to invest more effort than appears strictly necessary. A general “law of least effort” applies to cognitive exertion, which is part of a broader pattern of human actions. The law asserts that if there are several ways of achieving the same goal, people will gravitate to the least demanding course of action – and this applies to cognitive effort as well as other forms of human activity. We normally avoid mental overload by dividing our tasks into multiple easy steps, committing intermediate results to long-term memory or to paper rather than to an easily overloaded working memory. System 2 effort is required in a variety of activities, for example maintaining simultaneously in memory several ideas that require separate actions, or that need to be combined according to a specific rule. System 2 is the only one that can follow rules, compare objects on several attributes, and make deliberate choices between options. System 2 effort is required to reprogram System 1 to avoid habitual practices and impulses. Switching from one task to another is effortful for System 2, especially under time pressure. As a consequence, the thoughts and actions that System 2 believes it has chosen are often guided by System 1, such as answering easier questions than the ones actually asked.

The often-used phrase “pay attention” is apt: you dispose of a limited budget of attention that you can allocate to activities, and if you try to go beyond your budget, you will fail. It is the mark of effortful activities that they interfere with each other, which is why it is difficult or impossible to conduct several effortful activities at once. All variants of voluntary effort – cognitive, emotional, physical – draw at least partly on a shared pool of mental energy. People who are challenged simultaneously by demanding cognitive tasks and by temptation are more likely to yield to the temptation. Furthermore, people who are cognitively busy are more likely to make selfish choices, use sexist language, make superficial judgments in social situations, and generally be less thoughtful. Experiments have shown that an effort of will or self-control is tiring. If you have to force yourself to do something, you are less willing or able to exert self-control when the next challenge comes around, a phenomenon called ego depletion. In other words, activities that impose high demands on System 2 require self-control, and the exertion of self-control is depleting and unpleasant. Ego depletion is at least in part a result of the loss of motivation: after exerting self-control in one task, you are less motivated to make an effort in another, although you could if you were given a strong incentive to do so. The nervous system consumes more glucose than most other parts of the body, and effortful mental activity appears to be especially expensive in the currency of glucose. Thus, experiments have shown that the effects of ego depletion can be undone by eating glucose. Fortunately, mental work is not always so straining, and people sometimes expend considerable effort for long periods of time without having to exert extensive willpower, a state that has received the term flow. Flow separates two forms of effort, concentration on the task and the deliberate control of attention, with the state of flow involving the former without the effort of the latter.

People frequently engage in lazy thinking, putting too much faith and confidence in their intuition, and avoiding use of System 2 thinking. They find cognitive effort somewhat unpleasant and avoid it as much as possible. However, these tendencies vary among individuals, and can be changed as a result of deliberate education and effort, especially since they are in part dependent on motivation, or a lack of desire to try hard enough. Those who avoid intellectual laziness could be called more engaged, rational thinkers, being more alert, more intellectually active, less willing to be satisfied with superficially attractive answers, and more skeptical about their intuitions. One psychologist, Keith Stanovich, draws a distinction between two parts of System 2. One deals with slow and effortful thinking and demanding computation. Some people are better than others at this task; they are the individuals who excel on intelligence tests and are able to switch quickly and efficiently from one task to another. Yet such high intelligence does not make people immune to biases. Another ability is involved, what that psychologist labelled rationality and what Kahneman terms being engaged, which is distinct from intelligence as such.

There is a distinction between two selves, the experiencing self and the remembering self. Thus, if we expose individuals to two painful episodes, the episode that is more painful overall because it lasts longer can nevertheless be remembered as better if it ends less painfully. When people later decide which episode to repeat, they are guided by their remembering selves and expose their experiencing selves to unnecessary pain. The same goes for happiness.
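This divergence between the two selves can be sketched as a small calculation. The sketch below assumes the book's "peak-end rule" (the remembering self scores an episode roughly by the average of its worst and final moments, neglecting duration) and a hypothetical 0–10 pain scale; the episode data is illustrative, not from any actual experiment.

```python
# Minimal illustration of the experiencing self vs. the remembering self.
# Assumes the peak-end rule as a model of remembered pain (an assumption
# drawn from Kahneman's account, with made-up example data).

def total_pain(ratings):
    """Experiencing self: pain accumulated over the whole episode."""
    return sum(ratings)

def remembered_pain(ratings):
    """Remembering self (peak-end rule): mean of peak and final pain."""
    return (max(ratings) + ratings[-1]) / 2

short_episode = [8, 8, 8]          # brief, uniformly intense pain
long_episode = [8, 8, 8, 4, 2]     # same start, plus a milder tail

print(total_pain(short_episode), total_pain(long_episode))            # 24 30
print(remembered_pain(short_episode), remembered_pain(long_episode))  # 8.0 5.0
```

The longer episode involves strictly more total pain (30 vs. 24) yet is remembered as less painful (5.0 vs. 8.0), so someone choosing which episode to repeat would pick the one that hurts more overall.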

People generally prefer cognitive ease, and thus to make a stronger impact and have people remember and believe your message, you should strive to minimize cognitive strain in your message. For example, a message that is printed in a clear font, with clear colors, in clear and simple language, in a rhyming verse, and with easily understandable references will make a bigger impact because it will be cognitively easier for people to process. Besides, a message that has been frequently repeated, has been primed, or is heard by people in a good mood makes a stronger impact because it will be cognitively easier for people to process. Cognitively easier messages will feel more true, familiar, good, and effortless, and thus make a stronger impact.

Because System 1 operates automatically and cannot be turned off at will, errors of intuitive thought are often difficult to prevent. Biases cannot always be avoided, because System 2 may have no clue to the error. Even when cues to likely errors are available, errors can be prevented only by the enhanced monitoring and effortful activity of System 2. As a way to live your life, however, continuous vigilance is not necessarily good, and it is certainly impractical. The best we can do is a compromise: learn to recognize situations in which mistakes are likely and try harder to avoid significant mistakes when the stakes are high. The premise of this book is that it is easier to recognize other people’s mistakes than our own.


My Assessment: (personal thoughts on what is relevant for me)


• I find the division into System 1 of fast, automatic, intuitive, constant, and emotional thinking and System 2 of slower, more deliberative, and more logical thinking helpful and thought-provoking.

o I think my own thinking about my thinking is certainly clarified by conceptualizing this division into automatic, intuitive thinking and effortful, conscious thinking processes. It helps explain why and how I live the vast majority of my life through habitual, patterned processes, and only a small proportion by more conscious processes.

o Although I overall find this division helpful and convincing, I do wonder about the seemingly sharp boundaries drawn between these. Perhaps there is less of a sharp break than Kahneman envisions?

• I found helpful the observation that the operations of System 2 are often associated with the subjective experience of agency, choice, and concentration, and that when we think of ourselves, we identify with System 2, namely the conscious, reasoning self that has beliefs, makes choices, and decides what to think about and what to do. Yet as the author points out, System 1 plays a very powerful role in our thinking and behavior, one that is vastly underestimated by us.

o This helps me be more at peace with the irrationalities in my own thinking and behavior. If I perceive my own self and identity as System 2, but actually operate as a combination of System 1 and System 2, then I can more easily envision why I have faults and errors and biases in my thinking.

o This observation by the author also helps me recognize and confirm my earlier impression that human beings are complex, messy creatures, with a mixture of motivations and impulses driving our behavior. If our view of ourselves is as System 2 but we operate as both System 1 and System 2 creatures, this illuminates why our actual conduct far from always matches our stated goals and intentions, or our view of our self-identities.

• I think the author makes an important point when saying that the defining feature of System 2 is that its operations are effortful, and one of its main characteristics is laziness, a reluctance to invest more effort than appears strictly necessary. A general “law of least effort” applies to cognitive exertion, which is part of a broader pattern of human actions. He notes that as a consequence, the thoughts and actions that System 2 believes it has chosen are often guided by System 1, such as answering easier questions than the ones actually asked.

o Recognizing this can be helpful for me in analyzing myself and my patterns of behavior. If my System 2 prefers laziness, then this helps explain why I make cognitive errors in my thinking and engage in habitual patterns of thought and behavior. Knowing this can help motivate me to engage in more thorough System 2 thinking, to put more effort into this area, to find meaning and purpose in doing so, to make it more of a priority in my life.

• I think the author makes a valuable point about ego depletion as a result of effortful System 2 tasks, whether solving a challenging problem or imposing self-control such as resisting temptation or making oneself do something one does not want to do. The fact that ego depletion stems partly from a loss of motivation is also important to know. The finding that one can use glucose to restore depleted mental energy is also valuable.

o This makes me recognize more the importance of monitoring my own ego depletion over time, and to balance my activities to avoid excessive ego depletion. Knowing that ego depletion can be fought by stronger motivation is also helpful.

• I find intriguing and highly plausible the idea that many people frequently engage in lazy thinking, putting too much faith and confidence in their intuition and avoiding effortful System 2 thinking, and that these tendencies vary among individuals, and can be changed as a result of deliberate education and effort, especially since they are in part dependent on motivation, or a lack of desire to try hard enough. I think the statement is fair that those who avoid intellectual laziness could be called more engaged, rational thinkers, being more alert, more intellectually active, less willing to be satisfied with superficially attractive answers, and more skeptical about their intuitions. I also find convincing the idea of a distinction between two parts of System 2, the intelligence-oriented one that deals with slow and effortful thinking and demanding computation, and the one that deals with rationality and engagement, namely paying effortful cognitive attention to problematic System 1 thinking and trying to avoid such problems.

o This finding makes me even more convinced of the value of trying to struggle against lazy System 1 thinking, and avoiding putting excessive faith and confidence in my intuition. Instead, I need to be more mindful of how my habitual, System 1 thinking can lead to systematic errors in judgments, decisions, and behavior, and engage in continual System 2 monitoring of my System 1 thinking, especially in more challenging or difficult decisions. Slow down and give myself time and put effort into making better decisions and judgments on these occasions, especially when I notice that something does not fit my current mental models – this should be a sign for me that my System 1 may be mistaken.

o The distinction between intelligence and rationality should be something I keep in mind. Intelligent people, people who are capable of engaging in extensive and systematic deep thoughts and complex analysis, are still not necessarily going to be rational. They may well be satisfied by superficially attractive answers, engage in emotional thinking, not be skeptical about their intuitions, accept arguments and evidence that is unsound if it supports their beliefs, and engage in other types of cognitive errors. Similarly, people who are rational may not be highly intelligent, in the sense of finding it difficult to engage in extensive and systematic deep thoughts and complex analysis. Of course, intelligence and rationality are more often in tune than not, with intelligent people being more rational and rational people being more intelligent.

o Understanding the difference between rationality and intelligence should make me question my preconceived notions of how I judge people’s mental qualities.

o Remember that people are quite diverse in their engagement with System 2 thinking, and that they differ on a variety of spectrums, for example intelligence and rationality.

o The author has too skeptical a notion of lazy thinking, I believe. I am more optimistic that rationality and engaged, mindful thinking is something that can be taught. Indeed, I see one of my goals in life as improving the critical thinking of my students and the broader public, and critical thinking overlaps greatly with rationality. Intelligence is also something that can be taught and improved, I believe, and is a component of critical thinking, although to a lesser extent than rationality. Intelligence, in the sense of deep thought and complex analysis, is certainly something that students gain from education, especially in college.

• He makes a valuable observation that because System 1 operates automatically and cannot be turned off at will, errors of intuitive thought are often difficult to prevent. Biases cannot always be avoided, because System 2 may have no clue to the error. Even when cues to likely errors are available, errors can be prevented only by the enhanced monitoring and effortful activity of System 2. The best we can do is a compromise: learn to recognize situations in which mistakes are likely and try harder to avoid significant mistakes when the stakes are high.

o This makes me value even more the point of Rationality and of learning about other people’s and my own cognitive biases and failure modes. He has a good notion of what an appropriate compromise is, I think.


Especially Important Materials From Chapters (what I perceived as most important)

• A general “law of least effort” applies to cognitive exertion, which is part of a broader pattern of human actions. System 2 effort is required in a variety of activities, for example maintaining simultaneously in memory several ideas that require separate actions, or that need to be combined according to a specific rule. System 2 is the only one that can follow rules, compare objects on several attributes, and make deliberate choices between options. The automatic System 1 does not have these capabilities. System 1 is good at simple relations and at integrating information about one thing, but it does not deal well with multiple distinct topics at once, nor is it adept at using statistics. A key capability of System 2 is the ability to reprogram System 1 to override habitual responses. A key discovery of cognitive psychologists in recent decades is that switching from one task to another is effortful, especially under time pressure.

• Both self-control and cognitive effort are forms of mental work. People who are challenged simultaneously by demanding cognitive tasks and by temptation are more likely to yield to the temptation and generally be less thoughtful.

• All variants of voluntary effort – cognitive, emotional, physical – draw at least partly on a shared pool of mental energy. Thus, experiments have shown that an effort of will or self-control is tiring. If you have to force yourself to do something, you are less willing or able to exert self-control when the next challenge comes around, a phenomenon called ego depletion. In other words, activities that impose high demands on System 2 require self-control, and the exertion of self-control is depleting and unpleasant. Unlike cognitive load, ego depletion is at least in part a result of the loss of motivation. After exerting self-control in one activity, you are less motivated to make an effort in another, although you could if you really had to.

• Fortunately, mental work is not always so straining, and people sometimes expend considerable effort for long periods of time without having to exert extensive willpower, a state that has received the term flow. Flow separates two forms of effort, concentration on the task and the deliberate control of attention, with the state of flow involving the former without the effort of the latter.

• Surprisingly, the idea of mental energy is more than a metaphor. The nervous system consumes more glucose than most other parts of the body, and effortful mental activity appears to be especially expensive in the currency of glucose. Thus, experiments have shown that the effects of ego depletion can be undone by eating glucose.

• People frequently engage in lazy thinking, putting too much faith and confidence in their intuition, and avoiding use of System 2 thinking. They find cognitive effort somewhat unpleasant and avoid it as much as possible. Furthermore, when people believe a conclusion is true, they are also very likely to believe arguments and evidence that appear to support it, even when these arguments and evidence are unsound.

• However, these tendencies vary among individuals, and can be changed as a result of deliberate education and effort, especially since they are in part dependent on motivation, or a lack of desire to try hard enough. Those who avoid intellectual laziness could be called more engaged, rational thinkers, being more alert, more intellectually active, less willing to be satisfied with superficially attractive answers, and more skeptical about their intuitions. One psychologist, Keith Stanovich, draws a distinction between two parts of System 2. One deals with slow and effortful thinking and demanding computation. Some people are better than others at this task; they are the individuals who excel on intelligence tests and are able to switch quickly and efficiently from one task to another. Yet such high intelligence does not make people immune to biases. Another ability is involved, what that psychologist labelled rationality and what Kahneman terms being engaged, which is distinct from intelligence as such.

• Psychologists think of ideas as nodes in a vast network called associative memory, in which each idea is linked to many others, and in turn linked to emotions and physical responses. Associative activation is a process in which ideas that have been activated trigger many other ideas in a spreading cascade of associative activity in your brain. All of this happens quickly and nearly at once, resulting in a self-reinforcing pattern of cognitive, emotional, and physical responses that is both diverse and integrated, which has been called associatively coherent. Most of these activities happen below the level of consciousness, and are undertaken automatically by our System 1; we know far less about our mental world than we think we do.

• The priming effect refers to how exposure to one idea results in an immediate and measurable change in how easy it is to undertake the cognitive activation of many other associated ideas. This is because of the associative process, where an activated idea prepares your mind to activate other ideas: these ideas, which are now cognitively easier to activate, are called primed ideas. Furthermore, these first-level primed ideas have some ability to prime related ideas, although more weakly. Priming impacts not only concepts, but also emotions and psychosomatic effects. Vice versa, emotions and psychosomatic effects prime ideas. Studies of priming effects show that we are powerfully impacted in our thinking, judgment, choices, and behavior by external factors of which we are not consciously aware, which challenges our self-image as conscious and autonomous authors of our judgments and choices. To some extent, priming can be counteracted if we are aware of the nature of priming effects and their impact on our mental life, and pay attention to priming effects in our environment.

• System 1 provides the impressions that often turn into your beliefs, and it is the source of the impulses that often become your choices and actions. It is an associative, meaning-making machine, which offers a tacit interpretation of what happens to you and around you, linking the present with the recent past and with expectations about the near future. It contains the model of the world that instantly evaluates events as normal or surprising. It is the source of your rapid intuitive judgments, which are often precise but also have systematic errors, and it does most of this without your conscious awareness of its activities.

• Cognitive ease determines a great deal about our mental life. Generally, cognitive ease is a sign that things are going smoothly and there are no problems, threats, or major news, and no need to redirect attention or mobilize effort. Strain indicates that a potential problem exists, which will require increased mobilization of effort and attention, thus System 2 thinking. Our minds are wired in such a way as to minimize cognitive strain and thus avoid activation of System 2, making it in effect a lazy controller.

• People generally prefer cognitive ease, and thus to make a stronger impact and have people remember and believe your message, you should strive to minimize cognitive strain in your message. For example, a message that is printed in a clear font, with clear colors, in clear and simple language, in a rhyming verse, and with easily understandable references will make a bigger impact because it will be cognitively easier for people to process. Besides, a message that has been frequently repeated, has been primed, or is heard by people in a good mood makes a stronger impact because it will be cognitively easier for people to process. Cognitively easier messages will feel more true, familiar, good, and effortless, and thus make a stronger impact.

• The various causes of ease or strain have interchangeable effects. When you are in a state of cognitive ease, you tend to be in a good mood, like what you see, believe what you hear, trust your intuitions and instincts, feel that the current situation is comfortable and familiar, and also be relatively casual and superficial in your thinking. When you are strained, you are more likely to be vigilant and suspicious, invest more effort in what you are doing, feel less comfortable, make fewer errors, but also be less intuitive, less creative, and in a worse mood.

• Predictable systematic mistakes inevitably occur if a judgment is based on an impression of cognitive ease or strain. Anything that makes it easier for the associative machine to run smoothly will also bias beliefs. A reliable way to make people believe in falsehoods is frequent repetition, because familiarity is not easily distinguished from truth, as propagandists and marketers have discovered.

• To figure out whether a statement is true or merely has the illusion of truth due to cognitive ease is challenging but possible. It requires you to apply System 2 thinking to examine whether the statement is linked strongly by logic to other beliefs or preferences you hold, or comes from a source that you trust.

• The experience of cognitive ease is characteristic of System 1 thinking, while cognitive strain results in the (reluctant) mobilization of System 2 thinking. For example, people make fewer mistakes when messages are harder to read because they are printed in a smaller and less clear font: they engage more in System 2 thinking to process the content.

• The mere exposure effect refers to the fact that repetition results in cognitive ease, inducing a comfortable sense of familiarity that makes the repeated message appear true. Mere exposure functions even when the person is not aware of being exposed to a stimulus, showing that it happens within System 1.

• We have norms for a vast number of categories, and these norms provide the background for the immediate detection of anomalies. This forms the basis of stereotypical, normalizing thinking.

• System 1 is adept at finding a coherent causal story that links the fragments of knowledge at its disposal, sometimes incorrectly so. The automatic search for causes shapes our thinking in powerful and sometimes problematic ways.

• Human brains are wired to form impressions of cause and effect, which do not depend on reasoning about patterns of cause and effect but are a consequence of System 1 thinking about norms and causes. The result is impressions of causality rather than analytic understandings of causality based on logical argumentation.

• Your mind is ready and even eager to identify agents, assign them personality traits and specific intentions, and view their actions as expressing individual propensities. This is why stories make such strong impressions on so many people.

• People are wired for causal thinking and inappropriately apply it to situations that require statistical reasoning. Unfortunately, System 1 does not have the capacity for statistical reasoning, and while System 2 can learn to think statistically, few people receive the necessary training or undertake the necessary effort.

• When information is scarce, System 1 operates as a machine for jumping to conclusions. The combination of a coherence-seeking System 1 with a lazy System 2 implies that System 2 will endorse many intuitive beliefs based on impressions generated by System 1. In the process of jumping to conclusions, System 1 often makes a definitive choice, and you are aware only of the option it presents, without being aware of the original ambiguity or the possibility of multiple interpretations. System 1 does not keep track of the alternatives it rejects, or even of the fact that there were alternatives.

• System 1 is gullible and biased to believe; System 2 is in charge of doubting and unbelieving. If System 2 is otherwise engaged, or simply because of its inherent laziness, we are much more gullible and biased to believe messages.

• Confirmation bias refers to how we seek data that are likely to be compatible with the beliefs we currently hold, and avoid or reject information that does not conform to our beliefs.

• The halo effect refers to the tendency that if you like one thing about a person or concept, you will be more likely to like its other qualities as well, accepting evidence and arguments that indicate positive things about that person or concept and rejecting ones that do not.

• Decorrelating error refers to the need for a large sample size or for diverse, independent inputs in order to minimize errors.

• Our associative machine represents only activated ideas. Information that is not retrieved from memory, consciously or unconsciously, might as well not exist.

• System 1 excels at constructing the best possible story that incorporates the ideas currently activated, but it does not allow for information it does not have. The measure of success for System 1 is the coherence of the story it manages to create; the amount and quality of the data on which the story is based are largely irrelevant to it. This is why stories make such a powerful impact, and why we tend to jump to conclusions based on the stories we create, as opposed to analytical reasoning.

• The author created an abbreviation for jumping to conclusions on the basis of limited evidence: WYSIATI, what you see is all there is. WYSIATI explains a long and diverse list of biases, including:
o Overconfidence: neither the quantity nor the quality of the evidence counts for much in the subjective confidence that results from System 1 thinking. The overconfidence individuals have depends mostly on the quality and coherence of the story they can tell about what they observe, as human minds do not naturally consider the possibility that key evidence might be missing.
o Framing effects: different ways of presenting the same information evoke different emotions and associative chains, resulting in improper conclusions.
o Base-rate neglect: System 1 tends to focus on the new evidence presented to it and forgets about the prior rate of occurrence.

• System one continuously monitors what is going on outside and inside the mind, and continuously generates assessments of various aspects of the situation without specific intention and with little or no effort. These basic assessments play an important role in intuitive judgment, because they are easily substituted for more difficult questions – this is the essential idea of the heuristics and biases approach.

• System 1 represents categories by a prototype, a set of typical examples, or an average, but it deals poorly with sums and with overall estimation of quantities in categories. System 1 easily makes comparisons and judges intensity, although it makes systematic errors in statistical and computational reasoning that often make these judgments problematic. Thus, emotional and intuitive reactions are often poorly correlated with numerical quantities and statistics, but better correlated with averages and somewhat correlated with comparisons.

• Substitution refers to the operation by which System 1 substitutes an easier, related question for a harder question that requires cognitive effort to answer. System 1 processes often make available answers to easy questions that can be mapped onto the harder target questions. On some occasions, substitution will occur and a heuristic answer will be endorsed by System 2 and form the basis for judgment, belief, and action. While System 2 has the opportunity to reject this substitution, the cognitive effort required leads many people to engage in lazy thinking and not scrutinize it, with many not even noticing that a substitution occurred, and perhaps not even realizing that the target question was difficult, because an intuitive answer to it came readily to mind.

• The affect heuristic refers to how people let their likes and dislikes, their attitudes, determine their beliefs about reality and the accuracy of information and arguments. System 2 functions in many cases as more of an apologist and justifier for the emotional and intuitive impressions of System 1 than as a critic of System 1 thinking.

• General characteristics of System 1:
o generates impressions, feelings, and inclinations; when endorsed by System 2, these become attitudes, intentions, and voluntary actions
o operates automatically and quickly, with no sense of effort or voluntary control
o can be reprogrammed by System 2 with effort
o after adequate training to gain expertise, it can execute skilled responses and intuitions
o creates a coherent pattern of activated ideas in associative memory
o links a sense of cognitive ease to illusions of truth, pleasant feelings, and reduced vigilance
o distinguishes the surprising from the normal
o infers and invents causes and intentions
o neglects ambiguity and suppresses doubt
o is biased to believe and confirm
o exaggerates emotional consistency, resulting in the halo effect
o focuses on existing evidence and ignores absent evidence: what you see is all there is, WYSIATI
o generates a limited set of basic assessments
o represents sets by norms and prototypes; does not integrate or engage well in computational or statistical thinking
o neglects base rates and focuses on new evidence – base-rate neglect
o matches intensities across scales and categories – intensity matching
o computes more than intended – mental shotgun
o can substitute an easier question for a difficult one – heuristics
o is more sensitive to changes than to states – prospect theory
o overweights low probabilities and underestimates the impact of small sample sizes and the benefit of decorrelating errors
o shows diminishing sensitivity to quantity – psychophysics
o responds more strongly to losses than to gains – loss aversion
o frames decision problems narrowly, in isolation from one another

• Small samples yield extreme results more often than large samples do; therefore large samples are more precise than small samples. Yet even if we are aware of this, our System 1 thinking is inherently not computational and statistical but coherence-seeking and story-oriented, so even small samples get treated as adequate evidence that causes us to believe the content of messages. The strong bias toward believing that small samples closely resemble the population from which they are drawn is part of a larger pattern: our minds are prone to exaggerating the consistency and coherence of what we see. System 1 inevitably runs ahead of the facts to construct a rich image and cohesive story on the basis of inadequate scraps of evidence, jumping to conclusions and producing a representation of reality that makes too much sense.
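The claim that small samples produce extreme results more often can be checked directly with a short simulation (my own illustrative sketch, not from the book): drawing from a population whose true rate is 50%, small samples wander to "extreme" proportions far more often than large ones.

```python
import random

random.seed(0)

def extreme_rate(sample_size, trials=10_000, threshold=0.7):
    """Fraction of samples whose observed proportion strays to
    >= threshold or <= 1 - threshold, when the true rate is 0.5."""
    extreme = 0
    for _ in range(trials):
        hits = sum(random.random() < 0.5 for _ in range(sample_size))
        share = hits / sample_size
        if share >= threshold or share <= 1 - threshold:
            extreme += 1
    return extreme / trials

small = extreme_rate(10)    # small samples: extreme outcomes are common
large = extreme_rate(100)   # large samples: extreme outcomes are rare
print(small, large)
```

Running this shows the small-sample rate is orders of magnitude above the large-sample rate (roughly a third of 10-draw samples land at 70/30 or worse, versus almost none of the 100-draw samples), which is exactly the variability that System 1's coherence-seeking explanations paper over.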

• The associative machinery of System 1 seeks causes. The difficulty we have with statistical regularities is that they call for a different approach, informed by statistical, computational, and chance-oriented thinking. It is very challenging for System 1 to acknowledge that nothing in particular caused something to be what it is, and that chance selected it from among a variety of alternatives: System 1 engages in pattern seeking and coherence making, and this predilection for causal, story-, and pattern-oriented thinking exposes us to serious mistakes in evaluating random events. System 1 does not expect to see regularities produced by a random process, despite the statistical likelihood of such patterns occurring. As a result, when we detect what appears to be a pattern, System 1 quickly rejects the idea that the process is truly random and searches for a causal explanation. We are far too willing to reject the belief that much of what we see in life is random.

• Unless a message is immediately negated in our minds, the associations it evokes will spread in the mind as if the message were true.

• System 2 is capable of doubt and uncertainty, but sustaining doubt is harder mental work than sliding into certainty, which is characteristic of System 1 thinking.

• The exaggerated faith in small samples is one example of a more general systematic error: we pay more attention to the content of messages than to information about their reliability, and as a result end up with a view of the world that is simpler and more coherent than the data justify.

• The anchoring effect occurs when people consider a particular value for an unknown quantity before estimating that quantity, which results in the estimate staying close to the number considered. Anchoring functions through two mechanisms.

o One is an adjust-and-anchor heuristic, a strategy for estimating an uncertain quantity in which you start from a given number, assess whether it is too high or too low, and gradually adjust your estimate by mentally moving away from the anchor. The adjustment typically ends prematurely, because people stop when they are no longer certain that they should move further. This implies that in an anchoring situation we should move further than we initially think we should; insufficient adjustment is a failure of a lazy System 2.

o Anchoring also occurs as a priming effect, based on suggestion. System 1 understands an idea by trying to make it true, and even if the idea is ultimately rejected as false, compatible ideas are activated that subsequently influence our estimate of reality.

o Fighting anchoring requires a deliberate activation of System 2 to make an immediate argument that rejects the anchor. You should assume that any number originally presented has an anchoring effect on you, whether you are aware of it or not, and if you decide the stakes are worthwhile, you need to mobilize System 2 to struggle with the anchor.

• The availability heuristic is the process of judging frequency and significance by the ease with which instances come to mind. The availability heuristic substitutes an easier question for another: you wish to estimate the size of a category or the frequency of an event, but you report an impression of the ease with which instances come to mind.
o A salient event that attracts your attention will be easily retrieved from memory.
o A dramatic event temporarily increases the availability of its category.
o Personal experiences, images, and vivid examples are more available than incidents that happen to others, words, or statistics. This is why stories, images, and clear examples are good tools of communication.

• Awareness of availability and other biases can contribute to success in joint projects, relationships, and team dynamics. One problematic bias in team dynamics is that many members of a collaborative team tend to feel that they have done more than their share and that the others are not adequately grateful for their individual contributions. Countering this requires thanking and acknowledging others more than feels natural and appropriate. Also remember that many team members are likely to feel that they do more than their fair share: most people tend to feel above average.

• Above average bias: most people tend to feel above average.

• Ratings are dominated by the ease with which examples come to mind, and the experience of fluent retrieval of instances is more important than the number retrieved for assessment. In other words, people who are asked to retrieve more instances of something and find it increasingly hard to do so will end up being less confident in their actual judgment of the rating, despite retrieving more instances.

• The world in our heads is not a precise replica of reality: our expectations about the frequency of events are distorted by the prevalence and emotional intensity of the messages to which we are exposed, especially through the media. This is the availability heuristic at work.

• The affect heuristic is an instance of substitution, in which an answer to an easy question – how do I feel about it? – serves as an answer to a much harder question – how should I think about it logically? The affect heuristic causes people to link the benefits and the risks of something, assuming that things with more benefits have fewer risks and vice versa, despite that generally not being the way the world works. Consistent affect is an essential element of associative coherence and of a simplified view of the world that is much tidier than actual reality, characteristic of System 1 thinking.

• People tend to be guided by emotion rather than reason, easily swayed by trivial details, and inadequately sensitive to differences between low and negligibly low probabilities. Experts show many of the same biases as the rest of us, though in weaker form, and they are also influenced by their own professional and disciplinary distortions.

• The evaluation of risk depends on the choice of a measure, with the possibility that the choice may be guided by a preference for one outcome over another.

• The availability cascade is a self-sustaining chain of events in which biases flow into public policy: often it starts from media reports of a relatively minor event that gets overblown and leads to public panic and resultant government action.

• Probability neglect refers to how we either tend to ignore small risks altogether or give them far too much weight, distorting our judgments.

• Representativeness refers to the similarity of a description to our typical stereotype image.

• When asked to calculate probability, people tend to substitute a judgment of representativeness instead of considering other valuable information, most notably base rates. They also tend to ignore the quality of evidence in making probability assessments.

• Generally, anchor your judgment of the probability of an outcome on a plausible base rate. Also, constantly question the predictive value of your evidence. When you have doubts about the quality of the evidence, let your judgments of probabilities stay close to the base rate. The human mind is wired to discount the quality of the evidence, so this will require a deliberate system two effort, combining self-monitoring and control.
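The advice to anchor judgments on base rates can be made concrete with Bayes' rule. Below is a minimal sketch; the numbers are taken from the well-known cab problem Kahneman discusses (15% of cabs are Blue, a witness is 80% reliable), but the code itself is my own illustration, not from the book.

```python
def posterior(base_rate, hit_rate, false_alarm_rate):
    """Bayes' rule: probability a hypothesis is true given a positive report.

    base_rate        -- prior probability of the hypothesis (the base rate)
    hit_rate         -- P(report | hypothesis true)
    false_alarm_rate -- P(report | hypothesis false)
    """
    true_pos = base_rate * hit_rate
    false_pos = (1 - base_rate) * false_alarm_rate
    return true_pos / (true_pos + false_pos)

# Cab problem: 15% of cabs are Blue; the witness identifies colors
# correctly 80% of the time (and so misidentifies 20% of the time).
p = posterior(base_rate=0.15, hit_rate=0.80, false_alarm_rate=0.20)
print(round(p, 2))  # 0.41
```

Intuition substitutes the witness's reliability (80%) for the answer, but weighting the low base rate properly drags the probability down to about 41% — which is why judgments should stay close to the base rate when the evidence is of doubtful quality.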

• Representativeness belongs to a cluster of closely related basic assessments that are likely to be generated together. The most representative outcomes create the most coherent stories, and the most coherent stories are not necessarily the most probable, but they are the most plausible-seeming to System 1, confusing those who do not apply System 2 thinking to probability. Thus, in forecasting the future, adding details to scenarios makes them more persuasive but less likely to be an accurate description of the future.

• Stereotypes are statements about a group that are at least tentatively accepted as facts about every member. These statements are readily interpreted as setting up a tendency in individual members of the group, and they easily fit a causal story, making them powerful explanatory models that shape the way we think. Stereotypes emerge from System 1 thinking, which inherently represents categories as norms, averages, and prototypical examples. When we think of a group, we do not inherently consider the variety of individuals within that group, but hold a particular understanding of a "normal" member of the category. As a society, we have developed a social norm against stereotyping based on certain standards, which has been beneficial in creating a more civilized and humane society, but it does have a cost in resulting in suboptimal judgments.

• Problematically, studies show that when people, including college students, learn new broadly-formulated, statistically-verified information, they generally fail to draw from it inferences and conclusions that conflict with their current beliefs. For instance, when they learn about the surprising conclusions of studies, they tend to mentally exclude themselves and their friends and acquaintances from the conclusions of experiments that are not flattering about human beings. Consequently, they do not change in any basic fashion the way they understand social reality, especially when learning unflattering results.

• So, people learning accurate statistical information or broadly-formulated, general information often fail to update their beliefs. However, studies show that people do learn somewhat more, and are more open to changing their understanding, when they are surprised by individual cases and stories. Thus, people are much more willing to infer the general from the particular than to deduce the particular from the general. This relates to the basic way System 1 functions: it holds a stereotypical example and can update that example much more effectively based on new information about a specific case than based on statistical facts or broadly-stated ones.


Longer Summary (my personal thoughts on what is relevant for me are prefaced by "I should")

Introduction • In any sphere of knowledge, we need to acquire a large set of labels for concepts, each of which binds a specific term with all the elements associated with it. A deeper understanding of judgments, choices, and behaviors also requires a rich vocabulary, consisting of concepts such as System 1 and System 2, cognitive biases, etc., which make the recognition of how we judge, choose, and behave easier and more effective. The author aims to introduce into everyday conversations a better understanding of our errors in judgment, choice, and behavior. • As we navigate our lives, we normally allow ourselves to be guided by impressions and feelings. The confidence we have in our intuitive beliefs and preferences is usually justified, but not always. People use simplifying heuristics, rules of thumb, to make difficult judgments, which causes predictable biases in their predictions. • Emotions play an important role in our judgments, choices, and decisions, an example of what is called the affect heuristic, where judgments and decisions are guided directly by feelings of like and dislike, with little deliberation or reasoning. • Expert intuition develops when experts have learned to recognize familiar elements in a new situation and to act in a manner that is appropriate to it. If the individual has relevant expertise, the individual will recognize the situation, and the intuitive solution that comes to mind is more likely to be correct than for a nonexpert. • When the challenge is difficult and a skilled solution is not available, intuition still has a shot, and an answer may come to mind quickly – but it is not an answer to the original question. This is an essential element of intuitive heuristics: when faced with a difficult question, we often answer an easier one instead, usually without noticing the substitution, and sometimes end up with a mistaken answer to the question posed. • However, the spontaneous search for an intuitive solution sometimes fails. 
In such cases we often find ourselves switching to slower, more deliberate, and more effortful forms of thinking – slow thinking, System 2 thinking. The intuitive System 1 is much more influential in your thought processes than your experience tells you, and it is the secret author of many of the choices and judgments you make. • We easily think associatively, metaphorically, and causally, but System 1 is bad at logical and statistical thinking. • We are prone to be overconfident in what we know and fail to acknowledge the full extent of our ignorance and uncertainty about the world, as well as the role of chance events. • This puts our general impression of human beings as rational actors – prevalent especially in economics, but also in broader public discourse – at odds with reality. • There is a distinction between two selves, the experiencing self and the remembering self. Thus, if we expose individuals to two painful episodes, we can manipulate their memories so that the episode that involves more total pain, because it is longer, is remembered as better. When people later decide which episode to repeat, they are guided by their remembering selves and expose their experiencing selves to unnecessary pain. The same goes for happiness.

Chapter 1 • System 1 is fast, intuitive, and emotional; it operates automatically and quickly, with little or no effort and no sense of voluntary control. • System 2 is slower, more deliberative, and more logical. It allocates attention to the effortful mental activities that demand it. The operations of System 2 are often associated with the subjective experience of agency, choice, and concentration. • When we think of ourselves, we identify with System 2, the conscious, reasoning self that has beliefs, makes choices, and decides what to think about and what to do. Although System 2 believes itself to be where the action is, the automatic System 1 is the hero of the book. I describe System 1 as effortlessly originating impressions and feelings that are the main sources of the explicit beliefs and deliberate choices of System 2. The automatic operations of System 1 generate surprisingly complex patterns of ideas, but only the slower System 2 can construct thoughts in an orderly series of steps. I also describe circumstances in which System 2 takes over, overruling the freewheeling impulses and associations of System 1. You will be invited to think of the two systems as agents with their individual abilities, limitations, and functions. • In rough order of complexity, here are some examples of the automatic activities that are attributed to System 1: o • Detect that one object is more distant than another. • Orient to the source of a sudden sound. • Complete the phrase “bread and . . .” • Make a “disgust face” when shown a horrible picture. • Detect hostility in a voice. • Answer to 2 + 2 = ? • Read words on large billboards. • Drive a car on an empty road. • Find a strong move in chess (if you are a chess master). • Understand simple sentences. • Recognize that a “meek and tidy soul with a passion for detail” resembles an occupational stereotype. o All these mental events occur automatically and require little or no effort. 
Several of the mental actions in the list are completely involuntary. Other activities are susceptible to voluntary control but normally run on automatic pilot. The control of attention is shared by the two systems. Orienting to a loud sound is normally an involuntary operation of System 1, which immediately mobilizes the voluntary attention of System 2. However, attention can be moved away from an unwanted focus, primarily by focusing intently on another target. • The highly diverse operations of System 2 have one feature in common: they require attention and are disrupted when attention is drawn away. Here are some examples: o • Brace for the starter gun in a race. • Focus attention on the clowns in the circus. • Focus on the voice of a particular person in a crowded and noisy room. • Look for a woman with white hair. • Search memory to identify a surprising sound. • Maintain a faster walking speed than is natural for you. • Monitor the appropriateness of your behavior in a social situation. • Count the occurrences of the letter a in a page of text. • Tell someone your phone number. • Park in a narrow space (for most people except garage attendants). • Compare two washing machines for overall value. • Fill out a tax form. • Check the validity of a complex logical argument. o In all these situations you must pay attention, and you will perform less well, or not at all, if you are not ready or if your attention is directed inappropriately. System 2 has some ability to change the way System 1 works, by programming the normally automatic functions of attention and memory. • The often-used phrase “pay attention” is apt: you dispose of a limited budget of attention that you can allocate to activities, and if you try to go beyond your budget, you will fail. It is the mark of effortful activities that they interfere with each other, which is why it is difficult or impossible to conduct several at once. 
Intense focusing on a task can make people effectively blind, even to stimuli that normally attract attention. The gorilla video study illustrates two important facts about our minds: we can be blind to the obvious, and we are also blind to our blindness. • The interaction of the two systems is a recurrent theme of the book, and a brief synopsis of the plot is in order. In the story I will tell, Systems 1 and 2 are both active whenever we are awake. System 1 runs automatically and System 2 is normally in a comfortable low-effort mode, in which only a fraction of its capacity is engaged. System 1 continuously generates suggestions for System 2: impressions, intuitions, intentions, and feelings. If endorsed by System 2, impressions and intuitions turn into beliefs, and impulses turn into voluntary actions. When all goes smoothly, which is most of the time, System 2 adopts the suggestions of System 1 with little or no modification. You generally believe your impressions and act on your desires, and that is fine—usually. • When System 1 runs into difficulty, it calls on System 2 to support more detailed and specific processing that may solve the problem of the moment. System 2 is mobilized when a question arises for which System 1 does not offer an answer. You can also feel a surge of conscious attention whenever you are surprised. System 2 is activated when an event is detected that violates the model of the world that System 1 maintains. System 2 is also credited with the continuous monitoring of your own behavior—the control that keeps you polite when you are angry, and alert when you are driving at night. System 2 is mobilized to increased effort when it detects an error about to be made. In summary, most of what you (your System 2) think and do originates in your System 1, but System 2 takes over when things get difficult, and it normally has the last word. 
• The division of labor between System 1 and System 2 is highly efficient: it minimizes effort and optimizes performance. The arrangement works well most of the time because System 1 is generally very good at what it does: its models of familiar situations are accurate, its short-term predictions are usually accurate as well, and its initial reactions to challenges are swift and generally appropriate. System 1 has biases, however, systematic errors that it is prone to make in specified circumstances. As we shall see, it sometimes answers easier questions than the one it was asked, and it has little understanding of logic and statistics. One further limitation of System 1 is that it cannot be turned off. If you are shown a word on the screen in a language you know, you will read it—unless your attention is totally focused elsewhere. • One of the tasks of System 2 is to overcome the impulses of System 1. In other words, System 2 is in charge of self-control. • Because System 1 operates automatically and cannot be turned off at will, errors of intuitive thought are often difficult to prevent. Biases cannot always be avoided, because System 2 may have no clue to the error. Even when cues to likely errors are available, errors can be prevented only by the enhanced monitoring and effortful activity of System 2. As a way to live your life, however, continuous vigilance is not necessarily good, and it is certainly impractical. Constantly questioning our own thinking would be impossibly tedious, and System 2 is much too slow and inefficient to serve as a substitute for System 1 in making routine decisions. The best we can do is a compromise: learn to recognize situations in which mistakes are likely and try harder to avoid significant mistakes when the stakes are high. The premise of this book is that it is easier to recognize other people’s mistakes than our own. 
• System 1 and System 2 are so central to the story I tell in this book that I must make it absolutely clear that they are fictitious characters. Systems 1 and 2 are not systems in the standard sense of entities with interacting aspects or parts. And there is no one part of the brain that either of the systems would call home. You may well ask: What is the point of introducing fictitious characters with ugly names into a serious book? The answer is that the characters are useful because of some quirks of our minds, yours and mine. A sentence is understood more easily if it describes what an agent (System 2) does than if it describes what something is, what properties it has. In other words, “System 2” is a better subject for a sentence than “mental arithmetic.” The mind—especially System 1—appears to have a special aptitude for the construction and interpretation of stories about active agents, who have personalities, habits, and abilities. You quickly formed a bad opinion of the thieving butler, you expect more bad behavior from him, and you will remember him for a while. This is also my hope for the language of systems.

Chapter 2, Attention and Effort • The defining feature of System 2 is that its operations are effortful, and one of its main characteristics is laziness, a reluctance to invest more effort than appears strictly necessary. As a consequence, the thoughts and actions that System 2 believes it has chosen are often guided by System 1. However, there are vital tasks that only System 2 can perform, because they require effort and self-control to overcome the intuitions and impulses of System 1. • The arousal resulting from mental effort is different from emotional arousal; it has distinct cognitive and physical expressions, pupil dilation being an example of the latter. • System 2's response to mental overload is to protect the activity it perceives as most important, allocating spare capacity to other tasks. • A general "law of least effort" applies to cognitive exertion, which is part of a broader pattern of human action. The law asserts that if there are several ways of achieving the same goal, people will gravitate to the least demanding course of action, and this applies to cognitive effort as well as to other forms of human activity. In the economy of mental action, effort is a cost, and human actions and motivations are driven by a balance of benefits and costs. Laziness is thus built deep into our nature. • System 2 effort is required in a variety of activities, for example maintaining in memory several ideas that require separate actions or that need to be combined according to a specific rule. System 2 is the only system that can follow rules, compare objects on several attributes, and make deliberate choices between options. The automatic System 1 does not have these capabilities. System 1 is good at simple relations and at integrating information about one thing, but it does not deal well with multiple distinct topics at once, nor is it adept at using statistics. 
• A key capability of System 2 is the ability to reprogram System 1 to override habitual responses. • A key discovery of cognitive psychologists in recent decades is that switching from one task to another is effortful, especially under time pressure. Time pressure can be imposed from without, but also from within: from the desire to solve a problem quickly, or from having to hold information in memory while solving it, since you want to finish and stop holding that information. • We normally avoid mental overload by dividing our tasks into multiple easy steps, committing intermediate results to long-term memory or to paper rather than to easily overloaded working memory. We cover long distances by taking our time and conduct our mental lives by the law of least effort.

Chapter 3, The Lazy Controller • System 2 has a natural speed. You expend some mental energy on random thoughts and on monitoring what goes on around you even when you are not focusing on anything in particular, but with little strain. The more straining the task, the more difficult it is to carry on everyday activities and still perform System 2 tasks. For instance, it is generally quite possible to walk, think, and talk at the same time, but if the walking pace increases beyond what you are comfortable with, the quality of thinking and talking inevitably deteriorates, because it takes effort to force yourself to walk faster than is comfortable. Self-control and deliberate thought draw on the same limited budget of mental effort. • Fortunately, mental work is not always so straining, and people sometimes expend considerable effort for long periods of time without having to exert willpower, a state that has been termed flow. Flow separates two forms of effort, concentration on the task and the deliberate control of attention; the state of flow involves the former without the effort of the latter. • Both self-control and cognitive effort are forms of mental work. People who are challenged simultaneously by demanding cognitive tasks and by temptation are more likely to yield to the temptation. Furthermore, people who are cognitively busy are more likely to make selfish choices, use sexist language, make superficial judgments in social situations, and generally be less thoughtful. • All variants of voluntary effort – cognitive, emotional, physical – draw at least partly on a shared pool of mental energy. Thus, experiments have shown that an effort of will or self-control is tiring. If you have to force yourself to do something, you are less willing or able to exert self-control when the next challenge comes around, a phenomenon called ego depletion. 
In other words, activities that impose high demands on System 2 require self-control, and the exertion of self-control is depleting and unpleasant. • Unlike cognitive load, ego depletion is at least in part a result of the loss of motivation. After exerting self-control in one task, you are less motivated to make an effort in another, although you could if you really had to: experiments have shown that people were able to resist the effects of ego depletion when given a strong incentive to do so. In contrast, increasing effort is not an option when you must perform straining mental tasks. In other words, ego depletion is not the same mental state as cognitive busyness. • Surprisingly, the idea of mental energy is more than a metaphor. The nervous system consumes more glucose than most other parts of the body, and effortful mental activity appears to be especially expensive in the currency of glucose. Thus, experiments have shown that the effects of ego depletion can be undone by ingesting glucose. • People frequently engage in lazy thinking, putting too much faith and confidence in their intuition and avoiding the use of System 2 thinking. They find cognitive effort somewhat unpleasant and avoid it as much as possible. Furthermore, when people believe a conclusion is true, they are also very likely to believe arguments and evidence that appear to support it, even when these arguments and evidence are unsound. However, these tendencies vary among individuals, and they can be changed through deliberate education and effort, especially since they are in part dependent on motivation, on a lack of desire to try hard enough. Furthermore, people can solve difficult problems when they are not tempted to accept a superficially plausible but wrong answer that comes readily to mind. 
• Those who avoid intellectual laziness could be called more engaged, rational thinkers: they are more alert, more intellectually active, less willing to be satisfied with superficially attractive answers, and more skeptical about their intuitions. One psychologist, Keith Stanovich, draws a distinction between two parts of System 2. One deals with slow, effortful thinking and demanding computation. Some people are better than others at this; they are the individuals who do well on intelligence tests and are able to switch quickly and efficiently from one task to another. Yet such high intelligence does not make people immune to biases. Another ability is involved, which Stanovich labels rationality and which Kahneman terms being engaged, and it is distinct from intelligence as such.

Chapter 4, The Associative Machine • Associative activation is a process in which ideas that have been evoked trigger many other ideas in a spreading cascade of associative activity in your brain. The essential feature of this complex set of mental events is its coherence. Each element is connected to, supports, and strengthens the others. For example, a negative emotion-laden word evokes memories, which evoke emotions, which in turn evoke facial expressions and other physical reactions, such as general tension and an avoidance tendency. The facial expression and the avoidance motion intensify the feelings to which they are linked, and the feelings reinforce compatible ideas. All of this happens quickly and nearly at once, resulting in a self-reinforcing pattern of cognitive, emotional, and physical responses that is both diverse and integrated, which has been called associatively coherent. • Most of these activities happen below the level of consciousness and are undertaken automatically by our System 1; we know far less about our mental world than we think we do. • Cognition is embodied, meaning you think with your body and not only your brain. • Psychologists think of ideas as nodes in a vast network called associative memory, in which each idea is linked to many others, and in turn linked to emotions and physical responses. There are different types of links between ideas: causes are linked to their effects, things to their properties, and things to the categories to which they belong. • Priming effect refers to how exposure to one idea produces an immediate and measurable change in how easy it is to activate many other associated ideas. This is because of the associative process, whereby an activated idea prepares your mind to activate other ideas; these ideas, which are now cognitively easier to activate, are called primed ideas. 
Furthermore, these first-level primed ideas have some ability to prime related ideas, although more weakly. Thus, activation of an idea is like a rock thrown into a pond, with an expanding circle of increasingly weaker activation effects. • Priming affects not only concepts, but also emotions and psychosomatic responses. Vice versa, emotions and psychosomatic states prime ideas. For example, acting old reinforces thoughts of old age and the emotions associated with it. Smiling generally reinforces positive emotions and thoughts. • Studies of priming effects show that our thinking, judgment, choices, and behavior are powerfully influenced by external factors of which we are not consciously aware, which challenges our self-image as conscious and autonomous authors of our judgments and choices. Priming results from System 1, and you generally have no conscious access to priming effects. For example, priming with thoughts of money leads to a reluctance to be involved with others, to depend on others, to help others, or to accept demands from others. • To some extent, priming can be resisted if we are aware of the nature of priming effects and their impact on our mental life, and if we pay attention to priming effects in our environment. • System 1 provides the impressions that often turn into your beliefs, and it is the source of the impulses that often become your choices and actions. It is an associative, meaning-making machine, which offers a tacit interpretation of what happens to you and around you, linking the present with the recent past and with expectations about the near future. It contains the model of the world that instantly evaluates events as normal or surprising. It is the source of your rapid intuitive judgments, which are often accurate but also contain systematic errors, and it does most of this without your conscious awareness of its activities.

Chapter 5, Cognitive Ease • Cognitive ease determines a great deal about our mental life. Generally, cognitive ease is a sign that things are going smoothly: no problems, no threats, no major news, and no need to redirect attention or mobilize effort. Strain indicates that a potential problem exists, which will require increased mobilization of effort and attention, and thus System 2 thinking. Our minds are wired to minimize cognitive strain and thus avoid activating System 2, making it in effect a lazy controller. • People generally prefer cognitive ease; thus, to make a stronger impact in having people remember and believe your message, you should strive to minimize the cognitive strain of the message. For example, a message that is printed in a clear font, with clear colors, in clear and simple language, in rhyming verse, and with easily understandable references will make a bigger impact because it will be cognitively easier for people to process. Besides, a message that has been frequently repeated, has been primed, or is heard by people in a good mood makes a stronger impact, again because it is cognitively easier to process. Cognitively easier messages feel more true, familiar, good, and effortless, and thus make a stronger impact. • The various causes of ease or strain have interchangeable effects. When you are in a state of cognitive ease, you tend to be in a good mood, like what you see, believe what you hear, trust your intuitions and instincts, and feel that the current situation is comfortable and familiar, but also be relatively casual and superficial in your thinking. When you are strained, you are more likely to be vigilant and suspicious, invest more effort in what you are doing, feel less comfortable, and make fewer errors, but also be less intuitive, less creative, and in a worse mood. • Predictable systematic mistakes inevitably occur if a judgment is based on an impression of cognitive ease or strain. 
Anything that makes it easier for the associative machine to run smoothly will also bias beliefs. A reliable way to make people believe in falsehoods is frequent repetition, because familiarity is not easily distinguished from truth, as propagandists and marketers have discovered. • Likewise, you do not have to repeat an entire statement to make it appear true. The familiarity of one phrase is sufficient to make the whole statement feel familiar, and thus cognitively easier to accept. • Figuring out whether a statement is true or merely has the illusion of truth due to cognitive ease is challenging but possible; it requires applying System 2 thinking to examine whether the statement is linked strongly by logic to other beliefs or preferences you hold, or comes from a source that you trust. • The experience of cognitive ease is characteristic of System 1 thinking, while cognitive strain results in the (reluctant) mobilization of System 2 thinking. For example, people make fewer mistakes when messages are harder to read, such as when printed in a smaller and less clear font, because they engage more System 2 thinking to process the content. • Mere exposure effect refers to the fact that repetition produces cognitive ease, inducing a comfortable sense of familiarity and thus an appearance of truth. Mere exposure functions even when the person is not aware of being exposed to a stimulus, showing that it happens within System 1. • A positive mood results in better performance on creative and intuitive activities. • System 1 is associated with good mood, intuition, creativity, and gullibility. System 2 is associated with sadness, vigilance, suspicion, an analytic approach, and increased effort.

Chapter 6, Norms, Surprises, and Causes • Violations of normality are detected with remarkable speed and subtlety. • We have norms for a vast number of categories, and these norms provide the background for the immediate detection of anomalies. • System 1 is adept at finding a coherent causal story that links the fragments of knowledge at its disposal, sometimes incorrectly so. The automatic search for causes shapes our thinking in powerful and sometimes problematic ways. • Human brains are wired to form impressions of cause and effect, which do not depend on reasoning about patterns of causation but are a consequence of System 1 thinking about norms and causes, resulting in impressions of causality rather than analytic understandings of causality based on logical argumentation. • Your mind is ready and even eager to identify agents, assign them personality traits and specific intentions, and see their actions as expressing individual propensities. This is why stories make such strong impressions on so many people. • People are prone to applying causal thinking inappropriately, to situations that require statistical reasoning. Statistical thinking derives conclusions about individual cases from properties of categories and ensembles. Unfortunately, System 1 does not have the capacity for this mode of reasoning, and while System 2 can learn to think statistically, few people receive the necessary training or undertake the necessary effort.

Chapter 7, A Machine for Jumping to Conclusions • Jumping to conclusions is efficient if the conclusions are likely to be correct, the costs of an occasional mistake are acceptable, and the jump saves much time and effort. Jumping to conclusions is risky when the situation is unfamiliar, the stakes are high, and there is no time to collect more information. These are the circumstances in which intuitive errors are most probable and costly, and they suggest the need for deliberate intervention by System 2. • In the process of jumping to conclusions, System 1 often makes a definitive choice, and you are aware only of the option presented by System 1, without being aware of the original ambiguity and the possibility of multiple interpretations. System 1 does not keep track of alternatives that it rejects, or even of the fact that there were alternatives. Conscious doubt is not part of System 1, since it requires maintaining incompatible interpretations in mind at the same time. Uncertainty and doubt are the domain of System 2. • System 1 is gullible and biased to believe; System 2 is in charge of doubting and unbelieving. If System 2 is otherwise engaged, or simply lazy, we are much more gullible and biased to believe messages. • Confirmation bias refers to how we seek data that are likely to be compatible with the beliefs we currently hold, and avoid or reject information that does not conform to those beliefs. • Halo effect refers to the tendency that if you like one thing about a person or concept, you will be more likely to like its other qualities, accepting evidence and arguments that indicate positive things about that person or concept and rejecting ones that do not. • Decorrelating error refers to the need to gather a large number of independent judgments or diverse inputs, so that individual errors tend to cancel out rather than compound. • Our associative machine represents only activated ideas. 
Information that is not retrieved from memory, consciously or unconsciously, might as well not exist. System 1 excels at constructing the best possible story that incorporates currently activated ideas, but it does not allow for information it does not have. The measure of success for System 1 is the coherence of the story it manages to create; the amount and quality of the data on which the story is based are largely irrelevant to it. This is why stories make such powerful impacts. • When information is scarce, System 1 operates as a machine for jumping to conclusions. The combination of a coherence-seeking System 1 with a lazy System 2 implies that System 2 will endorse many intuitive beliefs based on impressions generated by System 1. • The author created an abbreviation for jumping to conclusions on the basis of limited evidence: WYSIATI, what you see is all there is. WYSIATI explains a long and diverse list of biases, including: o Overconfidence: neither the quantity nor the quality of the evidence counts for much in the subjective confidence that results from System 1 thinking. The confidence individuals have depends mostly on the quality and coherence of the story they can tell about what they observe, as human minds do not naturally consider the possibility that key evidence might be missing. o Framing effects: different ways of presenting the same information evoke different emotions and associative chains, resulting in different conclusions. o Base-rate neglect: System 1 tends to focus on the new evidence presented to it and forgets about the prior rate of occurrence.
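The base-rate neglect bullet above can be made concrete with a short numerical sketch. This is not an example from the book; the condition, the test, and all the numbers below are hypothetical, chosen only to show how far the correct answer (via Bayes' rule) sits from the intuitive one.

```python
# Base-rate neglect, illustrated with made-up numbers: a condition with
# 1% prevalence, detected by a test with 90% sensitivity and 90% specificity.
def posterior(prior, sensitivity, specificity):
    """P(condition | positive test), computed by Bayes' rule."""
    true_positive = prior * sensitivity
    false_positive = (1 - prior) * (1 - specificity)
    return true_positive / (true_positive + false_positive)

# Intuition (System 1) fixates on the "90% accurate" figure and forgets
# the 1% base rate; the actual posterior is only about 8%.
print(round(posterior(0.01, 0.9, 0.9), 3))  # 0.083
```

Because healthy people vastly outnumber sick ones, false positives swamp true positives; this is exactly the prior-rate information the summary says System 1 neglects.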

Chapter 8, How Judgments Happen • System 2 receives questions or generates them; it directs attention and searches memory to find the answers. System 1 functions differently: it continuously monitors what is going on outside and inside the mind, and continuously generates assessments of various aspects of the situation without specific intention and with little or no effort. These basic assessments play an important role in intuitive judgment, because they are easily substituted for answers to more difficult questions – this is the essential idea of the heuristics and biases approach. • System 1 represents categories by a prototype or a set of typical examples, but it deals poorly with sums and with the overall estimation of quantities in categories. Thus, emotional and intuitive reactions are often poorly correlated with numerical quantities and statistics, but better correlated with averages and comparisons. • System 1 easily makes comparisons and judges and matches intensities across dimensions, although it makes systematic errors in statistical and computational reasoning that often make these judgments problematic. • Mental shotgun refers to how we carry out automatic computations that go well beyond what we actually need to answer a question.

Chapter 9, Answering an Easier Question • Substitution refers to the operation by which System 1 substitutes an easier, related question for a harder question that requires cognitive effort to answer. System 1 processes such as the mental shotgun and intensity matching often make available answers to easy questions that can be mapped onto the harder target questions. On some occasions substitution will occur, and a heuristic answer will be endorsed by System 2 and form the basis for judgment, belief, and action. While System 2 has the opportunity to reject this substitution, the cognitive effort required leads many people to engage in lazy thinking and not scrutinize it; many do not even notice that a substitution occurred, and perhaps do not even realize that the target question was difficult, because an intuitive answer to it came readily to mind. • Affect heuristic refers to how people let their likes and dislikes, their attitudes, determine their beliefs about reality and the accuracy of information and arguments. In many cases System 2 functions more as an apologist and justifier for the emotional and intuitive impressions of System 1 than as a critic of System 1 thinking.

Part 1 Epilogue • General characteristics of System 1: o generates impressions, feelings, and inclinations; when endorsed by System 2, these become attitudes, intentions, and voluntary actions o operates automatically and quickly, with no sense of effort or voluntary control o can be reprogrammed by System 2 with effort o after adequate training to gain expertise, it can execute skilled responses and intuitions o creates a coherent pattern of activated ideas in associative memory o links a sense of cognitive ease to illusions of truth, pleasant feelings, and reduced vigilance o distinguishes the surprising from the normal o infers and invents causes and intentions o neglects ambiguity and suppresses doubt o is biased to believe and confirm o exaggerates emotional consistency, resulting in the halo effect o focuses on existing evidence and ignores absent evidence: what you see is all there is, WYSIATI o generates a limited set of basic assessments o represents sets by norms and prototypes, and does not integrate or engage well in computational or statistical thinking o neglects base rates and focuses on new evidence – base-rate neglect o matches intensities across scales and categories – intensity matching o computes more than intended – mental shotgun o can substitute an easier question for a difficult one – heuristics o is more sensitive to changes than to states – prospect theory o overweights low probabilities and underestimates the impact of small sample sizes and the benefit of decorrelating errors o shows diminishing sensitivity to quantity – psychophysics o responds more strongly to losses than to gains – loss aversion o frames decision problems narrowly, in isolation from one another

Chapter 10, The Law of Small Numbers • Small samples yield extreme results more often than large samples do; therefore large samples are more precise than small samples. Yet even if we are aware of this, our System 1 thinking is inherently not computational and statistical, but coherence-seeking and story-oriented, so even small samples get accepted as adequate evidence that causes us to believe the content of messages. Unless a message is immediately negated in our minds, the associations it evokes will spread through the mind as if the message were true. System 2 is capable of doubt and uncertainty, but sustaining doubt is harder mental work than sliding into certainty, which is characteristic of System 1 thinking. • The strong bias toward believing that small samples closely resemble the population from which they are drawn is part of a larger pattern: our minds are prone to exaggerating the consistency and coherence of what we see. System 1 inevitably runs ahead of the facts to construct a rich image and a cohesive story on the basis of inadequate scraps of evidence, jumping to conclusions and producing a representation of reality that makes too much sense. • The associative machinery of System 1 seeks causes. However, the difficulty we have with statistical phenomena is that they call for a different approach, informed by statistical, computational, and chance-oriented thinking. It is very challenging for System 1 to acknowledge that nothing in particular caused something to be what it is and that chance selected it from among a variety of alternatives: System 1 engages in pattern seeking and coherence making, and this predilection for causal, story-based, and pattern-oriented thinking exposes us to serious mistakes in evaluating random events. System 1 does not expect to see the irregularities produced by a random process, despite the statistical likelihood of their occurring. 
As a result, when we detect what appears to be a pattern, System 1 quickly rejects the idea that the process is truly random and searches for a causal explanation. We are far too willing to reject the belief that much of what we see in life is random. • The exaggerated faith in small samples is one example of a more general systematic error: we pay more attention to the content of messages than to information about their reliability, and as a result end up with a view of the world that is simpler and more coherent than the data justify. • Statistics produce many observations that appear to beg for causal explanations but are actually not caused by any specific factor, pattern, or rule. Many facts of the world are due to chance, including accidents of sampling. Causal explanations of such events are often wrong.
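The chapter's opening claim, that small samples yield extreme results more often than large ones, is easy to check by simulation. A minimal sketch (the sample sizes, the 70% "extreme" threshold, and the trial count are arbitrary choices of mine, not from the book):

```python
import random

def extreme_rate(sample_size, trials=20_000, threshold=0.7, seed=0):
    """Fraction of fair-coin samples whose proportion of heads is
    'extreme': at least `threshold`, or at most 1 - `threshold`."""
    rng = random.Random(seed)
    extreme = 0
    for _ in range(trials):
        heads = sum(rng.random() < 0.5 for _ in range(sample_size))
        share = heads / sample_size
        if share >= threshold or share <= 1 - threshold:
            extreme += 1
    return extreme / trials

# Roughly a third of 10-flip samples look lopsided; 100-flip samples
# almost never do -- yet both come from the same random process.
print(extreme_rate(10), extreme_rate(100))
```

The same fair coin produces "70% heads" routinely at n = 10 and almost never at n = 100, which is why a striking result from a small sample is weak evidence for a causal story.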

Chapter 11, Anchors • The anchoring effect occurs when people consider a particular value for an unknown quantity before estimating that quantity, with the result that the estimate stays close to the number considered. Any number that you are asked to consider as a possible solution to an estimation problem will induce an anchoring effect. Anchoring functions through two mechanisms. o One is an anchor-and-adjust heuristic, a strategy for estimating an uncertain quantity in which you start from a given number, assess whether it is too high or too low, and gradually adjust your estimate by mentally moving away from the anchor. The adjustment typically ends prematurely, because people stop when they are no longer certain that they should move further. This implies that we should move further than we initially think we should when dealing with an anchor; insufficient adjustment is a failure of a lazy System 2. o Anchoring also occurs as a priming effect, based on suggestion. System 1 understands an idea by trying to make it true, and even if the idea is eventually rejected as false, compatible ideas are activated that subsequently influence our estimate of reality. o Fighting anchoring requires a deliberate activation of System 2 to make an immediate argument that rejects the anchor. You should assume that any number presented to you has an anchoring effect, whether you are aware of it or not, and if you decide the matter is worthwhile, you need to activate System 2 to struggle against this anchor.

Chapter 12, The Science of Availability • The availability heuristic is the process of judging frequency and significance by the ease with which instances come to mind. The availability heuristic substitutes one question for an easier one: you wish to estimate the size of a category or the frequency of an event, but you report an impression of the ease with which instances come to mind. o A salient event that attracts your attention will be easily retrieved from memory. o A dramatic event temporarily increases the availability of its category. o Personal experiences, images, and vivid examples are more available than incidents that happened to others, abstract words, or statistics. This is why stories, images, and clear examples are good tools of communication. • Resisting this large collection of potential availability biases is effortful, but in many cases important for success. • Awareness of availability and other biases can contribute to success in joint projects, relationships, and team dynamics. One problematic bias in team dynamics is that many members of a collaborative team tend to feel that they have done more than their share and that the others are not adequately grateful for their individual contributions. Countering this requires thanking others and acknowledging them more than seems natural and appropriate. Also remember that many team members are likely to feel that they do more than their fair share, so the individually claimed contributions will add up to more than the whole. • Above-average bias: most people tend to feel they are above average. • Ratings are dominated by the ease with which examples come to mind, and the experience of fluent retrieval of instances matters more for the assessment than the number of instances retrieved. In other words, people who are asked to retrieve more instances of something, and find it increasingly hard to do so, end up less confident in their judgment despite having retrieved more instances. 
• A professor found a great way to exploit the availability bias. He asked different groups of students to list ways to improve the course, varying the required number of improvements. As expected, the students who were asked to list more ways to improve the class rated the class higher: the difficulty of retrieving many complaints made the course seem better.

Chapter 13, Availability, Emotion, and Risk • The world in our heads is not a precise replica of reality: our expectations about the frequency of events are distorted by the prevalence and emotional intensity of the messages to which we are exposed, especially through the media – the availability heuristic. For example, when assessing risks of death or ill health, people generally focus on the spectacular, which is most available in memory and thus comes most easily to consciousness. • The affect heuristic is an instance of substitution, in which an answer to an easy question – how do I feel about it? – serves as an answer to a much harder question – how should I think about it logically? The affect heuristic causes people to link the benefits and the risks of something and to assume that things with more benefits have fewer risks and vice versa, though that is generally not the way the world works. Consistent affect is an essential element of associative coherence and of a view of the world that is much simpler and tidier than actual reality, characteristic of System 1 thinking. • People tend to be guided by emotion rather than reason, easily swayed by trivial details, and inadequately sensitive to differences between low and negligibly low probabilities. Experts show many of the same biases as the rest of us, though in weaker form, and they are also influenced by the distortions of their professional disciplines. • The evaluation of a risk depends on the choice of a measure, with the possibility that the choice is guided by a preference for one outcome over another. • The availability cascade is a self-sustaining chain of events in which biases flow into public policy: often it starts from media reports of a relatively minor event that gets overblown, leading to public panic and resultant government action. • Probability neglect refers to how we tend either to ignore small risks altogether or to give them far too much weight, distorting our judgments.

Chapter 14, Tom W’s Specialty • Representativeness refers to the similarity of a description to our typical stereotyped image. • When asked to estimate probability, people tend to substitute a judgment of representativeness instead of considering other valuable information, most notably base rates. They also tend to ignore the quality of evidence when making probability assessments. • While in many cases judging by representative stereotypes is valid and will give a solid estimate, in other situations stereotypes are false or estimates will run into trouble due to base-rate neglect. • When you have doubts about the quality of the evidence, let your judgments of probability stay close to the base rate. The human mind is wired to discount the quality of evidence, so this requires a deliberate System 2 effort, combining self-monitoring and control. • Generally, anchor your judgment of the probability of an outcome on a plausible base rate, and constantly question the predictive value of your evidence.
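A quick numerical sketch (Python, with invented numbers, not from the book) of what anchoring on the base rate means in practice: under Bayes' rule, even evidence that seems strongly diagnostic moves a small base rate only modestly.

```python
# Base-rate anchoring via Bayes' rule, with made-up numbers.
# Suppose 3% of graduate students are in computer science (the base rate),
# and a personality sketch seems 4x as likely for a CS student as for others.
def posterior(base_rate, likelihood_ratio):
    """P(hypothesis | evidence) from the prior and how much more likely
    the evidence is under the hypothesis than under the alternative."""
    prior_odds = base_rate / (1 - base_rate)
    post_odds = prior_odds * likelihood_ratio
    return post_odds / (1 + post_odds)

p = posterior(0.03, 4.0)
print(round(p, 3))  # 0.11 -- far below the near-certainty representativeness suggests
```

The point of the sketch: a vivid description quadruples the odds yet still leaves the probability close to the base rate, which is exactly the anchoring the chapter recommends.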

Chapter 15, Linda: Less is More • “Less is More”: due to representativeness and stereotyping, it seems to System 1 that the more detailed and specific a description is, the likelier it is to be true. In actuality, specifying a possible event in greater detail can only lower its probability of being true; the less detailed a statement is, the more likely it is to be true. o This suggests to me the need to always remember the rules of logic and probability. • Conjunction fallacy: when people judge a conjunction of two events to be more probable than one of the events by itself. o This suggests to me the need to always be wary when tempted to judge two events together as likelier than one of those events individually. • Representativeness belongs to a cluster of closely related basic assessments that are likely to be generated together. The most representative outcomes create the most coherent stories, but the most coherent stories are not necessarily the most probable; they are merely the most plausible-seeming to System 1, confusing those who do not apply System 2 thinking to probability. Thus, in forecasting the future, adding details to scenarios makes them more persuasive but less likely to be an accurate description of the future. o This suggests to me the importance of evaluating things not by how coherent they seem but by the logical rules of how probable they are, when I am trying to establish truth value and a realistic view of the world. o On a separate note, this suggests to me that if I want to make a broad impact on the large majority of people who are not very good at System 2 thinking, I need to create plausible-seeming, coherent, and detailed stories, as opposed to giving statistical facts or rhetorical argumentation. I need to carefully judge and evaluate my audiences when doing so.
• Sets are represented by norms and prototypes, and we tend to judge these sets by average assessments, as opposed to evaluating individual elements within the set. System 1 averages instead of adding or disaggregating. Consequently, we may undervalue or overvalue individual elements within a set based on our stereotypical presumptions about that set. o This suggests to me the need to carefully evaluate groupings within a set, and where relevant individual elements within a set, as opposed to evaluating the set as a whole.
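The conjunction rule behind the chapter can be sketched with a toy simulation (invented probabilities, not from the book): however the traits are distributed, the count of people matching both descriptions can never exceed the count matching one of them alone.

```python
import random

# Toy demonstration of the conjunction rule with invented probabilities:
# "bank teller AND feminist" can never be more frequent than "bank teller".
random.seed(0)
N = 100_000
teller = 0
teller_and_feminist = 0
for _ in range(N):
    is_teller = random.random() < 0.05     # assumed P(teller)
    is_feminist = random.random() < 0.30   # assumed P(feminist), independent here
    teller += is_teller
    teller_and_feminist += is_teller and is_feminist

print(teller_and_feminist <= teller)  # True, by construction: P(A and B) <= P(A)
```

Independence is assumed only for simplicity; the inequality holds under any joint distribution, which is why the detailed description is always the less probable one.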

Chapter 16, Causes Trump Statistics • Statistical base rates are facts about a population to which a case belongs, but they seem to have no direct relevance to the individual case at hand. Statistical base rates are generally underweighted by System 1 thinking and often ignored altogether, especially when specific information about the case at hand is available. This causes people to make mistaken judgments. o This suggests to me the particular importance of assessing priors when dealing with statistical base rates, as opposed to causal base rates. • Causal base rates are base rates that have a clear and direct relevance for the case at hand. They are treated as information about the individual case and are easily combined with other case-specific information. This helps explain why stories that have cohesion and present causes are easily absorbed and believed by audiences to represent reality, while statistical facts and argumentation do not make such a strong impact. • Stereotypes are statements about a group that are at least tentatively accepted as facts about every member. These statements are regularly interpreted as setting up a tendency in individual members of the group, and they easily fit a causal story, making them powerful explanatory models that shape the way we think. Stereotypes emerge from System 1 thinking, which inherently represents categories as norms, averages, and prototypical examples. When we think of a group, we do not inherently consider the variety of individuals within that group, but hold a particular understanding of a “normal” member of the category. As a society, we have developed a social norm against stereotyping based on certain standards, which has been beneficial in creating a more civilized and humane society, but it does have a cost in producing suboptimal judgments. o This suggests to me the need to always recognize that I and others think in terms of norms and exemplars, and therefore stereotypes.
We do not naturally think of the diversity within each group, and we expect individuals within each group to behave in a fashion that conforms to our understanding of that group’s norm. This inevitably leads to occasional mistakes in judgment. • Problematically, studies show that when people, including college students, learn new broadly-formulated, statistically-verified information, they generally fail to draw from it inferences and conclusions that conflict with their current beliefs. For instance, when they learn about the surprising conclusions of studies, they tend to mentally exclude themselves and their friends and acquaintances from the conclusions of experiments that are not flattering about human beings. Consequently, they do not fundamentally change the way they understand social reality, especially when learning unflattering results. o This suggests to me the importance of updating my beliefs about myself, my friends and acquaintances, and everyone else based on the results of statistical studies, even unflattering ones. • So, people learning accurate statistical information or broadly-formulated, general information often fail to update their beliefs. However, studies show that people do learn somewhat more and are more open to changing their understanding when they are surprised by individual cases and stories. Thus, people are much more willing to infer the general from the particular than to deduce the particular from the general. This relates to the basic way that System 1 functions: it holds a stereotypical example and can update that example much more effectively from new information about a specific case than from statistical or broadly-stated facts. o This once more confirms to me the importance of telling stories that update people’s beliefs about a particular stereotype they hold of a specific social group or situation.
Moreover, it underlines the importance of telling individual stories and surprising people, getting them to think about their own behavior and thinking, as a means of making an impact. • Normal and decent people do not rush to help when they think that others can take on the challenging and unpleasant task of helping, or in general of doing the responsible thing. o This suggests to me the necessity of taking greater responsibility even when I think other people can do so, because when everyone expects other people to take on the responsibility, things generally don’t get done. It also suggests to me the need to encourage others to take responsibility as a means of getting things done. • Mild social pressure has a surprisingly strong impact on the way people act, indicating the surprising power of social settings. o This suggests to me the necessity of evaluating my social setting and considering how it is making me behave in ways I would not otherwise, and then consciously deciding whether that is the way I actually want to behave. Moreover, it suggests to me the benefit of shaping social settings if I want to bring about a specific behavior in other people.


Chapter 17, Regression to the Mean • Our performance on a task is a combination of skill and of luck (chance), with the mix varying from task to task. So an unusually successful performance on any task should be seen not as a sign of inherent outstanding skill, but as some combination of skill and luck. Somebody could have an average or even below-average level of skill but a very lucky day; somebody could have an above-average level of skill and a somewhat lucky day; or somebody could have outstandingly high skill and an average day. The only way to judge skill appropriately is to examine performance over a long period. The same applies to a poor performance on a task: even somebody with a high level of skill could underperform on a really off day, somebody with an average level of skill could underperform on a somewhat off day, or it could be somebody with a below-average level of skill having a regular day. o This should remind me not to judge someone simply by a one-time performance on a task, especially a challenging, intense, and brief performance, where luck is more likely to play a role. It is necessary to have a series of long-term, repeated exposures to someone in order to evaluate skill adequately. • An implication of this understanding is “regression to the mean”: unusually high-level or low-level performances will tend to be closer to the average on the next performance, as opposed to even higher or even lower. This is because skill combines with luck, and luck does not hold over time. Thus, regression to the mean will occur regardless of whether there is an intervention to reinforce a high-level performance positively or a low-level performance negatively. o This should remind me that a very successful performance on a task will tend to regress to the mean, as will a low-level performance, regardless of reinforcement.
Reinforcement contributes only over the long term, by encouraging someone to strengthen their commitment to improving their skills, and positive reinforcement works much better than negative. • This creates some pernicious consequences. First, we know that an important principle of training is that rewards for improved performance work significantly better overall than punishments for mistakes. Yet rewarding an unusually successful performance will generally be correlated with a less successful performance next time, due to the tendency to regress to the mean. This does not mean that the reward did not work, that the rewarded individual did not try hard to do whatever was rewarded, or that she or he did not build up skill: it just means that chance did not help that individual out this time. Remember, correlation is not causation: if positive reinforcement is not followed by immediately improved performance, this most likely indicates regression to the mean, not that positive reinforcement failed. o Although it will seem to me that positive reinforcement is followed by a worse performance, it is important to remember that the impact of positive reinforcement is felt over long stretches of time, while an immediate, surprisingly high performance will tend to regress to the mean. • Similarly, punishing an unusually bad performance generally correlates with an improved performance next time. Yet correlation is not causation: regardless of whether the punishment was applied, the person would likely have done better on repeating the task. Thus, punishing someone for a really poor performance and seeing an improved performance does not indicate that the punishment worked. It is better to credit the results of multiple studies showing that punishment does not work very well, not nearly as well as rewards for improved performance in training.
Remember, correlation is not causation: if negative reinforcement is followed by immediately improved performance, this most likely indicates regression to the mean, not that negative reinforcement worked well. o Although it will seem to me that negative reinforcement of an unusually poor performance is followed by an improved performance, it is important to remember that the performance would most likely have improved regardless, due to the tendency to regress to the mean. Studies show that negative reinforcement is not very effective and can sometimes actually have a countervailing impact, so minimize its use. • Thus, life exposes us to perverse feedback. We tend to be nice to other people when they please us and nasty when they do not. This is generally followed by people being less pleasing to us after rewards and more pleasing after punishments. Since System 1 is not cognizant of statistical thinking and regression to the mean, it appears to us that rewards do not work well while punishments do, and that it is better to be nasty than nice in order to get what we want. In actual reality the situation does not work that way, but the way we think misleads us into believing that it does. o This should encourage me to use more positive reinforcement than I currently do and to tamp down my use of negative reinforcement, and to encourage others to do the same. It should also remind me that other people may well learn the wrong lessons from their experiences and avoid the appropriate strategies of positive reinforcement, and recognizing those situations should prepare me to deal with them. So, remember to manage my interactions with others with this knowledge in mind. • More broadly, remember that whenever the correlation between two scores is imperfect, there will tend to be regression to the mean. This is a statistical inevitability.
Yet our minds are structured in such a way that we will always be tempted to search for causal explanations of such statistical tendencies, even though there is no inherent causal explanation. For example, the statement that highly intelligent women tend to marry less intelligent husbands will lead people to try to explain it causally, whereas it is a statistical matter with no inherent causal explanation outside of the statistics themselves. o For me, it will be crucial in this case and all others to remember to consider priors, meaning prior statistical probabilities, and to remember regression to the mean as a key explanatory factor, one that does not have an inherent environmental cause but simply a statistical one. • When our attention is called to an event, associative memory will look for its cause, automatically trying to explain it by any cause already stored in memory. Causal explanations will be evoked when regression to the mean is detected, but they will be wrong, because the truth is that regression to the mean has an explanation but not a cause. Therefore, learning about regression to the mean, and having it stick in your mind as a prominent explanation, will help you resist this problematic tendency of seeking causal explanations for what is just a statistical pattern. This also shows why it is important to have studies that compare groups of unusually high-performing and low-performing subjects, with one group given a placebo and the other an intervention. The subjects will tend to regress to the mean by themselves, and the key is to find out whether the intervention being tested actually changed the pattern beyond that. o For me it will be crucial to remember to encourage myself and others to engage in statistical thinking in regard to regression to the mean and outliers.
Moreover, this should encourage me to check the causal explanations of others for ways of understanding that actually reflect regression to the mean. It should also make me value less than I previously did the testimony of others who say something worked for them and made them feel better, as that can be just a tendency to regress to the mean.
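The skill-plus-luck model in this chapter can be checked with a minimal simulation (all numbers invented): select the top performers on one occasion, and their average drops on the next occasion with no intervention whatsoever.

```python
import random

# Toy model: observed score = fixed skill + luck. Selecting the best day-1
# performers and re-measuring them on day 2 shows regression to the mean.
random.seed(42)
skills = [random.gauss(100, 10) for _ in range(10_000)]     # stable skill levels
day1 = [s + random.gauss(0, 15) for s in skills]            # skill + day-1 luck
day2 = [s + random.gauss(0, 15) for s in skills]            # fresh day-2 luck

# Take the 100 best day-1 performers and compare their two averages.
top = sorted(range(len(skills)), key=lambda i: day1[i], reverse=True)[:100]
avg1 = sum(day1[i] for i in top) / len(top)
avg2 = sum(day2[i] for i in top) / len(top)
print(avg1 > avg2)  # True: day-1 stars look closer to average on day 2
```

No reward or punishment appears anywhere in the model; the drop is produced entirely by luck failing to repeat, which is the point the chapter makes about misreading reinforcement.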

Chapter 18, Taming Intuitive Predictions • Intuitive predictions are not very sensitive to the quality of the evidence. As a result, we tend toward extremes when making intuitive predictions, because we too often make them on the basis of insufficiently strong evidence. Problematic cognitive tendencies such as intensity matching and substitution result in poor intuitive predictions. Moreover, we tend to put too much trust in those predictions. o I need to remember not to trust my intuitive predictions, which will tend to be too extreme and in which I will tend to have too much confidence. • This is especially the case when prediction of the future is not distinguished from an evaluation of current evidence. When predicting the future, always differentiate current evidence from its potential to indicate future events, and factor in uncertainty as a key element. Any uncertainty should make you regress your prediction somewhat toward the mean. o Avoid relying on my intuitive predictions about the future based on current evidence, because such thinking too often makes predictions on the basis of insufficiently strong evidence and generally overestimates. I need to apply System 2 to evaluate whether my intuitive predictions make sense after factoring in the predictive value of current evidence for future events. Namely, I need to remember to regress toward the mean whenever there is uncertainty, with the amount of regression directly proportional to the degree of uncertainty. • One strategy for predicting the future is the following: estimate the shared factors between the evidence you have now and the future state you are trying to predict. That will help you determine how predictive your evidence is of the future state.
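The corrective procedure above reduces to a one-line formula: start from the baseline outcome, and move toward the intuitive estimate only in proportion to the estimated correlation between evidence and outcome. The numbers below are invented for illustration.

```python
# Taming an intuitive prediction: regress it toward the mean in proportion
# to the evidence's correlation with the outcome. correlation=1 keeps the
# intuition; correlation=0 falls back entirely on the baseline.
def tamed_prediction(baseline, intuitive, correlation):
    return baseline + correlation * (intuitive - baseline)

# e.g. if the average GPA is 3.0, a vivid sketch intuitively suggests 3.8,
# but such evidence correlates with GPA at perhaps 0.3:
print(round(tamed_prediction(3.0, 3.8, 0.3), 2))  # 3.24
```

The full intuition is retained only under perfect correlation, which almost never holds, so the tamed estimate is nearly always less extreme than the intuitive one.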

Chapter 19, The Illusion of Understanding • The narrative fallacy refers to how flawed stories of the past shape our views of the world and our expectations for the future. Narrative fallacies arise inevitably from our continuous attempt to make sense of the world. The explanatory stories that people find compelling are simple and coherent, with clear heroes and no inconsistencies; they are concrete rather than abstract; due to attribution bias and other factors, they assign a larger role to talent, intelligence, and intentions than to luck; and due to attentional bias and other factors, they focus on a few striking events that happened rather than on the many less striking events that occurred or the countless events that failed to happen. At work here is the powerful WYSIATI rule: you cannot help dealing with the limited information you have as if it were all there is to know, and the human mind builds the best possible story from the information available to it; if it is a compelling story, people believe it. Paradoxically, it is easier to construct a coherent story when you know little, when there are fewer pieces to fit into the puzzle. The comforting conviction that the world makes sense rests on a secure foundation: the almost unlimited human ability to ignore our own ignorance. • These stories are comforting because they provide an illusion of understanding. They reduce the complex, contradictory, and inconsistent world to a simple, coherent, consistent one, which eases our thoughts and feelings by providing a clear and compelling narrative and fostering an illusion of inevitability. However, these narrative fallacies result in an unrealistic and simplified view of the world. They do not provide an accurate assessment of the past, do not describe what reality is like, and do not enable you to predict the future well.
o I need to think about the role of the narrative fallacy in my life and my understanding of society and of myself, remembering that I, like other people, am prone to falling for narrative fallacies. o Moreover, I need to recognize the role of narrative fallacies in the lives of other people, both in how they view themselves and the world around them, especially in regard to my professional study of the past but also of the present and the future. People are likely to build and hold simplified narratives that are coherent and unified but do not accurately reflect the world as it exists, since an accurate assessment of the world would be very complex, demand more information than they possess, acknowledge inconsistencies and complexities, etc. I need to consider what that implies for my historical analysis and also for my understanding of society today. o Finally, I need to consider how to take advantage of narrative fallacies in my own practices. If this is an inherent condition of the human brain, I can certainly choose to fight it, but I can also consider taking advantage of it in some cases in order to help me achieve my goals. • Hindsight bias, or the “I-knew-it-all-along” effect, refers to people’s failure to recognize how their beliefs about a subject changed over time. When an unpredicted event occurs, we immediately adjust our view of the world to accommodate the surprise. Learning from surprises is a reasonable thing to do, but it can have some dangerous consequences. The human brain is poorly set up to reconstruct past states of its own knowledge, especially regarding beliefs that have changed: people instead believe that what they believe now is what they always believed, an instance of substitution. o I need to watch out for hindsight bias in myself and other people, and consider how it impacts my life and theirs.
• Outcome bias refers to observers judging the quality of a decision not by whether the process of decision-making was sound but by whether the outcome was good or bad, regardless of the role of luck in the actual outcome. Outcome bias makes it very difficult to evaluate decisions properly, namely in terms of the beliefs that were reasonable at the time the decision was made. Because adherence to standard operating procedures is difficult to second-guess, decision-makers who expect to have their decisions scrutinized with hindsight are driven to bureaucratic solutions and an extreme reluctance to take risks. At the same time, hindsight and outcome bias can also bring undeserved rewards to irresponsible risk-seekers who took crazy gambles and won. Those who were lucky are never punished for having taken too much risk; instead they are seen as bold, with flair, foresight, and prescience, creating a halo effect that colors their future assessments and actions. o I need to consider the impact of outcome bias on myself and other people, and pay more attention to evaluating decisions less on outcomes than on what were reasonable assumptions at the time the decision was made, in order to estimate how sound the decision was. Of course, outcomes matter greatly, but so does the role of luck, and those who happen to be lucky should not be overpraised. This is especially important for me as an observer, commentator, and analyst of society, both past and present. It should also inform me when I think about how others understand themselves and their society, in the past and the present. • The sense-making machinery of System 1 makes us see the world as more tidy, simple, predictable, and coherent than it really is. The illusion that one has understood the past feeds the further illusion that one can predict and control the future.
These illusions are comforting; they reduce the anxiety that we would experience if we allowed ourselves to acknowledge fully the uncertainties of existence and the complexity of the world. This is exemplified by the case studies of successful corporate leaders published in many journals and books. Stories of success and failure in business consistently exaggerate the impact of leadership style and management practice on firm financial outcomes, and thus their message is rarely as useful as it is presented to be. Because luck plays such a large role in determining the success of companies and other organizations, along with other internal and external factors, the quality of leadership and management practices cannot be inferred reliably from observations of success. o I should remember that System 1 makes us see the world as tidier, simpler, more predictable, and more coherent than it is, creating illusions of understanding the world and thus of controlling and predicting its future. This illusion of understanding functions to reduce our anxiety over the complexities and ambiguities of the world, while in actual reality not giving us an accurate view of the world as it was, is, and will be.


Chapter 20, The Illusion of Validity • The subjective confidence we have in our opinions reflects the coherence of the story we have constructed, while the amount and quality of the evidence do not count for much, because poor evidence can make a very good story that is easy to understand and process. For some of our most important beliefs we have no evidence at all, except that people we love and trust hold these beliefs. Subjective confidence in a judgment is not a reasoned evaluation of the actual probability that the judgment is correct. Confidence is a feeling, reflecting the coherence of the information and the cognitive ease of processing it. It is wise to take admissions of uncertainty seriously, but declarations of high confidence mainly tell you that an individual has constructed a coherent story in her mind, not necessarily that the story is true. To figure out to what extent the story is true, explore the amount and quality of the evidence on which the individual’s beliefs are based. o This should remind me to avoid trusting narratives and beliefs simply because they appear internally coherent and easy to process cognitively. Instead, I need to think about the amount and quality of actual evidence supporting them when I determine how likely they are to reflect reality. Furthermore, I need to remember that other people are highly likely to find credible those narratives and beliefs that are internally coherent, despite the possible lack of good and copious evidence for them. I need to remember that subjective confidence in a judgment is often based on the coherence of stories and beliefs, as opposed to the quality and amount of evidence. • The illusion of skill refers to the perception by experts that they have a specific skill in their area of expertise; the author uses stock traders as an example.
The diagnostic for the existence of any skill is the consistency of individual differences in achievement. For instance, the year-to-year correlation between the outcomes of mutual funds is very small, barely higher than zero. Nearly all stock pickers, whether they know it or not, are playing a game of chance. The subjective experience of traders is that they are making sensible and educated guesses in a situation of great uncertainty. In highly efficient markets, however, educated guesses are no more accurate than blind guesses. The illusion of skill is not only an individual aberration; it is deeply ingrained in the culture of the industry. Facts that challenge such basic assumptions are simply not absorbed, particularly in the case of statistical studies of performance, which provide base-rate information that people generally ignore when it clashes with their personal impressions from experience. More broadly, people can maintain an unshakable faith in any proposition, however absurd, when they are sustained by a community of like-minded believers. o I should remember this problem of the illusion of skill among experts, not only in stock picking but in other areas of life as well. I should also remember that people are not likely to accept that this is an illusion when doing so does not suit their interests, especially when they are surrounded by a community of like-minded believers, which reinforces faith in any proposition. The latter applies to any sort of belief. • Our tendency to construct and believe coherent narratives of the past makes it difficult for us to accept the limits of our forecasting ability. This is because of the role of chance in determining both past and future events, and our failure to account for chance in making forecasts.
In fact, studies have found that the experts with the most knowledge often give less accurate forecasts, because they tend to be overconfident in their abilities, fall for the illusion of skill, and give more specific and therefore less accurate forecasts. Sometimes those with some degree of knowledge, but not a high degree, give more accurate forecasts for this reason. Experts should be trusted more when they are complex thinkers, do not explain everything through one single theory, are able to admit that they are or could be wrong, and include the role of luck in their forecasts. o This should lead me to be wary of forecasts by experts, especially when these experts give very specific forecasts, explain everything through one single theory, do not exhibit complex and multivalent thinking, do not exhibit the ability to admit they might be wrong, and do not include the role of luck in the forecast.
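The consistency diagnostic mentioned above can be illustrated with a toy simulation (all figures invented): if results were pure chance, the correlation between one year's returns and the next would hover near zero, which is roughly what the book reports for real mutual funds.

```python
import random

# Toy check: purely random "annual returns" for 5,000 funds show essentially
# no year-to-year correlation, i.e. no persistent skill signal.
random.seed(7)
n = 5_000
year1 = [random.gauss(0.05, 0.15) for _ in range(n)]
year2 = [random.gauss(0.05, 0.15) for _ in range(n)]

def pearson(xs, ys):
    """Plain Pearson correlation coefficient."""
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

print(abs(pearson(year1, year2)) < 0.1)  # True: correlation is near zero
```

The logic runs in reverse for the diagnostic: observing a near-zero year-to-year correlation in real data is evidence that outcomes are dominated by chance rather than skill.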


Chapter 21, Intuitions Vs. Formulas • Numerous studies have shown that statistical formulas do better than most highly-qualified experts in predicting outcomes. Experts are often inferior to algorithms because experts try to be clever, think outside the box, and consider complex combinations of features in making their predictions. Complexity may work in the odd case, but more often than not it reduces validity; simple combinations of features do better in all but odd and outlier cases. Another reason for the inferiority of expert judgment is that humans are surprisingly inconsistent in making summary judgments of complex information. Because you have little direct knowledge of what goes on in your mind, you will never know that you might have made a different judgment or reached a different decision under very slightly different circumstances, due to priming and other factors. So to maximize predictive accuracy, final decisions should be left mostly to formulas, especially in low-validity environments. o This implies to me that statistical formulas should in most cases be trusted more than the judgments of experts, and that experts should be trained to rely on statistical formulas and override them only in odd, outlier cases of unusual circumstances. It also implies that the decisions of experts can change under slightly different circumstances, and I should keep this in mind for the future. • To hire the best possible person for a job, choose a few traits that are prerequisites for success in the position. The traits you choose should be as independent of each other as possible, and you should be able to assess them reliably by asking a few factual questions. Next, make a list of those questions for each trait and think about how you will score it on a 1 to 5 scale. To avoid halo effects, you must collect information on one trait at a time and score each before you move on to the next one.
To evaluate each candidate, add up the scores. Try to resist your wish to invent exceptional circumstances to change the ranking. o This system should remind me of the effectiveness of using standardized questions to evaluate candidates for jobs and other positions.
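The structured-interview procedure above is mechanical enough to sketch in code. This is a minimal illustration with made-up traits, candidates, and scores, not anything from the book:

```python
# Sketch of the structured-interview scoring procedure described above:
# score each trait independently on a 1-to-5 scale, one trait at a time,
# then rank candidates by the simple sum. Traits and scores are invented.

def total_score(trait_scores):
    """Sum the 1-to-5 scores of independently assessed traits."""
    assert all(1 <= s <= 5 for s in trait_scores.values())
    return sum(trait_scores.values())

candidates = {
    "A": {"conscientiousness": 4, "technical skill": 5, "communication": 2},
    "B": {"conscientiousness": 3, "technical skill": 4, "communication": 5},
}

# Rank by total score, and resist the wish to invent exceptional
# circumstances that would change the ranking.
ranking = sorted(candidates, key=lambda c: total_score(candidates[c]), reverse=True)
print(ranking)  # ['B', 'A']
```

Scoring one trait at a time before moving to the next is what blocks the halo effect: an early impression on one trait cannot leak into the scores of the others.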

Chapter 22, Expert Intuition: When Can We Trust It? • Effective intuition is basically a function of associative memory, where the situation provides a cue, the cue gives the expert access to information stored in memory, and the information provides the outlines of an answer. Intuition is thus essentially recognition, based on previous knowledge, and is a norm of mental life. Such intuition develops from our early childhood, and applies to all areas of our lives. We develop intuition especially quickly when it is associated with emotional experiences, especially aversive emotional experiences such as fear. • Expert intuition is not quite the same as emotional learning. Expert intuition takes a long time to develop. The acquisition of expertise in complex tasks, such as reading, chess, or other complex activities, is intricate and slow, because it is usually not associated with highly charged emotional situations. Furthermore, expert intuition is usually not a single skill but rather a large collection of mini-skills. • Likewise, only certain experts have an opportunity to develop an effective intuition that is actually helpful for solving problems in their professional area of expertise. The two basic conditions for acquiring a skill are: 1) an environment that is sufficiently regular to be predictable; 2) an opportunity to learn these regularities through prolonged practice. Statistical algorithms in fact greatly outdo human beings in noisy environments for two reasons: they are more likely than human judges to detect weakly valid cues, and are much more likely to maintain a modest level of accuracy by using such cues consistently. o Based on this, I need to remember that I should evaluate expert predictions by the level of predictability within the expert's environment, and by their opportunities to learn its regularities through prolonged practice in that environment. 
I should trust the validity of the intuition of experts within such environments, and not trust the validity of experts in other environments, along with evaluating the opportunities for practice these experts have had. • The conditions for learning skills are most conducive in environments that provide immediate and unambiguous feedback on performance, and allow the individual learning the skill to adjust and improve. Thus, whether professionals have a chance to develop effective intuitive expertise depends essentially on the quality and speed of feedback and sufficient opportunity to practice. o I need to remember the conditions that are best suited for learning skills. I can consider structuring my teaching activities to encourage students to provide feedback for each other, in order to create conditions that are best suited for learning skills. • Expertise is not a single skill but a collection of skills, and the same professional may be highly expert in some of the tasks in her domain while remaining a novice in others. Furthermore, some aspects of any professional's tasks are much easier to learn than others, due to the immediate feedback and opportunity to practice associated with those skills. An expert may often have difficulty differentiating between the skills they are highly proficient in and the skills they are less proficient in, and may have high subjective confidence in their judgment and activities in both areas. This implies that the confidence experts have in their intuitions is not a reliable guide to their validity. Thus you should not trust anyone, including yourself, to tell you how much you should trust their judgment. Instead, when evaluating expert intuition, always consider whether the expert acquired their skills in a regular and predictable environment, and whether they had an adequate opportunity to practice those skills and receive regular feedback on them. 
o I should remember, for myself and for others, that expertise is a collection of skills, and both I and others might have strengths in one area but not in another. However, I and others might have difficulty distinguishing the areas in which we have real, effective expert skill from the areas in which we lack such skill, due to the lack of a predictable environment and/or insufficient opportunity for practice and/or the lack of sufficient and timely feedback. Thus, I and others might form highly subjectively confident evaluations of our skill and of a situation within our expert area that may not actually match reality. I should watch out for this, in myself and in others.

Chapter 23, The Outside View • The inside view of a specific situation refers to insiders focusing on their individual, specific circumstances and searching for evidence in their own experiences. By contrast, the outside view evaluates a specific situation by placing it within a broader reference class, making a ballpark evaluation based on that reference class and its base rate, and then using information specific to the particular situation to adjust the base rate. However, in the competition with the inside view, the outside view stands little chance, because insiders frequently prefer to trust their own instincts and evaluations rather than consider the actual base rate and reference class. o I should remember the difference between the inside view and the outside view. I should trust the outside view more than the inside view when evaluating specific situations. I should also remember that both I and others have an internal preference for the inside view, and add that to my understanding. • The planning fallacy refers to the failure to plan for anything but the best-case scenario, both in regard to time and to resource use. Planning fallacies can be mitigated by examining scenarios other than the best case, and by consulting the statistics of similar cases – in other words, considering the reference class and its base rate. Planning fallacies by organizations and individuals in terms of resource overruns often occur because people are unable to imagine how much their wishes will escalate over time, and they end up putting in much more resources than they would have if they had made a realistic plan from the start. Furthermore, the authors of unrealistic plans are often driven by their desire to get the plan approved, supported by the knowledge that projects are rarely abandoned unfinished merely because of cost overruns, which sometimes results in the sunk-cost fallacy. 
A specific technique to deal with the planning fallacy is reference class forecasting: 1) identify an appropriate reference class; 2) obtain the statistics of the reference class and use them to generate a baseline prediction; 3) use specific information about the case to adjust the baseline prediction. o I should be wary of the planning fallacy, both in regard to time and to resource use. I should remember to make efforts to mitigate my own planning fallacies in all aspects of planning by considering how much time and how many resources previous projects or activities of this kind took, and planning accordingly. I should also remember this for organizations I am involved in, and for my estimates of how other people and organizations do things. • Likewise, organizations face the challenge of controlling the tendency of executives competing for resources to present overly optimistic plans. In decision-making, the optimistic bias is a significant source of risk taking, and executives often make decisions based on delusional optimism rather than a rational weighing of prospects. o I should remember the challenges the optimistic bias poses for decision-making, namely its contribution to unwarranted risk taking.
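The three steps of reference class forecasting reduce to simple arithmetic. Here is a hypothetical sketch; the project durations and the adjustment are invented for illustration:

```python
# Hypothetical sketch of reference class forecasting. The durations of
# "similar past projects" below are invented for illustration.

past_durations = [12, 18, 24, 30, 36]  # months taken by projects in the reference class

# Steps 1-2: identify the reference class and use its statistics
# to generate a baseline prediction (here, the mean duration).
baseline = sum(past_durations) / len(past_durations)

# Step 3: adjust the baseline with case-specific information, e.g. an
# unusually experienced team might justify a modest downward adjustment.
adjustment = -2.0
forecast = baseline + adjustment
print(baseline, forecast)  # 24.0 22.0
```

The point of the ordering is that the base rate anchors the estimate first; case-specific optimism can only nudge it, not replace it.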

Chapter 24, The Engine of Capitalism • Many people have a strong optimistic bias. Most of us view the world as more benign than it really is, see our own attributes as more favorable than they really are, and see the goals we adopt as more achievable than they are likely to be; we also tend to exaggerate our ability to forecast the future, which fosters optimistic overconfidence. Most people genuinely believe that they are superior to most others on most desirable traits. o I should remember this about people and about myself, namely that we have a tendency to see the world as more benign than it really is, see ourselves as more favorable than we really are, and see ourselves as more likely to achieve our goals and forecast the future than is realistic. • Optimists generally tend to be cheerful, happy, and therefore popular, resilient, healthier, etc. The blessings of optimism are offered mostly to individuals who are only mildly biased, and who are able to accentuate the positive without losing track of reality. Optimistic individuals play a disproportionate role in shaping our lives. Their decisions make a difference, as they are the inventors, the entrepreneurs, the political and military leaders, and the economic managers. They got to where they are by seeking challenges and taking risks. They are talented and they have been lucky, almost certainly luckier than they acknowledge. Their confidence in their future success sustains a positive mood that helps them obtain resources from others, raise the morale of their employees, and enhance their prospects of prevailing. So, the people who have the greatest influence on the lives of others are likely to be optimistic and overconfident, and to take more risks than they realize. Optimistic bias thus plays a role, and sometimes the dominant role, in whether individuals or institutions voluntarily take on significant risks. 
Optimism is therefore stubborn and costly, for individuals and for institutions, in addition to its benefits. o I should remember the implications of these points. Namely, the people who are in leadership positions tend to be optimists, and therefore take more risks than they realize, be stubborn, and make costly decisions, in addition to their positive qualities. I should also remember these negative implications of an optimistic orientation, which I think I share and which I need to work on. The same goes for many other people who are optimists. • Optimistic CEOs tend to own more of their company's stock personally and to take excessive and unwarranted risks. In fact, research shows that leaders of enterprises make unsound bets not because they are gambling with other people's money, but rather when they personally have more at stake. The damage caused by overconfident company leaders grows when the business press treats them as celebrities, and is quite costly to stockholders. • We tend to engage in competition neglect, meaning that we focus on what we want to do and can do while neglecting the plans and skills of others. People tend to be overly optimistic about their relative standing in any activity in which they do moderately well, like driving. People also tend to overestimate the extent to which their fate is in their own hands. o For me, the implication of these points is that I need to remember the extent to which others are also competing with me, and not discount their skills and abilities when evaluating the probability of success of any project that involves interaction with, and especially competition with, others. I also need to remember that others will tend to neglect the skills and plans of everyone but themselves. • Overconfidence is another manifestation of WYSIATI. When we think about something, we naturally turn to the information that comes to mind and construct a coherent story that makes sense. 
We do not allow for the information that does not come to mind, because we cannot know it. Despite this problematic tendency, optimism is highly valued, socially and in the marketplace, since people and firms reward the providers of dangerously misleading optimistic information more than they reward truth tellers. Firms and individuals desire confident and optimistic styles, rather than an unbiased appreciation of uncertainty. o I need to remember that it is more effective to be wary of high confidence that is based on uncertain information, to avoid trusting those who express such confidence, and instead to reward those who express appropriate uncertainty. I should also remember that this is not how the situation works for most people, and use this knowledge in my analysis. • The effects of high optimism on decision-making are mixed, but the contribution of optimism to good implementation is positive. The main benefit of optimism is resilience in the face of setbacks. The optimistic style involves taking credit for successes but little blame for failures. o I need to remember the benefits of optimism for implementation, namely resilience in the face of setbacks. I should remember that the optimistic style of taking credit for successes but little blame for failures, while it may be helpful for an individual person, is not the kind of behavior I want to cultivate, and can be poisonous for organizations. So, I should encourage within myself resilience in the face of setbacks, but not the style of taking credit for successes while accepting little blame for failures, at least in my internal self and private world. The optimistic style, however, may be functional for me in my professional interactions, since that seems to be how many people get ahead. 
I should also remember the benefits of both resilience in the face of setbacks and the optimistic style of taking credit for successes and little blame for failures in my analysis of situations and the way other people interact with each other. • Organizations may be better able to tame optimism than individuals are. The best idea for organizations to tame optimism is something called a premortem. The procedure is as follows: when the organization has almost come to an important decision but has not formally committed itself, gather for a brief session a group of individuals who are knowledgeable about the decision. Discuss the following: “Imagine that we are a year into the future. We implemented the plan as it now exists. The outcome was a disaster. Please take 5 to 10 minutes to write a brief history of that disaster.” This procedure has two main advantages: it overcomes the groupthink that affects many teams once a decision appears to have been made, and it unleashes the imagination of knowledgeable individuals in a much-needed direction. The main virtue of a premortem is that it legitimizes doubts, so that they are not seen as evidence of flawed loyalty to the team and its leaders. Furthermore, it encourages even supporters of the decision to search for possible threats that they had not considered earlier. o I think this is a brilliant idea and something to adopt for individuals, organizations, enterprises, and government bodies. I need to promote the idea of a premortem widely and apply it to my life, to organizations in which I am involved, to academic institutions, and elsewhere as well.

Chapter 25, Bernoulli’s Errors • Most people dislike risk, and if they are offered a choice between a gamble and an amount equal to its expected value, they will pick the sure thing. In fact, most people are risk averse, and risk-averse decision-makers will choose a sure thing that is less than the expected value, meaning that they pay a premium to avoid the uncertainty. • People experience pleasure and happiness, or disappointment and sadness, over their current conditions always in reference to their previous condition, the reference point. This is called reference dependence, and it is ubiquitous in sensation and perception. The same state of wealth will be seen as pleasant or unpleasant depending on the previous state of wealth. The same sound will be experienced as very loud or very quiet depending on the previous sound. o I need to remember the importance of reference points and reference dependence in explaining people’s actions. People always experience pleasure or disappointment relative to their previous reference points and the strength of the change from the previous reference point. The same can be said about expectations, namely that people base their behavior on their expectations, and they are pleased when their expectations are met and displeased when they are not. So it is all about reference points and expectations. I should remember that this applies not only to myself but to all other people as well, and include it in my analyses of whatever situation I face. • Bernoulli’s theory informs the traditional approach to economics. This approach is based on utility, and is basically a model in which people’s attitudes to wealth correspond to the percentage change in their state of wealth. Utility theory assumes that people only care about the utility to which they could put a certain wealth, and makes all of its assumptions based on this premise. 
The author criticizes this theory for lacking an understanding of the importance of reference points, and of evaluating people’s economic and risk behavior based on their reference points. o I agree with the author. • There is a weakness in the scholarly mind that the author terms theory-induced blindness: once you have accepted a theory and used it as a tool in your thinking, it is extraordinarily difficult to notice its flaws. If you come upon an observation that does not seem to fit the model, you assume that there must be a perfectly good explanation that you are somehow missing, one that does not require revising the model itself. The tendency is to give the theory the benefit of the doubt, trusting the community of experts who have accepted it. When overcoming theory-induced blindness, you know it was a serious case when the idea you got rid of seems not only false but absurd, and when you can no longer reconstruct why you failed to see the obvious. o I need to see, for myself and others, the problem of theory-induced blindness and how it can impede my understanding of the world, and also how other scholars understand the world. I also think that this idea of theory-induced blindness can be applied to non-scholarly activities, namely wherever certain mental models of the world are assumed to describe the world and all behavior that does not fit is ignored. I am sure that I myself am guilty of this behavior, and I need to work on myself to figure out how to keep the mental models I have of the world from giving me an unrealistic view of it. I also need to remember that others have unrealistic views of the world based on their mental models.

Chapter 26, Prospect Theory • Prospect theory introduces the concept of reference points to changes of wealth, arguing that one’s previous reference point will determine one’s attitude toward changes of wealth and risk. Prospect theory criticizes utility theory. There are three cognitive features at the heart of prospect theory: 1) Evaluation by economic agents is relative to a neutral reference point, sometimes referred to as an “adaptation level.” Outcomes that are better than the reference point are perceived as gains, and those worse than the reference point are losses. 2) The principle of diminishing sensitivity applies to both sensory dimensions and changes of wealth. 3) The third key element is loss aversion. o It makes a great deal of sense to me that people’s evaluation of their current wealth depends on the change in their wealth from the previous reference point. This speaks to the importance of evaluating both my previous reference points and the reference points of others in evaluating how satisfied they will be with a change in their circumstances. I should also remember how this functions more broadly, namely people’s sadness or happiness about their current circumstances relative to their previous circumstances. • People tend to be loss averse, requiring about 1.5 to 2.5 times the reward to take a risk (all this applies to sums of money that are not going to powerfully influence people’s lifestyle, which is what the vast majority of everyday economic interactions are about). In mixed gambles, where both a gain and a loss are possible, loss aversion causes extremely risk-averse choices. In bad choices, where a sure loss is compared to a larger loss that is merely probable, diminishing sensitivity causes risk seeking. 
Namely, rather than take a sure bet that they will lose a smaller amount, people are willing to risk losing a lot in order to preserve the possibility of not losing anything at all, because people generally tend to be loss averse. Moreover, there is a diminishing marginal cost to losing every additional dollar. Prospect theory describes actual human beings, who are guided by the immediate emotional impact of gains and losses, not by the long-term prospects of wealth and utility that guide the “rational” individual presumed by mainstream economic theory. Still, the author describes how prospect theory has some blind spots: namely, it does not deal well with changing expectations over time that set up a tentative reference point, and it does not deal well with disappointment or regret. o I need to remember that people tend to be risk averse for gains, requiring about 1.5 to 2.5 times the reward to take a risk, but that this causes risk-seeking behavior when people are faced with losses. Namely, risk aversion is overcome by loss aversion. More broadly, this points to the importance of remembering that the immediate emotional impact of gains and losses guides people’s decision-making within the marketplace, as opposed to more long-term decisions and orientations. I should remember to apply this to myself and to be more realistic in my economic decision-making about when to take risks and how to deal with losses. I should also remember how other people will tend to behave in my analysis of situations.
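The three features of prospect theory (reference dependence, diminishing sensitivity, loss aversion) can be written down as a value function. The functional form and the parameters below are the commonly cited Tversky–Kahneman estimates (alpha = beta = 0.88, lambda = 2.25), used here only as an illustrative assumption; note that lambda falls inside the 1.5-to-2.5 loss-aversion range mentioned above:

```python
# Sketch of a prospect-theory value function. Outcomes x are gains or
# losses relative to the reference point, not total wealth. Parameters
# are the commonly cited Tversky-Kahneman estimates (an assumption here).

def value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Subjective value of an outcome x relative to the reference point."""
    if x >= 0:
        return x ** alpha             # concave for gains: diminishing sensitivity
    return -lam * ((-x) ** beta)      # steeper for losses: loss aversion

# A $100 loss hurts roughly 2.25 times as much as a $100 gain pleases:
print(value(100))   # ≈ 57.5
print(value(-100))  # ≈ -129.5
```

Concavity over gains also reproduces risk aversion for gains: under these assumed parameters, a sure $50 is worth more than half the value of $100, so it beats a 50% chance at $100.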

Chapter 27, The Endowment Effect • Generally, the disadvantages of a change loom larger than its advantages, inducing a bias that favors the status quo. Loss aversion implies that choices are strongly biased in favor of the reference situation. However, tastes are not fixed, and once a new reference point is established, it becomes the new orientation that determines how people react. o I should remember that, for myself and others, loss aversion implies a bias toward the status quo, the reference situation. However, tastes do change, and new reference points will establish a new status quo, which will again be hard to change. • The endowment effect refers to people perceiving goods that they own as worth more than goods that they don’t. Indeed, owning the good creates a specific reference point, triggering loss aversion, which causes somebody to demand more money to give up a good than they would pay to acquire it. This applies to goods that are intended to be used by the person who owns them. o I should remember the endowment effect, and see how it applies to my life. I probably do perceive consumer goods that I own as more valuable than those consumer goods that I do not own, even though I may not consciously recognize it right now. I should also remember that this will apply to other people as well.

Chapter 28, Bad Events • In human thinking, the negative trumps the positive in many ways. Loss aversion is one of many manifestations of this broad negativity dominance, with human beings generally driven more strongly to avoid losses than to achieve gains. This generally applies to the reference point of the status quo, but the reference point can also be a goal in the future. Not achieving the goal is perceived as a loss while exceeding the goal is a gain, and the aversion to failing to reach the goal is generally much stronger than the desire to exceed it. So setting a goal and not achieving it can be perceived as a loss, which indicates the need to adjust goal-setting toward more process-oriented goals rather than outcome-driven goals. o I should remember that in human thinking, including my own, the negative tends to trump the positive. This applies to many situations in society, and I need to remember that in my analysis. I should also remember the benefits of setting process goals rather than outcome goals. Certainly, this is especially beneficial for Asya and other loss-averse people. • Loss aversion also creates an asymmetry that makes agreements difficult to reach. The concessions you make to me are my gains, but they are your losses, and therefore cause you much more pain than they give me pleasure, and vice versa. Many of the messages that negotiators exchange are attempts to communicate a reference point and provide an anchor to the other side, with some of these messages insincere. Because negotiators are influenced by a norm of reciprocity, a concession that is presented as painful calls for an equally painful, and perhaps equally insincere, concession from the other side. o I should remember how negotiations function, namely that negotiators perceive their concessions as more important than the gains they get from the other side, which makes reaching agreement especially challenging, especially with insincerity in play. 
• Loss aversion is a powerful conservative force that favors minimal changes from the status quo in the lives of both institutions and individuals. The fact that people fight harder to defend against losses than to achieve gains makes it challenging to reform the status quo, creating a status quo bias. When people try to change themselves, they inherently need to give up or lose a part of themselves, and this part is very visible while the gains to be made are not, causing a combination of loss aversion and attention bias to undermine efforts at change. The same dynamics impact any organization or institution attempting reform. Generally, plans for reform bring broad benefits, with overall more winners but some losers. The losers will tend to fight harder and more actively because they are losing, as compared to the winners, especially when the winners are not well organized and united because they do not yet see the benefits of winning. This creates outcomes that are biased in favor of the losers as opposed to the potential winners. o I should remember the negative impact of loss aversion on my life and the lives of others. Loss aversion favors the status quo, as opposed to more optimal situations. Loss aversion undermines the efforts of individuals and organizations to reform themselves, which is relevant to me when I try to reform myself and my situation, and also in my analysis of the situations around me. • The basic principle by which people evaluate fairness in economic transactions revolves around the current reference point. Anything that violates the reference point is perceived as unfair, a violation of an entitlement that must not be infringed. So a firm is perceived as unfairly infringing on the rights of its customers when it exploits situations to raise its prices. However, a firm is perceived as acting fairly if it raises prices because its own costs have increased and it needs to maintain a profit. 
Studies have shown that employers who violate rules of fairness are punished by reduced productivity, and merchants who follow unfair pricing policies can expect to lose sales. o Remember, for myself and others, that the reference point provides an anchor by which we evaluate economic and other situations. Loss aversion plays a key role here, as we perceive changes in the reference point as fair only when they are necessary for firms to avoid losses; changes made to increase gains are perceived as unfair and exploitative. This is something to remember in evaluating how others perceive situations.

Chapter 29, The Fourfold Pattern • Whenever you form a global evaluation of a complex subject, you assign weights to its characteristics, which means that some characteristics influence your assessment more than others do. The possibility effect causes highly unlikely outcomes to be weighted disproportionately more than they deserve according to probability. With gains, we tend to overweight the possibility of small gains and pay far more than the expected value for them, as with lottery tickets. With losses, because of the possibility effect, we tend to overweight small risks and are willing to pay far more than expected value to eliminate them altogether. The certainty effect refers to people underweighting outcomes that are almost certain in comparison to outcomes that are certain. With gains, we tend to overweight the move from almost certain to certain situations, and pay far more than the expected value for it. With losses, because of the certainty effect, we tend to underweight the risk of a larger loss in comparison to a sure loss. The combination of the certainty effect and the possibility effect at the two ends of the probability scale is accompanied by inadequate sensitivity to intermediate probabilities. o I need to remember that I, like others, am vulnerable to the possibility effect of overweighting the move from zero to a small probability, and to the certainty effect of overweighting the move from almost certain to certain outcomes, and that I am inadequately sensitive to the transitions in the probabilities between the two endpoints. I should evaluate decisions based on the actual probability, not my intuitive feeling about it. I should also remember that others are vulnerable to these effects, and evaluate the behavior of others accordingly in my analysis. 
• The fourfold pattern of preferences is a core achievement of prospect theory: it predicts how people will behave based on the possibility effect and the certainty effect. o High probability (certainty effect), gains: a 95% chance to win $10,000. Fear of disappointment. RISK AVERSE. Accept an unfavorable settlement. o High probability (certainty effect), losses: a 95% chance to lose $10,000. Hope to avoid the loss. RISK SEEKING. Reject a favorable settlement. o Low probability (possibility effect), gains: a 5% chance to win $10,000. Hope of a large gain. RISK SEEKING. Reject a favorable settlement. o Low probability (possibility effect), losses: a 5% chance to lose $10,000. Fear of a large loss. RISK AVERSE. Accept an unfavorable settlement. o The high-probability-gains case indicates that people are averse to risk when they consider prospects with a substantial chance of achieving a large gain, and are willing to accept less than the expected value of a gamble to lock in a sure gain. o The low-probability-gains case explains why lotteries are popular: when the top prize is large, ticket buyers appear indifferent to the fact that their chance of winning is minuscule. Of course, what people acquire with the ticket is more than a chance to win; it is the right to dream pleasantly of winning. o The low-probability-losses case is where insurance is bought, with people willing to pay much more for insurance than its expected value to eliminate the worry and purchase peace of mind. o The high-probability-losses case indicates that people facing a sure loss versus a gamble on a larger loss are willing to be risk seekers, due to diminishing sensitivity and the certainty effect. Many unfortunate human situations unfold in this cell, for example people who, facing only bad options, take desperate gambles that turn manageable failures into disasters. o When you take a long view of many similar decisions, these deviations are likely to be costly for those who engage in such decision-making, leading to suboptimal outcomes. 
o This fourfold pattern describes well the kind of decision-making that I should try to avoid in order to make optimal decisions. I should remember that over a series of many decisions, falling prey to the possibility effect and the certainty effect leads to substantially inferior outcomes. I should also remember that the people I analyze in my everyday life and scholarship generally do not know this, and engage in such inferior decision-making.
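The overweighting of small probabilities and underweighting of near-certain ones that generate the fourfold pattern can be illustrated with a probability weighting function. The functional form and gamma = 0.61 below are the Tversky–Kahneman (1992) estimates for gains, used here as an illustrative assumption:

```python
# Sketch of a probability weighting function of the kind that produces
# the fourfold pattern. The form and gamma = 0.61 are the commonly cited
# Tversky-Kahneman estimates for gains (an assumption for illustration).

def weight(p, gamma=0.61):
    """Decision weight attached to an outcome of probability p."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

# Possibility effect: a 5% chance gets far more weight than it deserves.
print(round(weight(0.05), 2))  # 0.13

# Certainty effect: a 95% chance gets noticeably less weight than 0.95.
print(round(weight(0.95), 2))  # 0.79
```

Combined with loss aversion, overweighting at p = 0.05 and underweighting at p = 0.95 reproduce the four risk attitudes in the pattern above: lotteries, insurance, cautious settlements, and desperate gambles.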

Chapter 30, Rare Events • Availability cascades arise due to rare and spectacular events such as terrorism. A vivid image is reinforced by media attention and frequent conversations, becoming highly accessible in memory. The consequent emotional arousal is associative, automatic, and uncontrolled. The emotion is not only disproportionate to the probability, but is also insensitive to the exact level of probability. Emotion and vividness influence fluency, availability, cognitive ease, and judgments of probability, which accounts for our excessive response to spectacular rare events. At the same time, some rare events are ignored in our memory. Still, overall people overestimate the probabilities of unlikely events, and overweight unlikely events in their decisions. Our mind naturally focuses spontaneously on whatever is novel or unusual, with this event becoming focal. o I need to remember that I, and the others around me, are vulnerable to availability cascades and to overweighting spectacular and vivid events. This is especially important in my analysis, as it is something that powerfully impacts society. • Denominator neglect refers to the fact that if your attention is drawn to a positive framing of the probability of winning, you neglect the probability of losing. This allows unscrupulous communicators to focus people's attention either on winning or on losing, biasing their evaluation of the situation. For example, low-probability events are much more heavily weighted when described in terms of relative frequencies than when stated in the more abstract terms of percentages. It takes an exceptionally active System 2 to generate alternative formulations of the one you see in front of you and to discover that they evoke different responses. Likewise, more vivid descriptions produce a higher decision weight for the same probability. o Remember that I and others are vulnerable to framing effects in descriptions of probabilities. 
Namely, the perception of probability depends on whether attention is drawn to winning or losing, to frequency vs. percentage, and to vividness. Remember this also in my analysis of situations. • Another problematic element arises in choices about probabilities from experience as opposed to from description, with people underweighting the possibility of an event occurring if they did not experience it themselves. o This is useful for me to keep in mind: I should not underestimate events that I did not experience, and I should remember that others will tend to underestimate events they did not experience, when decisions are made based upon experience as opposed to description.

Chapter 31, Risk Policies • Over time, it is costly to be risk averse for gains and risk seeking for losses. These attitudes make you willing to pay a premium to obtain a sure gain rather than face a gamble, and also willing to pay a premium above expected value to avoid a sure loss. As a result, your overall decision-making is likely to result in inferior, suboptimal outcomes. • Such decision-making results from narrow framing, which sees every situation as an individual situation, instead of a broader framing that considers each situation as only one among many similar situations. If we frame our thinking broadly, and see each situation where we can gain or lose as part of a broader framework, we can make more rational choices that lead to optimal outcomes. To do so, we need to see each individual scenario as part of a series of small scenarios throughout our lives, and acknowledge to ourselves that what we want to do is optimize the long-term outcome, as opposed to letting our emotional reaction to short-term gains and losses determine our decision-making. A useful mantra to deal with the emotional responses to losses in this case is “you win a few, you lose a few.” Such an approach is similar to an outside view, which shifts the focus from the specifics of the current situation to the probabilities of outcomes in similar situations. If we decide to pursue an optimal outcome across all situations, as opposed to orienting toward gains or losses in each individual scenario, we can adopt this approach of broad framing to guide our decision-making for each individual situation. This broad framing approach is often called a risk policy, where the pain of small losses is ameliorated by the knowledge that overall, your approach is most likely to be advantageous for your life. o I think this broad framing and risk policy approach is very wise, and something that I should follow in my own life. 
I should adopt for myself the approach of setting each individual decision within the broader framework of my life, and making decisions that are most likely to benefit me over the long term, as opposed to worrying over the immediate consequences of losses and gains. If I acknowledge to myself that my broad framing and risk policy will most likely be advantageous to me over the long term, I can deal with the consequences of short-term losses. I should also remember that most other people do not have this approach toward broad framing and risk policies, and analyze and deal with situations in my life and in my scholarship accordingly. • Regarding investing, this risk policy approach is the best option to follow. It is most effective to reduce the frequency with which you check how well your investments are doing if you are a long-term investor. Once a quarter is enough. This approach works when you do not concentrate your investments in a single industry whose stocks can go bad together, when a possible loss does not cause you to worry about your total wealth, and when it is not applied to longshots. o I should remember to adopt this approach to investing.
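The arithmetic behind a risk policy can be sketched directly. Assume, as an illustrative example of my own rather than one from the book, a favorable gamble: a 50% chance to win $200 and a 50% chance to lose $100. Viewed narrowly, each play carries a 50% chance of loss; viewed broadly, over many independent plays a net loss becomes extremely unlikely:

```python
from math import comb

def p_loss(n, win=200, lose=100, p=0.5):
    """Probability that n independent plays of a favorable gamble
    (win `win` with prob p, else lose `lose`) end with a net loss,
    computed exactly from the binomial distribution."""
    total = 0.0
    for k in range(n + 1):                    # k = number of wins
        if k * win - (n - k) * lose < 0:      # net outcome is negative
            total += comb(n, k) * p**k * (1 - p)**(n - k)
    return total

print(p_loss(1))    # single play: 0.5 chance of a loss
print(p_loss(100))  # a hundred plays: loss is vanishingly unlikely
```

This is why treating each decision as one of many similar decisions, rather than in isolation, changes which choice is rational.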

Chapter 32, Keeping Score • Money is a proxy for points on a scale of self-regard and achievement, which shapes our preferences and motivates our actions, like the incentives provided in social environments. As a result, we refuse to cut losses when doing so would admit failure, we are biased against actions that could lead to regret, and we draw an illusory but sharp distinction between omission and commission, not doing and doing, because the sense of responsibility is greater for the second than for the first. Such mental accounting is a form of narrow framing that keeps things under control and manageable by our mind, but, together with the emotions that we as human beings attach to this mental accounting, such narrow framing leads to inferior outcomes. • One example is the disposition effect, where investors prefer to sell winners rather than losers, because they prefer to close a mental account with a gain rather than a loss. However, research shows that recent winners tend to go up further for a short period of time, and selling winners therefore causes people to lose out on gains. It is more effective to analyze the situation and predict which of your stocks will do better in the future, and sell the ones that will do worse, regardless of whether they have gained or lost. o I should remember not to sell recent winners or hold recent losers automatically, but to evaluate how the stocks will do in the future when deciding what to sell or what to buy. • The sunk-cost fallacy refers to making inferior decisions because you have already invested some resources into them. This has to do with human beings wanting to close a mental account with a gain, not a loss. In organizations, it might actually be rational for an individual manager to invest further resources into a failing project, in order to delay its visible failure. 
This is not rational for the organization, but it fits the individual incentives of the manager to avoid being associated with a failing project, and it is a broader example of the agency problem within an organization. The sunk-cost fallacy keeps people for too long in poor jobs, poor relationships, and poor projects. o I should remember my own tendency, and that of others, to fall for the sunk-cost fallacy. I should also remember this when analyzing situations. • Regret is an emotion, and it is also a punishment that we administer to ourselves. Fear of regret is a factor in many of the decisions that people make. Decision-makers know that they are prone to regret, and anticipation of that painful emotion plays a part in many decisions. However, such anticipation of regret can lead to many inferior outcomes. o I should remember not to let the anticipation of regret guide my behavior, but to make my decisions based on my goals and future orientation, and to frame them broadly rather than narrowly. • People expect to have a stronger emotional reaction, including regret, to an outcome that is produced by action than to the same outcome when it is produced by inaction. The key here is not the difference between action and inaction, but the distinction between the default option and actions that deviate from the default. When you deviate from the default, you can easily imagine the norm, and if the deviation is associated with bad consequences, the discrepancy between the two can be the source of painful emotions. It is the departure from the default that produces the most regret. This asymmetry in the risk of regret favors conventional and risk-averse choices. o I should remember that people, myself included, tend to perceive commission as more problematic than omission in most cases when evaluating situations, although doing and not doing are both choices that people make. 
I should also remember that people perceive deviation from the norm as more problematic and risky than sticking with the norm, due to risk aversion owing to the greater weight given to losses than to gains. Remember to try to avoid these emotions when making my own decisions. Also remember that other people will tend to have these emotions and let them guide their decision-making, when I analyze situations in my life and my research. • A sense of responsibility increases one's sensitivity to losses rather than gains. This applies especially strongly to areas of life other than money, for example health, safety, relationships, and others. This is because such areas have moral implications that money does not. o I need to remember that human thinking is particularly susceptible to heightened sensitivity to losses rather than gains in questions having to do with morality, such as health, safety, relationships, and others. I myself am prone to this thinking, and need to try to have a broad framing, future orientation, and goal-achievement perspective on these questions. I should also remember that other people are likely to have these fallacies, and analyze situations accordingly. • To inoculate yourself against regret, it is useful to prepare in advance for it. Namely, before making decisions, think about the possibility of regret over this decision. If the decision does go badly, you can remember that you considered the possibility of regret, and this will tamp down the emotion of regret. Similarly, you can tamp down hindsight bias in this manner. Namely, recall that you thought through your decision and made the best one possible given the information available, or that you made the decision very casually, or gave somebody else the responsibility for making it. Doing so will help tamp down your hindsight bias of believing that you should have made a different decision. 
o I should adopt the strategies of considering regret before making decisions and of making decisions thoughtfully, either through careful consideration or by handing them off to others, to tamp down feelings of regret and hindsight bias afterward.

Chapter 33, Reversals • We normally experience life in the between-subjects mode, in which contrasting alternatives that might change your mind are absent, and as a result WYSIATI predominates. As a consequence, the beliefs you endorse when you reflect about morality do not necessarily govern your emotional reactions, and the moral intuitions that come to your mind in different situations are not internally consistent. The consequence of this is preference reversals when situations are framed in different ways, because different framings cause you to pay attention to different elements of the situation. This is why a broader evaluation, a broader framing, is more functional, in that it demands a more careful and systematic evaluation that turns on System 2. This relates to the principle that rationality is best served by broader and more comprehensive frames, and joint evaluation is more effective than single evaluation. o I should remember that narrow framing leads to attention being drawn to specific elements of a situation and to emotional, System 1 evaluations, by myself and others. For myself, I should remember to adopt broad framing and broad evaluations, and evaluate any situation within a broader context in order to make the most effective decisions. I should also remember that people tend not to do this, and analyze situations accordingly.

Chapter 34, Frames and Reality • Meaning relates to what happens in your associative machinery as you try to understand a situation. This makes it impossible for humans to be fully rational, because the associative machinery is not inherently rational. Our associations are generally bound by the frame within which they are presented, and most people avoid trying to reframe situations in order to see them from a different perspective that might be more effective for making the decision. Thus, it might be most effective to reframe situations in order to make the most effective decisions, keeping in mind that broader frames and inclusive mental accounts generally lead to more rational decisions. o Remember that I, and others as well, are susceptible to framing biases, and remember that I need to reframe decision problems with which I am presented in order to get at the most effective solutions. Also think about how I present decision problems to other people, and make sure to frame them in the way that I want them to be framed. Also remember that other people will tend not to reframe problems for themselves, and keep this in mind when analyzing situations.

Chapter 35, Two Selves • There is a fundamental difference between the experiencing self and the remembering self. The experiencing self is the one that answers the question: “How do I feel now?” The remembering self is the one that answers the question: “What was that experience like?” Memories are all we get to keep from our experiences, and the only perspective that we can adopt as we think about our lives is therefore that of the remembering self. Because of this, we tend to confuse experience with the memory of it, and it is this substitution that causes our understanding of our past experiences to change over time. The experiencing self does not have a voice, and the remembering self is sometimes wrong, but it is the one that keeps score and governs what we learn from living and makes decisions. This suggests that we should orient our experiences toward the remembering self, not the experiencing self. o This is a powerful concept, and one I should bring into my life. Remember that I can only define myself by my memories. Therefore, I should orient toward the future, toward my remembering self in the future, and I should orient my life so that the remembering self has the best experience based on my memories. • There are two key elements of memory about any experience that determine our overall impression of that experience. The first is the peak-end rule: our global evaluation of an experience depends on the average of the level reported at its most intense moment and at its end. The second is duration neglect: we generally underweight the duration of the experience. o I should remember to apply this to my life, and make sure that my peak experience and end experience are optimal, and also remember that duration does not matter as much as it intuitively seems to. So it would be better to have more shorter experiences than fewer longer ones. • Tastes and decisions are shaped by memories, and the memories can be wrong. 
This evidence presents a profound challenge to the idea of human beings as rational agents that have consistent preferences and know how to maximize them, which is a cornerstone of the rational-agent model. Inconsistency is built into the design of our minds. We have strong preferences about the duration of our experiences of pain and pleasure. But our memory, which is a function of System 1, has evolved to represent the most intense moment of an episode of pain or pleasure and the feelings when the episode was at its end, and to underweight duration. o I should remember, for myself and for others, that my tastes and decisions are not consistent, and evolve over time. I should remember this when analyzing situations as well.
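The peak-end rule and duration neglect can be expressed as a toy model. Assuming we score each moment of an episode (negative values for pain), remembered utility is the average of the most intense moment and the final moment, with total duration playing no role. This is a stylized reading of the rule as stated above, in the spirit of Kahneman's cold-hand experiment, where adding a milder final stretch made a longer painful episode remembered as less bad:

```python
def remembered_utility(moments):
    """Stylized peak-end rule: memory averages the most intense
    moment and the last moment; total duration is ignored."""
    peak = max(moments, key=abs)   # most intense moment (by magnitude)
    end = moments[-1]
    return (peak + end) / 2

short_episode = [-8, -8, -8]          # three units of strong pain
long_episode = [-8, -8, -8, -4, -4]   # same pain plus a milder ending

print(remembered_utility(short_episode))  # -8.0
print(remembered_utility(long_episode))   # -6.0: remembered as less bad
```

The longer episode contains strictly more total pain, yet its milder ending makes it the preferred memory, which is exactly the inconsistency the chapter describes.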

Chapter 36, Life as a Story • The remembering self works by composing stories and keeping them for future reference. Duration neglect is normal in a story, and the ending often defines the story. The same core features appear in the rules of narratives and in the memories of our experiences. Caring for people often takes the form of concern for the quality of their stories, not for their feelings and thoughts. Most important, we all care intensely for the narrative of our own life and very much want it to be a good story. o I should remember that the remembering self works by composing stories, and that our memories, including mine, are characterized by duration neglect, focus on the peak experience, and focus on the ending. I should also remember this for my analysis of situations.

Chapter 37, Experienced Well-Being • Attention is key to well-being. Our emotional state is largely determined by what we pay attention to. We are normally focused on the current activity and the immediate environment, and draw pleasure and pain from what is happening at the moment. One implication is that we need to arrange our lives so that our daily experiences are filled with more moments of pleasure and joy, as opposed to stress and pain. Another implication is that we need to pay more attention than we currently do to the pleasurable moments in our memories of our lives. Social contact, physical health, and situational factors combine to be most important in determining our moment-by-moment happiness, and it is only a slight exaggeration to say that happiness is the experience of spending time with people you love and who love you. o For me, the implication here is the need to pay attention to the pleasant moments in my life and to focus my attention on those pleasant moments and memories. At the same time, I should strive to increase the number of pleasant moments in my life and decrease the number of unpleasant ones. Finally, I need to remember the importance of social contact and physical health for ensuring more pleasant moments in my life. • While being poor undermines happiness, beyond a certain middle-class income the amount of money does not on average improve experienced well-being, although it does improve perceptions of life satisfaction. Higher income undoubtedly permits the purchase of many pleasures in life. However, studies show that higher income is associated with a reduced ability to enjoy the less expensive pleasures of life. o To me, this indicates the importance of focusing on enjoying the less expensive pleasures in life, and not worrying about the fact that my income will not match that of many of my friends. 
After all, I am at the income level that permits me to enjoy a fully satisfying and happy life, and greater income would not cause me to have better experienced well-being. Thus, I should be happy with what I have and enjoy it, while spending my time and effort on social contact, physical health, and other things that will improve my experienced well-being and my memories of life.

Chapter 38, Thinking about Life • Affective forecasting refers to the predictions that people make about their emotional states in the future. • Setting goals can be beneficial for achieving those goals, but goals that are very high can also lead to dissatisfaction and unhappiness through not achieving them. So it might be most beneficial to set realistic and achievable, yet challenging, goals. o I need to remember that setting goals that are too high may result in unhappiness, and keep this in mind when recommending goal-setting strategies to others. • The focusing illusion refers to the fact that any aspect of life to which attention is directed will loom overly large in a global evaluation, substituting that part of life for the whole of it. In other words, nothing in life is as important as you think it is while you are thinking about it. o I am sure that I am prone to the focusing illusion, and I need to remember that nothing in life is as important as I think it is while I am thinking about it. Remember that anything on which I am focused will determine my broader evaluation of a topic in an excessive way, and try to correct for this illusion. Moreover, remember that other people will function and think in the same way, and remember this when analyzing any situation. • Over time, people can adapt to almost anything. Adaptation to a new situation consists in large part of thinking less and less about it. The main exceptions are chronic pain, loud noise, and severe depression. The first two are biological signals that attract attention, while the latter involves a self-reinforcing cycle of negative thoughts. o I should remember for myself that people tend to adapt to new situations by thinking less and less about the nature of those situations. 
So, if I want to keep a situation novel, because it is pleasurable for me and I want to direct my attention to it, I should think about the situation more than I naturally would. Likewise, remember that other people will tend to adapt to situations, and analyze any scenario accordingly. • The word miswanting refers to bad choices that arise because of errors of affective forecasting. The focusing illusion is a rich source of miswanting. For example, many people think they will enjoy purchasing a new car more than social activities with other people. However, after purchasing a new car, you think less and less about the purchase, whereas you will always pay attention to the social interactions to which you committed yourself. Furthermore, an experience such as gaining a new skill will pay off long term if you practice that skill. The focusing illusion creates a bias in favor of goods and experiences that are initially exciting, even if they will eventually lose their appeal. Time is neglected, causing experiences that will retain their attention value in the long term to be appreciated less than they deserve to be. o For me, this implies the value of focusing more than I naturally would on goods and experiences that are initially less exciting but have long-term lasting value, as I know that these goods will be appreciated more by me over the long term than I anticipate right now.

Conclusions • A central fact of our existence is that time is the ultimate limiting factor, but the remembering self ignores that reality. The neglect of duration combined with the peak-end rule causes a bias that favors a short period of intense joy over a long period of moderate happiness, and makes us fear a short period of intense suffering more than we fear a much longer period of moderate pain. Duration neglect also makes us prone to accept a longer period of mild unpleasantness because the ending will be better, and it favors giving up an opportunity for a long period of happiness if it is likely to have a poor ending. The logic of duration weighting is compelling, yet it cannot be the only determining factor, because individuals identify with the remembering self and care about their own story. On the other hand, ignoring what actually happens in people's lives and focusing exclusively on what they think and remember about their lives is also not tenable. The remembering self and the experiencing self must both be considered, because their interests do not always coincide, a question that matters not only for philosophers but also for policymakers. o I should remember for myself that the interests of the remembering self and the experiencing self, both of which I am at the same time, do not always coincide. I should remember that the remembering self is the one whose experiences will matter most, but the experiencing self also deserves significant consideration. Knowing this, I can most benefit from creating a life for myself in which the experiencing self has the best life, and the remembering self perceives this as the optimal situation, because it cares about the experiencing self and the memories formed by the experiencing self. 
• The book Nudge by Richard Thaler and Cass Sunstein promotes a position of libertarian paternalism, in which the state and other institutions are allowed and encouraged to nudge people to make decisions that serve the long-term interests of the people themselves. This occurs by creating default options which are likeliest to benefit people. The default is generally perceived as the norm, and deviating from it is an act of commission, which requires more effortful deliberation, entails more responsibility, and is likelier to evoke regret than doing nothing. These are powerful forces that guide people who are not sure what to do. Moreover, people need protection from others who deliberately exploit the weaknesses in their thinking and feeling. It is a good sign that many of the recommendations in Nudge have received pushback from firms whose profits might suffer if their customers were directed into default options that better serve the customers' interests. o Based on this description of libertarian paternalism, I generally welcome this position, because human beings are certainly not fully rational agents, and have many weaknesses in their thinking and feeling. I think we need to draw an appropriate and careful line in how people are influenced by governments and other large institutions. At the same time, I need to remember that corporations are also large institutions that influence people, and they generally do not have the best interests of people in mind, unlike governments, which are at least supposed to represent the population in democratic states. • System 1 registers the cognitive ease with which it processes information, but it does not generate a warning signal when that information becomes unreliable. Intuitive answers come to mind quickly and confidently, whether they originate from skills or from heuristic shortcuts. 
The only sure recourse is to slow down and attempt to reconstruct the answer using System 2, which is difficult because that system is lazy and the reconstruction takes effort. o I should remember to check with myself whether the intuitive answers that come to mind are a result of my skill in a specific area or of heuristic shortcuts. I should also remember the importance of slowing down and turning on System 2 when analyzing a situation where I want to be sure of getting the answer right. • The way to block errors that originate in System 1 is simple in principle: recognize the signs that you are in a cognitive minefield, slow down, and ask for reinforcement from System 2. Unfortunately, this sensible procedure is least likely to be applied when it is needed most. Moreover, it is much easier to identify a minefield when you observe others wandering into it than when you are about to do so yourself. o I should remember that it is most difficult to turn on System 2 when it is most needed, because I am distracted and emotional. I should remember to rely on friends and allies to help me turn on my System 2, as it is much easier to see the cognitive mistakes made by others than by oneself. • Organizations are better than individuals when it comes to avoiding errors, because they naturally process things more slowly and have the power to impose orderly procedures. Organizations can institute and enforce the application of useful checklists, as well as more elaborate exercises such as the premortem. Organizations can also encourage a culture in which people watch out for one another as they approach minefields. Organizations can produce the best decisions by carefully considering the framing of the problem that is to be solved, the collection of relevant information leading to a decision, and reflection and review afterward. An organization that seeks to improve its decision products should routinely look for efficiency improvements in each of these. 
Constant quality control is an alternative to the wholesale reviews of processes that organizations commonly undertake in the wake of disasters. o I should remember this and apply it to organizations in which I participate and which I want to do well.