72 Highlights
We are prone to overestimate how much we understand about the world and to underestimate the role of chance in events.
We normally avoid mental overload by dividing our tasks into multiple easy steps, committing intermediate results to long-term memory or to paper rather than to an easily overloaded working memory.
System 1 has more influence on behavior when System 2 is busy, and it has a sweet tooth. People who are cognitively busy are also more likely to make selfish choices, use sexist language, and make superficial judgments in social situations.
Intelligence is not only the ability to reason; it is also the ability to find relevant material in memory and to deploy attention when needed.
Stanovich argues that high intelligence does not make people immune to biases. Another ability is involved, which he labels rationality.
As cognitive scientists have emphasized in recent years, cognition is embodied; you think with your body, not only with your brain.
Reciprocal links are common in the associative network. For example, being amused tends to make you smile, and smiling tends to make you feel amused.
The evidence of priming studies suggests that reminding people of their mortality increases the appeal of authoritarian ideas, which may become reassuring in the context of the terror of death.
A reliable way to make people believe in falsehoods is frequent repetition, because familiarity is not easily distinguished from truth.
The familiarity of one phrase in the statement sufficed to make the whole statement feel familiar, and therefore true.
The general principle is that anything you can do to reduce cognitive strain will help, so you should first maximize legibility.
If you care about being thought credible and intelligent, do not use complex language where simpler language will do.
The mere exposure effect does not depend on the conscious experience of familiarity. In fact, the effect does not depend on consciousness at all: it occurs even when the repeated words or pictures are shown so quickly that the observers never become aware of having seen them.
Mood evidently affects the operation of System 1: when we are uncomfortable and unhappy, we lose touch with our intuition.
A happy mood loosens the control of System 2 over performance: when in a good mood, people become more intuitive and more creative but also less vigilant and more prone to logical errors.
Jumping to conclusions is risky when the situation is unfamiliar, the stakes are high, and there is no time to collect more information.
System 1 does not keep track of alternatives that it rejects, or even of the fact that there were alternatives. Conscious doubt is not in the repertoire of System 1.
System 1 is gullible and biased to believe, System 2 is in charge of doubting and unbelieving, but System 2 is sometimes busy, and often lazy. Indeed, there is evidence that people are more likely to be influenced by empty persuasive messages, such as commercials, when they are tired and depleted.
Sequence matters, however, because the halo effect increases the weight of first impressions, sometimes to the point that subsequent information is mostly wasted.
A simple rule can help: before an issue is discussed, all members of the committee should be asked to write a very brief summary of their position. This procedure makes good use of the value of the diversity of knowledge and opinion in the group. The standard practice of open discussion gives too much weight to the opinions of those who speak early and assertively, causing others to line up behind them.
Framing effects: Different ways of presenting the same information often evoke different emotions.
The technical definition of heuristic is a simple procedure that helps find adequate, though often imperfect, answers to difficult questions. The word comes from the same root as eureka.
The present state of mind looms very large when people evaluate their happiness.
In the context of attitudes, however, System 2 is more of an apologist for the emotions of System 1 than a critic of those emotions — an endorser rather than an enforcer. Its search for information and arguments is mostly constrained to information that is consistent with existing beliefs, not with an intention to examine them. An active, coherence-seeking System 1 suggests solutions to an undemanding System 2.
...we are prone to exaggerate the consistency and coherence of what we see. The exaggerated faith of researchers in what can be learned from a few observations is closely related to the halo effect, the sense we often get that we know and understand a person about whom we actually know very little.
Our predilection for causal thinking exposes us to serious mistakes in evaluating the randomness of truly random events.
We are far too willing to reject the belief that much of what we see in life is random.
Among the basic features of System 1 is its ability to set expectations and to be surprised when these expectations are violated.
The world in our heads is not a precise replica of reality; our expectations about the frequency of events are distorted by the prevalence and emotional intensity of the messages to which we are exposed.
“Risk” does not exist “out there,” independent of our minds and culture, waiting to be measured. Human beings have invented the concept of “risk” to help them understand and cope with the dangers and uncertainties of life. Although these dangers are real, there is no such thing as “real risk” or “objective risk.”
The combination of probability neglect with the social mechanisms of availability cascades inevitably leads to gross exaggeration of minor threats, sometimes with important consequences.
Democracy is inevitably messy, in part because the availability and affect heuristics that guide citizens’ beliefs and attitudes are inevitably biased, even if they generally point in the right direction.
There is a deep gap between our thinking about statistics and our thinking about individual cases. Statistical results with a causal interpretation have a stronger effect on our thinking than noncausal information.
In The Black Swan, Taleb introduced the notion of a narrative fallacy to describe how flawed stories of the past shape our views of the world and our expectations for the future. Narrative fallacies arise inevitably from our continuous attempt to make sense of the world.
A compelling narrative fosters an illusion of inevitability.
The human mind does not deal well with nonevents. The fact that many of the important events that did occur involve choices further tempts you to exaggerate the role of skill and underestimate the part that luck played in the outcome.
The mind that makes up narratives about the past is a sense-making organ. When an unpredicted event occurs, we immediately adjust our view of the world to accommodate the surprise.
A general limitation of the human mind is its imperfect ability to reconstruct past states of knowledge, or beliefs that have changed. Once you adopt a new view of the world (or of any part of it), you immediately lose much of your ability to recall what you used to believe before your mind changed.
Hindsight bias has pernicious effects on the evaluations of decision makers. It leads observers to assess the quality of a decision not by whether the process was sound but by whether its outcome was good or bad.
The sense-making machinery of System 1 makes us see the world as more tidy, simple, predictable, and coherent than it really is. The illusion that one has understood the past feeds the further illusion that one can predict and control the future. These illusions are comforting. They reduce the anxiety that we would experience if we allowed ourselves to fully acknowledge the uncertainties of existence.
Because of the halo effect, we get the causal relationship backward: we are prone to believe that the firm fails because its CEO is rigid, when the truth is that the CEO appears to be rigid because the firm is failing. This is how illusions of understanding are born.
“Let’s not fall for the outcome bias. This was a stupid decision even though it worked out well.”
The subjective experience of traders is that they are making sensible educated guesses in a situation of great uncertainty. In highly efficient markets, however, educated guesses are no more accurate than blind guesses.
The illusion that we understand the past fosters overconfidence in our ability to predict the future.
The idea that large historical events are determined by luck is profoundly shocking, although it is demonstrably true.
Meehl and other proponents of algorithms have argued strongly that it is unethical to rely on intuitive judgments for important decisions if an algorithm is available that will make fewer mistakes.
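A classic example of such a simple algorithm, which the book itself discusses, is Virginia Apgar's newborn score: five signs, each rated 0, 1, or 2, added into a 0-10 total. The sketch below is a minimal illustration of that style of equal-weight scoring rule; the function names and interpretation labels are mine, not the book's.

```python
# Minimal sketch of an Apgar-style equal-weight scoring algorithm.
# Each sign is rated 0, 1, or 2; the final score is a plain sum.
# Names and threshold labels are illustrative, not quoted from the book.

APGAR_SIGNS = ("appearance", "pulse", "grimace", "activity", "respiration")

def apgar_score(ratings: dict[str, int]) -> int:
    """Sum five 0-2 ratings into a 0-10 score."""
    for sign in APGAR_SIGNS:
        if ratings[sign] not in (0, 1, 2):
            raise ValueError(f"{sign} must be rated 0, 1, or 2")
    return sum(ratings[sign] for sign in APGAR_SIGNS)

def interpret(score: int) -> str:
    if score >= 7:
        return "reassuring"
    if score >= 4:
        return "moderately abnormal"
    return "low"

# Example: a newborn rated 2 on every sign except skin color.
print(interpret(apgar_score({
    "appearance": 1, "pulse": 2, "grimace": 2,
    "activity": 2, "respiration": 2,
})))  # -> "reassuring" (score 9)
```

The point of Meehl's argument is that such a rule applies the same weights every time, which is exactly what intuitive judges fail to do.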
The situation has provided a cue; this cue has given the expert access to information stored in memory, and the information provides the answer. Intuition is nothing more and nothing less than recognition. (Herbert Simon)
The proper way to elicit information from a group is not by starting with a public discussion but by confidentially collecting each person’s judgment.
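A minimal sketch of that elicitation procedure, assuming judgments are numeric estimates: each member answers privately before anything is shared, so early, assertive speakers cannot anchor the rest. Aggregating by median is my illustrative choice, not something the book specifies.

```python
import statistics

def elicit_group_estimate(members: list[str], ask) -> float:
    """Collect each member's judgment privately, then aggregate.

    `ask` is any callable returning one member's independent estimate;
    no estimate is revealed or discussed until all are collected.
    """
    private_estimates = [ask(m) for m in members]  # no discussion yet
    return statistics.median(private_estimates)    # robust to a few extremes

# Example with canned answers standing in for real judgments.
canned = {"ana": 120.0, "ben": 95.0, "carla": 400.0, "dev": 110.0}
print(elicit_group_estimate(list(canned), canned.get))  # -> 115.0
```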
The chances that a small business will survive for five years in the United States are about 35%. But the individuals who open such businesses do not believe that the statistics apply to them. A survey found that American entrepreneurs tend to believe they are in a promising line of business: their average estimate of the chances of success for “any business like yours” was 60% — almost double the true value.
The evidence suggests that optimism is widespread, stubborn, and costly.
An unbiased appreciation of uncertainty is a cornerstone of rationality — but it is not what people and organizations want. Extreme uncertainty is paralyzing under dangerous circumstances, and the admission that one is merely guessing is especially unacceptable when the stakes are high. Acting on pretended knowledge is often the preferred solution.
I have yet to meet a successful scientist who lacks the ability to exaggerate the importance of what he or she is doing, and I believe that someone who lacks a delusional sense of significance will wilt in the face of repeated experiences of multiple small failures and rare successes, the fate of most researchers.
Theory-induced blindness: once you have accepted a theory and used it as a tool in your thinking, it is extraordinarily difficult to notice its flaws.
The fundamental ideas of prospect theory are that reference points exist, and that losses loom larger than corresponding gains.
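The standard formalization of these two ideas comes from Tversky and Kahneman's 1992 cumulative prospect theory paper rather than from the book's text: a value function defined over gains and losses relative to the reference point, with a loss-aversion coefficient $\lambda$.

```latex
v(x) =
\begin{cases}
x^{\alpha} & x \ge 0 \\
-\lambda\,(-x)^{\alpha} & x < 0
\end{cases}
\qquad \text{with estimated } \alpha \approx 0.88,\ \lambda \approx 2.25.
```

A $\lambda$ greater than 1 is what makes losses loom larger than corresponding gains; the estimate of roughly 2 matches the "about twice as much" figure quoted later in these highlights.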
The brains of humans and other animals contain a mechanism that is designed to give priority to bad news.
The psychologist Paul Rozin, an expert on disgust, observed that a single cockroach will completely wreck the appeal of a bowl of cherries, but a cherry will do nothing at all for a bowl of cockroaches. As he points out, the negative trumps the positive in many ways, and loss aversion is one of many manifestations of a broad negativity dominance.
Many of the messages that negotiators exchange in the course of bargaining are attempts to communicate a reference point and provide an anchor to the other side.
Because of the possibility effect, we tend to overweight small risks and are willing to pay far more than expected value to eliminate them altogether.
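The possibility effect is usually modeled with a probability weighting function that inflates small probabilities. One common form, again a standard assumption from the 1992 paper rather than the book's wording, is:

```latex
w(p) = \frac{p^{\gamma}}{\left(p^{\gamma} + (1-p)^{\gamma}\right)^{1/\gamma}},
\qquad \gamma \approx 0.61 \text{ for gains},
```

so a 1% probability receives a decision weight of roughly 5.5% (the figure in the book's own table of decision weights), which is why we pay far more than expected value to eliminate small risks entirely.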
Closely following daily fluctuations is a losing proposition, because the pain of the frequent small losses exceeds the pleasure of the equally frequent small gains.
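A minimal simulation of why frequent checking hurts, assuming the roughly 2:1 loss weighting above; the return parameters are illustrative, not from the book.

```python
import random

LAMBDA = 2.0  # losses weighted about twice as much as gains

def felt_value(change: float) -> float:
    """Psychological value of one observed gain or loss."""
    return change if change >= 0 else LAMBDA * change

random.seed(0)
# One simulated year of small daily moves with a slight upward drift.
daily = [random.gauss(0.0004, 0.01) for _ in range(252)]

daily_pain = sum(felt_value(r) for r in daily)   # checking every day
annual_pain = felt_value(sum(daily))             # checking once a year

print(f"felt value, daily checking:  {daily_pain:+.3f}")
print(f"felt value, annual checking: {annual_pain:+.3f}")
# Daily checking typically comes out negative even when the year is up:
# each small loss counts double, and there are many of them.
```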
The decision to invest additional resources in a losing account, when better investments are available, is known as the sunk-cost fallacy, a costly mistake that is observed in decisions large and small.
The sunk-cost fallacy keeps people for too long in poor jobs, unhappy marriages, and unpromising research projects.
Losses are weighted about twice as much as gains in several contexts: choice between gambles, the endowment effect, and reactions to price changes.
Hsee’s evaluability hypothesis: the number of entries in a dictionary is given no weight in single evaluation, because the numbers are not “evaluable” on their own; only in joint evaluation, with another dictionary to compare against, does the number become evaluable and decisive.
Losses evoke stronger negative feelings than costs.
Our preferences are about framed problems, and our moral intuitions are about descriptions, not about substance.
Memories are all we get to keep from our experience of living, and the only perspective that we can adopt as we think about our lives is therefore that of the remembering self.
The experiencing self does not have a voice. The remembering self is sometimes wrong, but it is the one that keeps score and governs what we learn from living, and it is the one that makes decisions. What we learn from the past is to maximize the qualities of our future memories, not necessarily of our future experience. This is the tyranny of the remembering self.
Tastes and decisions are shaped by memories, and the memories can be wrong.
A story is about significant events and memorable moments, not about time passing. Duration neglect is normal in a story, and the ending often defines its character.
In intuitive evaluation of entire lives as well as brief episodes, peaks and ends matter but duration does not.
During the last ten years we have learned many new facts about happiness. But we have also learned that the word happiness does not have a simple meaning and should not be used as if it does. Sometimes scientific progress leaves us more puzzled than we were before.
The neglect of duration combined with the peak-end rule causes a bias that favors a short period of intense joy over a long period of moderate happiness.
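A minimal sketch of that bias, assuming the remembered value of an episode is the average of its peak and ending moments (the peak-end rule) while experienced value is the sum over time; the numbers are illustrative.

```python
def remembered(moments: list[float]) -> float:
    """Peak-end rule: memory keeps the peak and the ending, not duration."""
    return (max(moments) + moments[-1]) / 2

def experienced(moments: list[float]) -> float:
    """Experiencing self: total of moment-by-moment happiness."""
    return sum(moments)

short_intense = [8.0, 9.0, 8.0]   # a brief period of intense joy
long_moderate = [5.0] * 60        # a long period of moderate happiness

print(remembered(short_intense), remembered(long_moderate))    # 8.5 vs 5.0
print(experienced(short_intense), experienced(long_moderate))  # 25.0 vs 300.0
# The remembering self prefers the short episode; the experiencing self
# collected twelve times as much happiness from the long one.
```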