Introduction
The hope for informed gossip is that
there are distinctive patterns in the errors people make. Systematic errors are
known as biases, and they recur predictably in particular circumstances. When
the handsome and confident speaker bounds onto the stage, for example, you can
anticipate that the audience will judge his comments more favourably than he
deserves. The availability of a diagnostic label for this bias, the halo effect, makes it easier to anticipate, recognize, and understand.
Most impressions and thoughts arise in
your conscious experience without your knowing how they got there. The mental
work that produces impressions, intuitions, and many decisions goes on in
silence in our mind.
Much of the discussion in this book is
about biases of intuition. As we navigate our lives, we normally allow
ourselves to be guided by impressions and feelings, and the confidence we have in
our intuitive beliefs and preferences is usually justified. But not always. We
are often confident even when we are wrong, and an objective observer is more
likely to detect our errors than we are.
So this is the author's aim: to improve the ability to identify and understand errors of judgement and choice, in others
and eventually in ourselves, by providing a richer and more precise language to
discuss them. In at least some cases, an accurate diagnosis may suggest an
intervention to limit the damage that bad judgements and choices often cause.
Origins
This book presents the author's current
understanding of judgement and decision making, which has been shaped by
psychological discoveries of recent decades. However, he traces the central
ideas to a lucky day in 1969 when he asked a colleague, Amos Tversky, to speak as a guest at a seminar. Tversky's talk addressed the question: are people good intuitive statisticians? He reported that the answer was a qualified yes. They had a
lively debate in the seminar and ultimately concluded that a qualified no was a
better answer.
Amos and he enjoyed the exchange and
concluded that intuitive statistics was an interesting topic and that it would be fun to explore together.
They found that expert colleagues, just
like them, greatly exaggerated the likelihood that the original result of an
experiment would be successfully replicated even with a small sample. They also
gave very poor advice to a fictitious graduate student about the number of
observations she needed to collect. Even statisticians were not good intuitive
statisticians.
A theory was emerging in their minds about the role of resemblance in predictions. They tested it, for example, by describing a person whose characteristics matched the stereotype of a librarian and asking people to guess his profession. Most respondents judged him to be a librarian, ignoring relevant statistics, such as the fact that there are far more farmers than librarians in the US. They used resemblance as a simplifying heuristic to make a difficult judgement, and this caused predictable biases in their predictions.
People also judged the frequency of categories by the ease with which instances came to mind, the availability heuristic.
Social scientists of the 1970s assumed that people are generally rational and that departures from rationality are caused by emotions. Amos and Daniel documented systematic errors in the thinking of normal people and traced these errors to the design of the machinery of cognition rather than to corruption of thought by emotion.
The authors included the full text of the questions they had asked respondents in their articles, so the questions served as demonstrations of cognitive biases to the readers themselves.
Then they shifted to decision making
under uncertainty: Prospect Theory.
Where we are now:
Our everyday intuitive abilities are no
less marvelous than the striking insights of an experienced firefighter or physician, only more common. According to the great Herbert Simon: "Intuition is nothing more or less than recognition."
The essence of intuitive heuristics is
that when we are faced with a difficult question, we often answer an easier one
instead, usually without noticing the substitution.
When the spontaneous search for an intuitive answer fails, we switch to a slower, more deliberate form of thinking.
The author describes mental life through the metaphor of two agents, System 1 and System 2, which produce fast and slow thinking respectively.
What comes next in the book:
Part 1: The two-systems approach to judgement and choice, covering the automatic operations of System 1 and the controlled operations of System 2.
Part 2 explains that statistical thinking requires holding many things in mind at once, which System 1 is not designed to do.
Part 3 describes how overconfidence is
fed by the illusory certainty of hindsight.
Part 4 describes the unfortunate tendency to treat problems in isolation, and the influence of framing effects.
Part 5 describes two selves, the experiencing self and the remembering self, and how memory can mislead us, so that of two painful experiences the more painful one can leave the better memory.
A concluding chapter explores the implications of three distinctions: the experiencing self versus the remembering self, the agents of classical economics versus those of behavioral economics, and the automatic System 1 versus the effortful System 2.