The only way to substantiate a belief is to try to disprove it. Sutherland was 65 when he wrote Irrationality: The Enemy Within, and nearing the end of a prestigious career in psychology research. By the end of the book, Sutherland claims to have defined and demonstrated a long list of distinct cognitive errors humans are prone to. They reminded me of the lists of weird and wonderful Christian heresies I was familiar with from years of reading early Christian history.
I have witnessed at first hand the utter irrationality of small and medium-sized children; and I have seen so many examples of corporate conformity, the avoidance of embarrassment, unwillingness to speak up, deferral to authority, and general mismanagement in the civil service that, upon rereading the book, hardly any of it came as a surprise.
Rational thinking is most likely to lead to the conclusion that is correct, given the information available at the time, with the obvious rider that, as new information comes to light, you should be prepared to change your mind. Rational action is that which is most likely to achieve your goals. But in order to achieve this, you have to have clearly defined goals.
Few people think hard about their goals, and even fewer think hard about the many possible consequences of their actions. As part of his Philosophy A-Level my son was given a useful handout listing about fifty logical fallacies. But logical fallacies are not the same as cognitive biases.
A logical fallacy stems from an error in a logical argument; it is specific and easy to identify and correct. Cognitive bias derives from deep-rooted, thought-processing errors which themselves stem from problems with memory, attention, self-awareness, mental strategy and other mental mistakes.
Cognitive biases are, in most cases, far harder to acknowledge and often very difficult to correct. Thus most people are not intellectually equipped to understand the most reliable type of information available to human beings — data in the form of numbers.
Instead they tend to make decisions based on a wide range of faulty and irrational psychological biases. At a more disruptive level, people might be alcoholics, drug addicts, or prey to a range of other obsessive behaviours, not to mention suffering from a wide range of mental illnesses or conditions which undermine any attempt at rational decision-making, such as stress, anxiety or, at the other end of the spectrum, depression and loss of interest.
The functional limits of consciousness. Numerous experiments have shown that human beings have a limited capacity to process information. Given that people (a) rarely have a sufficient understanding of the relevant statistical data to begin with, and (b) lack the RAM capacity to process all the data required to make the optimum decision, it is no surprise that most of us fall back on all manner of more limited, non-statistical biases and prejudices when it comes to making decisions.
Humans want to feel safe, secure, calm, and in control. This is fair enough, but it does mean that people have a way of blocking out any kind of information which threatens them. Jumping to conclusions before we have enough evidence is a basic and universal error. Many errors are due to people reaching for the most available explanation, using the first thing that comes to mind, and not taking the time to investigate further and make a proper, rational survey of the information.
Many experiments show that you can unconsciously bias people by planting ideas, words or images in their minds which then directly affect decisions they take hours later about supposedly unconnected issues.
The news media is hard-wired to publicise shocking and startling stories, which permanently misleads the reading public. One tourist eaten by a shark in Australia eclipses the fact that you are far more likely to die in a car crash than be eaten by a shark. Experimenters read out a list of men's and women's names to two groups without telling them that there were exactly 25 men and 25 women, and asked them to guess the ratio of the sexes.
If the list included some famous men, the group was influenced to think there were more men; if it included some famous women, the group thought there were more women than men. The prominence effect. The entire advertising industry is based on the availability error: it invents straplines, catchphrases and jingles designed to pop to the front of your mind when you consider any type of product, making those products, in other words, super available.
I liked the attribution to the availability error of the well-known fact that retailers price goods at just under the nearest pound: the lower figure is simply more available. Numerous studies have shown that the availability error is hugely increased under stress. In stressful situations, in an accident for example, people fixate on the first solution that comes to mind and refuse to budge. The primacy effect. First impressions count: interviewers make up their minds about a candidate for a job in the first minute of an interview and then spend the rest of the time collecting data to confirm that first impression.
Two groups were asked to estimate the population of Turkey: one group was asked whether it was bigger than 5 million, the other whether it was less than 65 million, before guessing the actual figure. Each group's guesses were dragged towards its anchor, and both were wrong. It is 80 million.
In an experiment examiners were given identical answers, but some in terrible handwriting, some in beautifully clear handwriting. The samples with clear handwriting consistently scored higher marks, despite the identical factual content of the scripts. This explains a good deal of racial prejudice: (a) immigrants stand out, and (b) a handful of immigrants commit egregious behaviour; associating the two is therefore a classic example of illusory correlation. What is missing is taking into account all the negative examples.
Pay attention to negative cases. Stereotypes: people tend to notice anything which supports their existing opinions. Projection: people project onto neutral phenomena patterns and meanings they are familiar with or which bolster their beliefs. This is compounded by sheer pig-headedness, and exacerbated by the fact that groups are more likely to make irrational decisions than individuals are. In one experiment, when asked to do a quiz at the end of the session, participants showed a marked tendency to remember the expected adjective and forget the unexpected one.
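The advice to pay attention to negative cases can be made concrete. Judging whether group membership and bad behaviour really go together requires all four cells of a 2 x 2 table, not just the salient, memorable one. A minimal sketch (the counts below are purely illustrative, not from the book):

```python
# Illustrative (made-up) counts: rows = group, columns = behaviour.
# The salient cell (minority, bad behaviour) is the one people remember;
# assessing correlation requires all four cells.
counts = {
    ("minority", "bad"): 10,      # salient, over-remembered
    ("minority", "good"): 990,
    ("majority", "bad"): 100,
    ("majority", "good"): 9900,
}

def rate(group):
    """Proportion of the group exhibiting the bad behaviour."""
    bad = counts[(group, "bad")]
    good = counts[(group, "good")]
    return bad / (bad + good)

print(rate("minority"))  # 0.01
print(rate("majority"))  # 0.01 -- identical rates: no real correlation
```

With these numbers the minority cell stands out simply because the group is smaller, yet the rate of bad behaviour is exactly the same in both groups: the apparent correlation is illusory.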
We marry people who share our opinions, we make friends with people who share our opinions, we agree with everyone in our circle on Facebook. Self-serving biases: when things go well, people take the credit; when things go badly, people blame external circumstances.
And rather than make a mistake in front of others, people would rather keep quiet and say nothing (in a meeting situation) or do nothing, if everyone else is doing nothing (in an action situation). Both of these avoidances feed into obedience and conformity. Obedience: the Milgram experiment showed that people will carry out atrocities at the bidding of an authoritative man in a white coat. Most of his subjects agreed to inflict life-threatening levels of electric shock on the victim, supposedly wired up in the next-door room and emitting blood-curdling faked screams of pain.
Obedience is behaving in a way ordered by an authority figure. Conformity is behaving in a way dictated by your peers. In Asch's famous conformity experiment, you are asked to say which of three lines on a card matches the length of a reference line; to your amazement, everyone else in the room (all of them stooges) chooses a line which is obviously wildly wrong, and a large proportion of subjects go along with the group. Nobody likes to admit, especially to themselves, that they are useless at taking decisions. Our inability to acknowledge our own errors, even to ourselves, is one of the most fundamental causes of irrationality.
People tend to seek confirmation of their current hypothesis, whereas they should be trying to disconfirm it. Subjects in an experiment watched two people holding an informal quiz: the first person made up questions based on what he knew and asked the second person, who, naturally enough, hardly got any of them right. Observers nevertheless rated the questioner as far more knowledgeable than the answerer, ignoring the advantage built into the roles.
We are quick to personalise and blame in a bid to turn others into monolithic entities which we can then define and control; this saves time and effort, and makes us feel safer and more secure, whereas the evidence is that all people are capable of a wide range of behaviours depending on the context and situation.
One attribute colours your view of a more complex whole. If someone else screws up, it is because they are just thick, lazy or useless. False consensus effect. Over-confidence that other people think and feel like us, that our beliefs and values are the norm; in my view one of the profound cultural errors of our time. For liberals, the correctness of their opinions, on universal health care, on Sarah Palin, on gay marriage, is self-evident. Anyone who has tried to argue the merits of such issues with liberals will surely recognize this attitude.
Liberals are pleased with themselves for thinking the way they do. In their view, the way they think is the way all right-thinking people should think. On matters of books and movies, they may give an inch, but if people have contrary opinions on political and social matters, it follows that the fault is with the others (Commentary magazine). If the project is a success, it was all due to my hard work and leadership.
You can see why this has an important evolutionary and psychological purpose. Our prototype is what we think is the most relevant or typical example of a particular event or object. This often happens around notions of randomness: people have a notion of what randomness should look like, i.e. patternless. But in fact plenty of random events or sequences arrange themselves into patterns we find meaningful.
Ask a selection of people which of three sets of six coin tosses (where H stands for heads, T for tails) is random. But of course all three are equally likely, or unlikely. Hindsight. In numerous experiments people have been asked to predict the outcome of an event, then questioned after the event about their predictions. Most people forget their inaccurate predictions and misremember that they were accurate. Overconfidence. Most professionals have been shown to overvalue their expertise, i.e. to be more confident in their judgements than their track record justifies.
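The coin-toss point is easy to verify: any specific sequence of six fair tosses, patterned-looking or not, has probability (1/2)^6 = 1/64. A minimal sketch (the three sequences below are illustrative stand-ins, not the ones Sutherland used):

```python
from itertools import product

# Any specific sequence of six fair coin tosses has probability (1/2)**6,
# however "random" or "patterned" it looks.
sequences = ["HHHHHH", "HTHTHT", "HTTHTH"]  # illustrative examples

def probability(seq, p_heads=0.5):
    """Probability of one exact sequence of independent tosses."""
    prob = 1.0
    for toss in seq:
        prob *= p_heads if toss == "H" else (1 - p_heads)
    return prob

for seq in sequences:
    print(seq, probability(seq))  # each is 0.015625, i.e. 1/64

# Sanity check: the 64 possible sequences exhaust the probability space.
total = sum(probability("".join(s)) for s in product("HT", repeat=6))
print(total)  # 1.0
```

Six heads in a row feels special only because we lump all the "mixed-looking" sequences together; as exact sequences, HHHHHH and HTTHTH are precisely as likely as each other.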
This is because the twin areas of probability and statistics are absolutely fraught with difficulty. Either you have been taught the correct techniques, understand them, and practise them regularly (and both books demonstrate that even experts make terrible mistakes in the handling of statistics and probability) or, like most of us, you have not. Since we live in a society whose public discourse is saturated with statistics, this is a serious handicap.
My suggestion that mistakes in handling statistics are not really the same as unconscious cognitive biases applies even more to the world of gambling. Gambling is a highly specialised and advanced form of probability applied to games which has been pored over by very clever people for centuries.
A practical point emerges from these examples: conditional probabilities routinely fool our intuitions, and the only way to avoid the false conclusions that follow is to draw a 2 x 2 box and work through the figures.
Here is a table of 1,000 women who had a mammogram because their doctors thought they had symptoms of breast cancer. Bearing in mind that a conditional probability says that, if X and Y are linked, then the chance of X given Y is such-and-such, most people leap from the test's accuracy to the conclusion that a positive result means you almost certainly have cancer. But this is wrong.
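The 2 x 2 box can be worked through in code. A minimal sketch with illustrative screening numbers (a 1% base rate, 90% sensitivity, 10% false-positive rate; these are stand-ins, not Sutherland's figures):

```python
# Illustrative numbers only -- not the figures from the book's table.
n = 1000
base_rate = 0.01        # P(cancer)
sensitivity = 0.90      # P(positive test | cancer)
false_positive = 0.10   # P(positive test | no cancer)

cancer = n * base_rate                      # 10 women
no_cancer = n - cancer                      # 990 women

# The four cells of the 2 x 2 box:
true_pos = cancer * sensitivity             # 9
false_neg = cancer - true_pos               # 1
false_pos = no_cancer * false_positive      # 99
true_neg = no_cancer - false_pos            # 891

# The conditional probability people get wrong:
p_cancer_given_positive = true_pos / (true_pos + false_pos)
print(round(p_cancer_given_positive, 3))  # 0.083 -- not 0.9
```

Even though the test catches 90% of real cancers, a woman with a positive result has only about an 8% chance of having cancer, because the 99 false positives from the large healthy group swamp the 9 true positives. That is exactly what working through the box, rather than trusting intuition, reveals.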
Reason to be cheerful
The implicit premise behind this column is that, each week, I say in effect: "Have a look at this, it's well worth reading." First published in the early 1990s, Irrationality proposes, and to any reasonable mind proves, that we are for the most part credulous fools who would do well, in most circumstances, to stop and think before we go and do something stupid; for stupid things are what we often end up doing, however much we congratulate ourselves on being rational animals. The book's conclusions would appear to be just as valid today as they were 15 years ago. Not that it is actually grim or depressing.
Irrationality: The Enemy Within