19 September 2010

Our own entrenched enemies of reason

Filed under: books, deception, evolution, intelligence, irrationality, psychology — David Wood @ 3:39 pm

I’m a pretty normal, observant guy.  If there were something as large as an elephant in that room, I would have seen it – sure as eggs are eggs.  I wouldn’t miss something as large as that.  So someone who says, afterwards, that there was an elephant there, must have some kind of screw loose, or some kind of twisted ulterior motivation.  Gosh, what kind of person are they?

Here’s another version of the same faulty line of reasoning:

I’m a pretty good police detective.  Over the years, I’ve developed the knack of knowing when people are telling the truth.  That’s what my experience has taught me.  I know when a confession is for real.  I don’t get things like that wrong.  So someone who says, afterwards, that the confession was forced, or that the criminal should get off on a technicality, must have some kind of screw loose, or some kind of twisted ulterior motivation.  Gosh, what kind of person are they?

And another:

I’m basically a moral person.  I don’t knowingly cause serious harm to my fellow human beings.  I don’t get things as badly wrong as that.  I’m not that kind of person.  So if undeniable evidence subsequently emerges that I really did seriously harm a group of people, well, these people must have deserved it.  They were part of a bad crowd.  I was actually doing society a favour.  Gosh, don’t you know, I’m one of the good guys.

Finally, consider this one:

I’m basically a savvy, intelligent person.  I don’t make major errors in reasoning.  If I take the time to investigate a religion and believe in it, I must be right.  All that investment of time and belief can’t have been wrong.  Perish the thought.  If that religion makes a prophecy – such as the end of the world on a certain date – then I must be right to believe it.  If the world subsequently appears not to have ended on that date, then it must have been our faith, and our actions, that saved the world after all.  Or maybe the world ended in an invisible, but more important way.  The kingdom of heaven has been established within. Either way, how right we were!

It can sometimes be fun to observe the self-delusions of the over-confident.  Psychologists talk about “cognitive dissonance”: the state that arises when someone’s deeply held beliefs appear to be contradicted by straightforward evidence.  That person is forced to hold two incompatible viewpoints in mind at the same time: I deeply believe X, but I seem to observe not-X.  Most people are troubled by this kind of dissonance.  It’s psychologically uncomfortable.  And because it can be hard for them to give up the underlying self-belief that “if I deeply believe X, I must have good reasons to do so”, it can lead them through outlandish contortions and illogical leaps to deny the straightforward evidence.  For them, rather than “seeing is believing”, the saying becomes inverted: “believing is seeing”.

As I said, it can be fun to see the daft things people have done, to resolve their cognitive dissonance in favour of maintaining their own belief in their own essential soundness, morality, judgement, and/or reasoning.  It can be especially fun to observe the mental gymnastics of people with fundamentalist religious and/or political faith, who refuse to accept plain facts that contradict their certainty.  The same goes for believers in alien abduction, for fanboys of particular mobile operating systems, and for lots more besides.

But this can also be a deadly serious topic:

  • It can result in wrongful imprisonments, with prosecutors unwilling to face up to the idea that their over-confidence was misplaced.  As a result, people spend many years of their lives unjustly incarcerated.
  • It can result in families being shattered under the pressures of false “repressed memories” of childhood abuse, seemingly “recovered” by hypnotists and subsequently passionately believed by the apparent victims.
  • It can split up previously happy couples, who end up being besotted, not with each other, but with dreadful ideas about each other (even though “there’s always two sides to a story”).
  • Perhaps worst of all, it can result in generations-long feuds and wars – such as the disastrous entrenched enmity of the Middle East – with each side staunchly holding onto the view “we’re the good guys, and anything we did to these other guys was justified”.

Above, I’ve retold some of the thoughts that occurred to me as I recently listened to the book “Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts”, by veteran social psychologists Carol Tavris and Elliot Aronson.  (See here for this book’s website.)  At first, I found the book to be a very pleasant intellectual voyage.  It described, time and again, experimental research that should undermine anyone’s over-confidence in their abilities to observe, remember, and reason.  (I’ll come back to that research in a moment.)  It reviewed real-life examples of cognitive dissonance – both personal examples and well-known historical examples.  So far, so good.  But the later chapters left me more and more sombre – and, frankly, more and more angry – as they explored horrific examples of miscarriages of justice (the miscarriage being subsequently demonstrated by the likes of DNA evidence), family breakups, and escalating conflicts and internecine violence.  All of this stemmed from faulty reasoning, brought on by self-justification (I’m not the kind of person who could make that kind of mistake) and by over-confidence in our own thinking skills.

Some of the same ground is covered in another recent book, “The Invisible Gorilla: And Other Ways Our Intuitions Deceive Us”, by Christopher Chabris and Daniel Simons.  (See here for the website accompanying this book.)  The gorilla in the title refers to the celebrated experiment in which viewers are asked to concentrate on one task – counting the number of passes made by a group of basketball players – and often totally fail to notice someone in a gorilla suit wandering through the crowd of players.  Gorilla?  What gorilla?  Don’t be stupid!  If there had been a gorilla there, I would have seen it, sure as eggs are eggs.

Chapter by chapter, “The Invisible Gorilla” reviews evidence that we tend to be over-confident in our own abilities to observe, remember, and reason.  The chapters cover:

  • Our bias to think we would surely observe anything large and important that happened
  • Our bias to think our memories are reliable
  • Our bias to think that people who express themselves confidently are more likely to be trustworthy
  • Our bias to think that we would give equal weight to evidence that contradicts our beliefs as to evidence that supports them (in reality, we search high and low for confirming evidence, and quickly seize on reasons to justify ignoring disconfirming evidence)
  • Our bias to think that correlation implies causation: that if event A is often followed by event B, then A must be the cause of B (see the sketch after this list)
  • Our bias to think there are quick fixes that will allow significant improvements in our thinking power – such as playing classical music to babies (an effect that has been systematically discredited)
  • Our bias to think we can do many things simultaneously (“multi-task”) without any individual task being affected detrimentally.
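
As a toy illustration of the correlation-causation bias above, here’s a minimal simulation sketch of my own – it’s not from either book, and the ice-cream/drowning scenario and all the numbers are purely hypothetical.  Hot weather drives both ice-cream sales and swimming accidents, so the two series end up strongly correlated even though neither one causes the other:

    import random

    # Toy simulation of a confounder: hot weather drives both ice-cream
    # sales and swimming (and hence drownings). Ice cream never causes
    # drowning, yet the two series correlate strongly.
    random.seed(42)

    def pearson(xs, ys):
        # Standard Pearson correlation coefficient.
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        vx = sum((x - mx) ** 2 for x in xs)
        vy = sum((y - my) ** 2 for y in ys)
        return cov / (vx ** 0.5 * vy ** 0.5)

    # One year of daily data; temperature is the hidden common cause.
    temps = [random.uniform(10, 35) for _ in range(365)]
    ice_cream = [3.0 * t + random.gauss(0, 10) for t in temps]   # sales driven by heat
    drownings = [0.2 * t + random.gauss(0, 1) for t in temps]    # swimming driven by heat

    print(f"correlation(ice cream, drownings) = {pearson(ice_cream, drownings):.2f}")

With these made-up noise levels the printed correlation comes out around 0.7–0.8 – a strong association produced entirely by the hidden common cause (temperature), with no causal link at all between the two measured series.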

All of these biases were probably useful to Homo sapiens at an earlier phase of our evolutionary history.  But in the complex society of the present day, they do us more harm than good.

Added together, the two books provide sobering material about our cognitive biases, and about the damage that all too often follows from us being unaware of these biases.

“Mistakes Were Made (But Not by Me)” adds the further insight that we tend to descend gradually into a state of gross over-confidence.  The book frequently returns to the metaphor of a pyramid.  Before we make a strong commitment, we are often open-minded: we could go in several different directions.  But once we start down any of the faces of the pyramid, it becomes harder and harder to turn back – and we move further and further away from people who, initially, were in the very same undecided state as us.  The more we follow a course of action, the greater our commitment to defend all the time and energy we’ve invested in that path.  I can’t have taken a wrong decision, because if I had, I would have wasted all that time and energy, and that’s not the kind of person I am.  So we invest even more time and energy, walking yet further down the pyramid of over-confidence, in order to maintain our own self-image.

At root, what’s going wrong here is what psychologists call self-justification.  Once upon a time, the word “pride” would have been used.  We can’t bear to realise that our own self-image is at fault, so we continue to take actions – often harmful actions – in support of that self-image.

The final chapters of both books offer hope.  They give examples of people who are able to break out of this spiral of self-justification.  It isn’t easy.

An important conclusion is that we should put greater focus on educating people about cognitive biases.  Knowing about a cognitive bias doesn’t make us immune to it, but it does help – especially when we are still only a few steps down the face of the pyramid.  As stated in the conclusion of “The Invisible Gorilla”:

One of our messages in this book is indeed negative: Be wary of your intuitions, especially intuitions about how your own mind works.  Our mental systems for rapid cognition excel at solving the problems they evolved to solve, but our cultures, societies, and technologies today are much more complex than those of our ancestors.  In many cases, intuition is poorly adapted to solving problems in the modern world.  Think twice before you decide to trust intuition over rational analysis, especially in important matters, and watch out for people who tell you intuition can be a panacea for decision-making ills…

But we also have an affirmative message to leave you with.  You can make better decisions, and maybe even get a better life, if you do your best to look for the invisible gorillas in the world around you…  There may be important things right in front of you that you aren’t noticing due to the illusion of attention.  Now that you know about this illusion, you’ll be less apt to assume you’re seeing everything there is to see.  You may think you remember some things much better than you really do, because of the illusion of memory.  Now that you understand this illusion, you’ll trust your own memories, and those of others, a bit less, and you’ll try to corroborate your memory in important situations.  You’ll recognise that the confidence people express often reflects their personalities rather than their knowledge, memory, or abilities…  You’ll be skeptical of claims that simple tricks can unleash the untapped potential in your mind, but you’ll be aware that you can develop phenomenal levels of expertise if you study and practice the right way.

Similarly, we should take more care to explain widely the benefits of the scientific approach, which searches for disconfirming evidence as much as it searches for confirming evidence.

That’s the pro-reason approach to encouraging better reasoning.  But reason, by itself, often isn’t enough.  If we are going to face up to the fact that we’ve made grave errors of judgement, which have caused pain, injustice, and sometimes even death and destruction, we frequently need powerful emotional support.  To enable us to admit to ourselves that we’ve made major mistakes, it greatly helps if we can find another image of ourselves – one that sees us as making better contributions in the future.  That’s the pro-hope approach to encouraging better reasoning.  The two books have examples of each approach.  Both books are well worth reading.  At the very least, you may gain some new insight into why discussions on Internet forums so often descend into people talking past each other, or why formerly friendly colleagues can get stuck in an unhelpful rut of deeply disliking each other.
