How We Know What Isn’t So by Thomas Gilovich

This book deals with the ways we form and maintain opinions that are flimsy and sometimes plain wrong – even in the face of overwhelming evidence. Some of the “facts” we believe in are deleterious to our health and social standing, yet we continue to rely on them. A few examples:

  • The belief that Vitamin C supplements decrease the severity and duration of colds.
  • Belief in holistic, new age, natural, and alternative medicine – and even homeopathy.
  • Belief in ESP.
  • The belief that we are better than average. Even professors fall into this trap, with 95% of them thinking they are better than their average colleague.

The last one is especially intriguing. One explanation lies in the criteria used to decide who is better. Most of us think that the criteria on which we score highly are the better predictors of success: a careful person gives a higher rating to carefulness, a polite person to politeness, and a saver to saving. When a question is ambiguous and many criteria might apply, we choose the ones on which we do better and end up giving ourselves a higher-than-average score. When a specific criterion is mentioned, on the other hand, this tendency is reduced.

Another very important source of bias is what Nassim Taleb calls “survivorship bias”. While watching “Game of Thrones” I was struck by a very apt analogy: the Stark family of the north keeps claiming that “Winter is coming”, and eventually they will be right. If we make a prediction and it fails, we find ways to forget it; if it succeeds, we think of ourselves as geniuses. Negative evidence is discounted while positive evidence is given a lot of weight. Similarly, the saying that bad news always comes in threes is particularly hard to disabuse oneself of, because there is no clear time frame in which it can be disproved. Whether the third bad event arrives in two weeks, two months, or two years, it is all taken as confirmation. And if it never arrives, we simply drop the instance from memory because it obviously does not fit the model.

All in all, a very nice book, although at times quite dense. It is entirely a matter of style, but the way the book is structured lends itself to repetition. I do not know the best way to write it, but that does not stop me from telling you what is wrong. Every chapter deals with a particular topic – for example, the belief in ESP – and then describes the biases that might play a role, e.g., survivorship bias, confirmation bias, and so on. Because these biases are quite common and are the source of all the “mistakes”, they tend to be repeated across chapters. A much better book in this respect is “Thinking, Fast and Slow” by Daniel Kahneman.

Bottom line: I recommend this book, but there are better books on the topic. For example, Bad Science by Ben Goldacre deals with health-related issues in a good amount of detail and is actually a very fun read.
