Fallacies: A Sample of Ways Our Thinking Is Not Helpful

At a certain point, and to a certain degree, anyone reading a book on cognition and social psychology is promised a little disenchantment with humanity. If you read between the lines, you will notice a point where the author takes a break from describing the many ailments our thinking suffers from, and the crutches it employs as a result, and seems to say something to the effect of, “Don’t act so shocked. This explains a lot, doesn’t it? Yes, you know it does.” Well, before resigning ourselves to reluctantly accepting our substandard habits of thinking, perhaps we should celebrate them instead. Below are my favourite mistakes in logic that we make as humans.

On a more serious note, two disclaimers are in order. First, “fallacies” can instead be called “heuristics.” Our mind takes shortcuts when making choices and judgments to improve efficiency by reducing information-processing time. These fallacies (heuristics) take us to the decision or solution most favoured by evolution. Therefore, even if they make us irrational in a modern context, they got us at least this far as a species. Second, fallacies are actually great tools, because once we know about them, we can carefully work to fix the errors they sometimes cause. Stopping an old way of thinking is easier than instating a new way of thinking, which is why recognizing fallacies is a great first step to self-improvement.

The main reference for this list is the stellar book The Art of Thinking Clearly by Rolf Dobelli. It was also the subject of my previous post, which compared Dobelli’s and Kerouac’s books.

(1)  Sunk cost fallacy. This is a concept familiar to economists and psychologists. Because we get attached to all kinds of things as humans (for instance books), we are often reluctant to let them go, using as justification the resources that went into the thing (the book is awful but you’re already halfway through). We need to stop this. Today, there are so many opportunities for worthwhile things in which to invest our time. Let’s learn to let go of things regardless of past efforts.

(2)  Anchors. This fallacy is the most mind-boggling of the bunch. Did you know that if you were to draw a random number from 1 to 100 and then be asked a question that requires you to come up with a percentage figure (for example, what percentage of the population of the US lives in rural areas?), you would guess close to the number you drew? Directionally, of course: if you drew a 10, your guess would be significantly lower than if you drew a 70. This is something that affects us professionally as well. If we are given a figure and asked to prove or disprove it, we will begin by working off of that figure. Something to be aware of, but apparently something that can’t be “fixed.” Studies have shown that manipulating the listing price of a home affects a real estate agent’s best evaluation of its true market value, even when the agent is explicitly told to ignore the listing price.

(3)  Base rate neglect. Novels are more appealing than statistics, for good reason. We need a scenario and a coherent story to buy into an idea. Because of this, we tend to ignore statistics (which represent the big picture) and believe narratives (which represent a tiny portion of the big picture and, rationally, should be inconclusive). For instance, a person begins to feel unwell on a street corner and asks for help. You are told that 70% of the passersby did nothing and 30% stopped to help (I made that up). However, if you were then shown pictures and profiles of random individuals who passed by that street corner that day and were asked, “Do you think this person stopped to help?”, you would likely answer yes for most of the profiles. They all seem like nice enough people, so surely they would stop to help. This is wrong: if you were shown 10, you should pick only 3 as having stopped to help. In making judgments like these we ignore base rates, which should influence our decisions much more heavily unless there is definitive evidence to stray from them. Therefore, to make a good decision: assume averages, and move from there based only on salient evidence. A quick calculation shows just how little a weak impression should move the needle.
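Here is a minimal Bayes’ rule sketch of the street-corner example, in Python. The 30% base rate is the made-up figure from above; the “looks nice in the photo” likelihoods are purely illustrative assumptions of my own.

# Bayes' rule sketch for the street-corner example in (3).
# The 30% base rate comes from the post (and was made up there);
# the "looks nice" likelihoods below are illustrative assumptions.

def posterior_helped(base_rate, p_nice_given_helped, p_nice_given_not):
    """P(stopped to help | looks nice in the photo), via Bayes' rule."""
    numerator = p_nice_given_helped * base_rate
    evidence = numerator + p_nice_given_not * (1 - base_rate)
    return numerator / evidence

# Almost everyone looks nice in a profile photo, so the cue is weak:
# say 90% of helpers and 80% of non-helpers look nice.
print(round(posterior_helped(0.30, 0.90, 0.80), 3))  # 0.325, barely above the 0.3 base rate

In other words, out of ten profiles you should still expect only three or so helpers, unless the evidence about a particular person is genuinely strong.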

(4)  Action bias. In his book, Dobelli aptly calls this chapter “Why Watching and Waiting is Torture.” If there is an outcome we want to achieve, the person who seems busiest will be judged as most likely to attain it. For instance, in a university library during exam season, the student with the most notebooks and highlighters, hunched over the textbook with a furrowed brow, will come across as more likely to get a high mark than a relaxed student with a few pages of notes. This is simply not true. We judge the action before the outcome. Another example comes from the book: a soccer player taking a penalty has three choices, to shoot right, centre, or left. The goalkeeper who does not lunge to either side has as much of a chance to block the ball as one who does. However, he will be judged as a better goalie if he lunges, regardless of the outcome. He took action. This is something to be aware of professionally and academically as well. Are you focusing merely on the appearance of action, or on the final result?

The opposite can be true as well. When an active and an inactive course of action lead to the same terrible outcome, we judge the active one as the more immoral. We tend to go far too easy on the person who chose inaction, even if their inaction brought about that problematic outcome. For instance: Breaking Bad fans who have finished Season 2 will know that in Episode 11 Walter White chooses inaction in a specific situation, which leads to a certain outcome. I am going to keep this absolutely spoiler free. But think about it: was his choice any less immoral simply because it was passive? How much worse would it really have been if he had taken an active course of action leading to the same result? A word of warning: if you choose to discuss this with friends, be prepared for a heated debate.

Didn’t make the cut, but also interesting to consider:

(5)  Liking bias. We like people who like us back. Take note. Again, evolution is largely responsible.

(6)  Fear of regret. In most cases, we overestimate how damaging a lost opportunity would be. Cue “last chance to save” sales.

(7)  Exponential growth. Studies have shown again and again that human beings do not have an intuitive grasp of statistical concepts, especially exponential growth. To add to this, we feel much more confident in our ability to understand growth than we should. (Proof: how many times do you need to fold a sheet of paper in half so that its thickness reaches the moon? 42.) Any time you are presented with growth rates as a piece of information in a decision-making situation, the first thing to do is get out a pen, paper, and a calculator (or a few lines of code, as in the sketch below).
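For the paper-folding claim, a few lines of Python do the pen-and-paper work. I am assuming a sheet roughly 0.1 mm thick and an average Earth-Moon distance of about 384,400 km; with those figures, the answer does come out to 42.

# Sanity check of the paper-folding claim in (7): each fold doubles
# the thickness, so n folds give thickness * 2**n.
# Assumed figures: a 0.1 mm sheet and ~384,400 km to the Moon.
import math

sheet_thickness_m = 0.0001          # 0.1 mm
moon_distance_m = 384_400_000.0     # ~384,400 km

folds = math.ceil(math.log2(moon_distance_m / sheet_thickness_m))
print(folds)  # 42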
