THINKING ERRORS
By Karl Arnold Belser
14 October 2013




The most serious thinking issue is the Dunning-Kruger Effect, in which a person doesn't know that he or she doesn't know. This lack of self-knowledge is not black and white, but rather a kind of biased thinking about, or estimation of, one's own abilities.

The heart of the matter is the two types of thinking processes that Daniel Kahneman describes in his book Thinking, Fast and Slow: System 1 for reflexive thinking and System 2 for conscious calculation. Kahneman points out that most people operate reflexively and actually avoid, and do not enjoy, conscious thinking. These System 1 errors are Cognitive Biases. These biases are subtle because one is generally not aware that a thinking error is occurring.

I think that the method for Reducing Cognitive Biases is to observe that most biases are reflexive and hence unconscious. If one can recognize situations in which a particular bias might occur, one might be able to switch to conscious thinking (that is, to use System 2) and deliberately consider what to do or say.

In short, I want to clean up my act so that I don't make obvious thinking errors.

My method in this article is to read and consider the lists of errors given in Wikipedia, plus one book, as follows:

List of Common Misconceptions


I have put the List of Common Misconceptions first to remind me to reason from real data or from first principles that I know to be true.

The Black Swan

I have put the book titled The Black Swan next because of the importance of the narrative fallacy. The mind almost always makes up a story to explain the data so that the data can be remembered. Often the story corrupts the data because the story is not true.

List of Fallacies

I put the List of Fallacies next because they are simply common thinking errors that should be habitually avoided.

List of Cognitive Biases

The list of cognitive biases is both long and daunting. Hence, I will discuss a few biases from the Decision-Making, Belief and Behavioral Biases sub-list that I think might affect my investing behavior and, in the big picture, my ability to survive Black Swan events. I am not going to discuss misconceptions or fallacies; these can be studied separately.


1. Status quo bias


This is the tendency to favor things the way they are. The argument "If it ain't broke, don't fix it" is the central idea. I know that as an older person I don't have the energy to make changes even though they might improve things a bit, and I am afraid of unexpected consequences. The imperative is: "Think Outside the Box".

2. Normalcy bias

This is the denial that unexpected things (low-probability events) might happen. The person makes no effort to avoid such events altogether or to become robust to them. This is what the turkey might think until the day it becomes Thanksgiving dinner, the Parable of the Turkey. The imperative is: "Hope for the best and prepare for the worst".

3. Bandwagon effect

This is the tendency to act or think like the group. I describe this kind of thinking error in Avalanche-Like Behavior.

4. Attention bias

This is the tendency for a person to focus on what he or she thinks about the most. The old adage that "If all you have is a hammer, everything looks like a nail" sums up this effect. It is also possible that the media might focus attention inappropriately, as I describe in Attention Economics. The imperative is: "Look for data and avoid the hype".

5. Framing bias

Decision-making tends to be influenced by whether the outcome is stated as a gain or as a loss. Daniel Kahneman received the Nobel Prize in Economics for quantifying this effect in his work called Prospect Theory. Prospect Theory called the Efficient-Market Hypothesis into question.
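As a rough sketch of the asymmetry, the value function that Tversky and Kahneman fitted to their experimental data weighs losses roughly 2.25 times as heavily as equivalent gains. The short Python illustration below uses their published parameter estimates (0.88 and 2.25); it is only a sketch of the idea, not the full machinery of Prospect Theory, which also includes probability weighting.

    # Sketch of the Prospect Theory value function, using Tversky & Kahneman's
    # published parameter estimates (alpha = 0.88, lambda = 2.25).
    def prospect_value(x, alpha=0.88, lam=2.25):
        if x >= 0:
            return x ** alpha            # gains are valued concavely
        return -lam * (-x) ** alpha      # losses loom larger than gains

    print(prospect_value(100))    # about  57.5 for a $100 gain
    print(prospect_value(-100))   # about -129.4 for a $100 loss

Because the same $100 feels worse as a loss than it feels good as a gain, the way a choice is framed changes the decision.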

6. Anchoring bias

This is the tendency for people to use the first piece of information they receive as a reference point and then to adjust future decisions relative to this anchor.

7. Belief bias

This is the tendency for an individual to judge a proposition based on the proposition's plausibility, not on observed data. The imperative is to ask: "What data supports the proposition, and is the proposition falsifiable?"


8. Confirmation bias

This is the tendency to look for interpretations and to remember information that confirms what one believes. The warning is: "Correlation does not prove causation".

9. Clustering illusion

This is the tendency to see patterns in small amounts or short runs of data. The imperative is: "Make sure the sample size is large enough to get a good statistical average".
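A quick simulation makes the point. In the sketch below (a fair coin, my own illustration), a run of only 20 flips routinely contains streaks and a lopsided average, while 10,000 flips settle down near 0.5.

    # Clustering illusion: short runs of random data look patterned.
    import random

    random.seed(1)                          # fixed seed so the run is repeatable
    flips = [random.random() < 0.5 for _ in range(10000)]

    def longest_streak(bits):
        best, run, prev = 0, 0, None
        for b in bits:
            run = run + 1 if b == prev else 1
            prev = b
            best = max(best, run)
        return best

    print(longest_streak(flips[:20]))       # streaks of 4 or more are common in 20 flips
    print(sum(flips[:20]) / 20)             # the short-run average can sit far from 0.5
    print(sum(flips) / len(flips))          # the 10,000-flip average is close to 0.5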

10. Hyperbolic discount bias

This is the tendency for a person to prefer a reward now versus a bigger reward later. This tendency may be why some investors do not hold their securities for long time periods; the result is a smaller profit. Warren Buffett avoids this problem, saying "My favorite holding period is forever".
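The usual hyperbolic model makes the trade-off concrete: a reward A delayed by D periods feels like V = A / (1 + kD). The sketch below uses an arbitrary illustrative discount rate of k = 0.1 per month (my assumption, not a measured value) to show why $100 today can beat $150 a year from now in the mind of an impatient investor.

    # Hyperbolic discounting: V = A / (1 + k * D).
    # k = 0.1 per month is an arbitrary illustrative value, not a measured one.
    def hyperbolic_value(amount, delay_months, k=0.1):
        return amount / (1 + k * delay_months)

    print(hyperbolic_value(100, 0))     # 100.0 -> $100 right now feels like $100
    print(hyperbolic_value(150, 12))    # ~68.2 -> $150 in a year feels like about $68

With a steep enough discount, the smaller immediate reward always wins, which is exactly the behavior Buffett's "forever" holding period is meant to short-circuit.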

11. Ludic fallacy

This is the fallacy of believing that the future can be predicted with mathematical models, as if real life were a game with known rules and odds. A game is not a good metaphor for real-life situations; there will always be significant unexpected events that dramatically change the future.

12. Impact bias

This is the tendency to over-estimate future feelings or emotional states as the consequence of a current action. For example, would more money or a new car now make me happier tomorrow? Probably not.

13. Restraint bias

This is the tendency for people to over-estimate their ability to have self-control. This effect is especially bad when a person believes that they have excellent self-control, which is a good example of the Dunning-Kruger Effect. The imperative is: Keep reminding yourself "I don't really have as much self-control as I think I do".

   
Last updated November 30, 2013