PROBABILISTIC THINKING
Karl Arnold Belser
17 December 2015

One of my sons sent me a link to the Long Now talk by Philip Tetlock titled Superforecasting: The Art and Science of Prediction. Since this talk will probably end up behind a paywall, here is a link to the transcript of a podcast called Superforecasting from the blog Rationally Speaking. Tetlock's work has been implemented in The Good Judgment Project, and his book Superforecasting is considered by many to be comparable in importance to Kahneman's book Thinking, Fast and Slow. Because I discuss topics about the uncertain future in this blog, and because I have written so many related essays, I gave the following comments to my son.

===============

Statistics is not taught in many high schools even though it is probably more important and practical than algebra. For example, Bayes' theorem is, in my opinion, more important to human success than the Pythagorean theorem. We make statistical decisions all the time, yet most people have no clue about base-rate (a priori) probabilities or how to revise those probabilities with new information. Oh well. See my posts on the subject if you're interested:

FALSE POSITIVE PROBABILITIES
TAIL RISK

However, statistics is not a complete mental model for dealing with uncertainty because of Black Swans, which are events that cannot be predicted or quantified with statistics. See my post on this topic:

ANTI-FRAGILITY AND BEHAVIOR

One of the reasons I write my Uncertain Future blog is to find out what I think, how I think, and how to avoid thinking and behavioral errors. See my post on these errors:

THINKING ERRORS

=============

My main observation is that subjective statistics assume future events can be modeled by a stable probability density function, even though that function is not obtained by accumulating data from many repeated trials. These subjective probabilities are usually based on diverse data and on the experience of the people doing the estimating. Tetlock is simply refining the estimation of these probabilities.
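The base-rate reasoning discussed above can be sketched with Bayes' theorem. The sketch below is only an illustration; the 1% base rate, 90% sensitivity, and 5% false positive rate are made-up numbers, not figures from any of the posts mentioned here.

```python
def bayes_update(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Return P(H | evidence) using Bayes' theorem.

    prior: base rate P(H) before seeing the evidence
    p_evidence_given_h: P(evidence | H), e.g. test sensitivity
    p_evidence_given_not_h: P(evidence | not H), e.g. false positive rate
    """
    numerator = prior * p_evidence_given_h
    denominator = numerator + (1 - prior) * p_evidence_given_not_h
    return numerator / denominator

# Hypothetical example: a condition with a 1% base rate and a test
# with 90% sensitivity and a 5% false positive rate.
posterior = bayes_update(prior=0.01,
                         p_evidence_given_h=0.90,
                         p_evidence_given_not_h=0.05)
print(round(posterior, 3))  # → 0.154
```

Even with a fairly accurate test, the posterior probability is only about 15% because the base rate is so low. This is exactly the kind of base-rate neglect the false-positive discussion above is about.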
His main point is that input from people with diverse data and diverse experience gives better probability estimates.

In the discussion after the Long Now talk, Tetlock says that Nassim Taleb, the author of Antifragile, was skeptical of superforecasting. Why? See my post Anti-Fragile for a discussion of what antifragility is. Simply stated, some events are what Taleb calls Black Swans: totally unexpected and unpredictable events, such as the caldera under Yellowstone National Park erupting, destroying billions of acres of land and darkening the sky around the world for many years. This is an extreme example, but lesser unexpected events (both good and bad) happen regularly. Taleb might argue that no person, however experienced and however much data they have, can model the occurrence of these Black Swan events with probabilities.

Michael Mauboussin discusses what he calls the Paradox of Skill in his book The Success Equation. See my post Luck and the Paradox of Skill. People with skill know how to apply probability theory, whereas the really successful people also have a lot of good luck. This luck, I think, is outside the realm of statistics. I think of it as belonging to a new discipline, antifragility, in which one tries to minimize the consequences of any type of extreme event, as in my post on Tail Risk above.

Tetlock's Good Judgment Project cannot account for Black Swan events, and it will therefore have limited success.

I certainly do consider probabilities when I make my decisions. However, I also try to assess fragile situations so that I can't be wiped out by unexpected events, and so that I might be able to take advantage of unexpected opportunities. My aphorism is: Luck is when opportunity meets a prepared mind. I want to be as lucky as possible.