You're not as smart as you think you are (and how you should use that to your advantage)

How often do you consider the accuracy of your statements (beyond a blanket true/false)?  It's not a natural concept for most of us.  We don't say, "I'm 73.9% sure we are out of milk at home."  When people are asked to assign these quantitative credence values to their statements, they turn out to be not very well calibrated.  That is, the credence values they give do not align with the average accuracy of their answers.  Note that for a binary question, the lowest meaningful credence is 50%: a 50% chance you are right, and a 50% chance you are wrong.  If you gave a 20% credence to your answer, you would effectively be saying there is an 80% chance the opposite answer is correct, which is itself a kind of accuracy.  This may seem a little counterintuitive, but imagine that every time I walked out of the grocery store, I instinctively walked away from where my car was actually parked.  To find the car, you would simply head in the opposite direction from me.
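To make that flipping idea concrete, here's a tiny Python sketch (my own illustration, not anything from the research; the function name is made up): any answer given with less than 50% credence can be restated as the opposite answer with more than 50% credence.

```python
def normalize_credence(answer: bool, credence: float) -> tuple[bool, float]:
    """Map any (answer, credence) pair to an equivalent pair with credence >= 0.5.

    A 20% credence that "we are out of milk" is the same claim as an
    80% credence that we are not, just as you can find the car by heading
    the opposite direction from the person who reliably walks away from it.
    """
    if credence < 0.5:
        return (not answer, 1.0 - credence)
    return (answer, credence)

print(normalize_credence(True, 0.2))  # (False, 0.8)
```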

Researchers have found that people are underconfident at lower credence levels (around 50%) and overconfident at higher credence levels (close to 90-100%); the latter is dubbed overconfidence bias.  If someone tells you they are 100% certain but they're actually right only 90% of the time, that can be a significant problem, particularly in life-or-death situations like major medical decisions.  This is easiest to see in the chart below from Harvey (1997).

[Figure: Confidence in Judgment, from Harvey (1997)]

This chart shows that people are underconfident on easy tasks and overconfident on difficult tasks.  The straight line represents a perfectly calibrated person – their accuracy and credence are perfectly aligned.
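If you've ever logged your own predictions, you can draw this chart for yourself.  Here's a minimal Python sketch (the records and the function name are made up for illustration): bin your stated credences, then compare each bin's average credence to how often you were actually right.

```python
from collections import defaultdict

def calibration_curve(records, n_bins=10):
    """Group (credence, was_correct) records and compare credence to accuracy."""
    bins = defaultdict(list)
    for credence, correct in records:
        idx = min(int(credence * n_bins), n_bins - 1)  # e.g. 0.73 lands in [0.7, 0.8)
        bins[idx].append((credence, correct))
    curve = []
    for idx in sorted(bins):
        group = bins[idx]
        mean_credence = sum(c for c, _ in group) / len(group)
        accuracy = sum(1 for _, ok in group if ok) / len(group)
        curve.append((mean_credence, accuracy))
    return curve

# A perfectly calibrated judge lands on the diagonal: accuracy equals credence.
records = [(0.9, True), (0.9, False), (0.9, True), (0.6, True), (0.6, False)]
for cred, acc in calibration_curve(records):
    print(f"credences near {cred:.2f} were right {acc:.0%} of the time")
```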

Overconfidence is much more problematic than underconfidence, for many reasons.  People tend to think they are better than others, an effect well captured by Svenson (1981), who found that 93% of American drivers rated themselves in the top 50% for driving skill.  You don't need a degree in statistics to know that's mathematically impossible!

Politicians love to tout their certainty that a policy or bill is guaranteed to succeed or fail.  But in reality, how can they be 100% sure?  The complex economic, political, and social structure of the world is constantly changing, and there are too many variables to be certain of anything.  At the same time, would people vote for a politician who campaigned with the slogan "I'm 87% sure I can fix the economy!"?  Likely not… but can't we find a middle ground between blind overconfidence and numerical credence estimates?

There are several implications of this in engineering.  Often in the design phase of a task, engineers are asked to rate the reliability of a system, or their confidence that a certain goal will be achieved.  These questions fall into the difficult (one might say System 2) realm in the figure above, so we must confront the sobering fact that most of us are overconfident in our judgments, even before adding the pressure to impress a client or manager.  This is not the kind of thinking that we want going into designing our airplanes and cars!  Most people also underestimate how long it will take to finish a long-term task, an effect dubbed the planning fallacy.  Ever wonder why government projects often take longer and cost more than originally quoted?  A 2005 study by Flyvbjerg et al. showed that the average cost overrun of worldwide rail projects between 1969 and 1998 was 45%.  Political bickering and greed aside, humans are genuinely limited in our ability to predict the cost and timeline of long-term projects.
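One well-known remedy (not covered above, but the one Flyvbjerg himself champions) is to take the "outside view": rather than trusting your inside estimate, scale it by the overruns of similar past projects.  A back-of-the-envelope sketch, where the function name and dollar amount are illustrative and the 45% figure is the only number from the post:

```python
def outside_view_estimate(inside_estimate: float, historical_overrun: float = 0.45) -> float:
    """Scale an initial project estimate by the average historical cost overrun."""
    return inside_estimate * (1.0 + historical_overrun)

# A rail project quoted at $1.0B would, on the 1969-1998 average, land near $1.45B.
print(f"${outside_view_estimate(1.0):.2f}B")
```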

A startling manifestation of overconfidence bias is illustrated in the Dunning-Kruger effect, which I cannot explain more eloquently than Wikipedia:

The Dunning–Kruger effect is a cognitive bias in which unskilled individuals suffer from illusory superiority, mistakenly rating their ability much higher than average. This bias is attributed to a metacognitive inability of the unskilled to recognize their mistakes.[1]

Actual competence may weaken self-confidence, as competent individuals may falsely assume that others have an equivalent understanding. David Dunning and Justin Kruger of Cornell University conclude, "the miscalibration of the incompetent stems from an error about the self, whereas the miscalibration of the highly competent stems from an error about others."[2]

So the next time you get into a heated political conversation with one of your… erm… "unskilled" relatives, take solace in the fact that they cognitively can't help believing that the Tea Party is the solution to the world's problems, ignoring all logic to the contrary.  Though I don't recommend citing the Dunning-Kruger effect: it won't defuse the situation.

Fortunately, there are tools for becoming better calibrated in your judgments.  Andrew Critch, one of the graduate students (now a PhD) involved with Sense and Sensibility and Science, developed a great game that lets you see how well calibrated your credence levels are by answering a series of trivia questions.  It's important to understand that the goal of the game is not to get every question right; it is to discern when you are right and when you are wrong. http://acritch.com/credence-game/ (And if anyone can guess the most calibrated profession, I'll give you a cookie!)
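I won't claim to reproduce the game's exact scoring rule here, but a common way to score credence reports is a logarithmic rule: you gain points for being right, and you lose disproportionately many for being wrong while confident, so your best strategy is to report what you honestly believe.  A sketch of my own, with points measured in bits:

```python
import math

def log_score(credence: float, correct: bool) -> float:
    """Return log-score points; maximized in expectation by reporting honestly.

    Overclaiming is punished hard: a miss at 90% credence costs far more
    than the small bonus gained by saying 90% instead of 70% on a hit.
    """
    p = credence if correct else 1.0 - credence
    return math.log2(p) + 1.0  # +1 so a 50% "I don't know" scores zero

print(round(log_score(0.9, True), 3))   # 0.848 points for a confident hit
print(round(log_score(0.9, False), 3))  # -2.322 points for a confident miss
```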

I’ll leave you with one positive effect of overconfidence that we dubbed scientific optimism in Sense and Sensibility and Science.  In the book Feynman’s Rainbow: A Search for Beauty in Physics and in Life by Leonard Mlodinow, Nobel Prize-winning physicist Richard Feynman is quoted saying that he motivated himself to solve problems by fooling himself into thinking that he had an edge over his competitors – some special ability that makes the problem solvable.

I know in my heart that it is likely that the reason is false, and likely the particular attitude I’m taking with it was thought of by others.  I don’t care; I fool myself into thinking I have an extra chance.  That I have something to contribute.  Otherwise I may as well wait for him to do it, whoever it is.

This, of course, isn't an original idea to anyone who grew up reading The Little Engine That Could, with its famous refrain, "I-Think-I-Can-I-Think-I-Can…"

In the end, whether you are building a bridge, developing the Theory of Everything, or just answering a Trivial Pursuit question, be sure to ask yourself if you are falling for overconfidence bias, or rather if you should fool yourself into thinking you have an advantage over your compatriots.  It might just give you the push you need to solve a problem!

Sources:

Flyvbjerg, B., Skamris Holm, M. K., & Buhl, S. L. (2005). How (in)accurate are demand forecasts in public works projects? The case of transportation. Journal of the American Planning Association, 71(2), 131-146.

Harvey, N. (1997). Confidence in judgment. Trends in Cognitive Sciences, 1(2), 78-82.

Svenson, O. (1981). Are we less risky and more skillful than our fellow drivers? Acta Psychologica, 47, 143-151.

http://en.wikipedia.org/wiki/Dunning–Kruger_effect