“Their judgment was based more on wishful thinking than on a sound calculation of probabilities; for the usual thing among men is that when they want something they will, without any reflection, leave that to hope, while they will employ the full force of reason in rejecting what they find unpalatable.”
— Thucydides, in History of the Peloponnesian War
From Stalking the Black Swan: Research and Decision Making in a World of Extreme Volatility
When new information conflicts with our preexisting hypotheses, we have a problem that needs to be resolved. Cognitive dissonance refers to the state of tension that occurs when a person holds two ideas, beliefs, attitudes, or opinions that are psychologically inconsistent. This conflict manifests itself as mental tension, the intensity of which is visible in magnetic resonance imaging studies of the brain. The theory was developed in 1957 by Leon Festinger, who observed in a series of experiments that people would change their attitudes to make them more consistent with actions they had just taken. In popular usage, cognitive dissonance refers to the tendency to ignore information that conflicts with preexisting views, to rationalize certain behaviors to make them seem more consistent with self-image, or to change attitudes to make them consistent with actions already taken. In some cases, it is the equivalent of telling ourselves "little white lies," but in other cases it no doubt contributes to logical errors like the "confirmation trap," where people deliberately search for data to confirm existing views rather than challenge them.
Two major sources of cognitive dissonance are self-image (when the image we hold of ourselves is threatened) and commitment (when we’ve said something, we don’t want to be criticized for changing our minds).
“Cognitive dissonance,” writes Ken Posner, “may manifest itself in a phenomenon known as change blindness. According to behavioral researchers:”
change blindness is a situation where people fail to notice change because it takes place slowly and incrementally. It is also called the “boiling frog syndrome,” referring to the folk wisdom that if you throw a frog in boiling water it will jump out, but if you put it into cold water that is gradually heated, the frog will never notice the change. Most of the studies in this area focus on difficulties in perceiving change visually, but researchers think there is a parallel to decision making.
“Change blindness,” Posner continues, “happens when we filter out the implications of new information rather than assigning them even partial weight in our thinking.”