We have an instinctual desire to remain consistent with our prior actions and beliefs. This can lead us to behave in irrational ways. Here’s how the commitment and consistency fallacy can lead us to do things that aren’t always in our best interest.
“The difficulty lies not in the new ideas,
but in escaping the old ones, which ramify,
for those brought up as most of us have been,
into every corner of our minds.”
— John Maynard Keynes
Ben Franklin tells an interesting little story in his autobiography. Facing opposition to his reelection as Clerk of the General Assembly, he sought to gain favor with the member who most vocally opposed him:
Having heard that he had in his library a certain very scarce and curious book, I wrote a note to him, expressing my desire of perusing that book, and requesting he would do me the favour of lending it to me for a few days. He sent it immediately, and I return’d it in about a week with another note, expressing strongly my sense of the favour.
When we next met in the House, he spoke to me (which he had never done before), and with great civility; and he ever after manifested a readiness to serve me on all occasions, so that we became great friends, and our friendship continued to his death.
This is another instance of the truth of an old maxim I had learned, which says, “He that has once done you a kindness will be more ready to do you another, than he whom you yourself have obliged.”
The man, having lent Franklin a rare and valuable book, sought to stay consistent with his past actions. He wouldn’t, of course, lend a book to an unworthy man, would he?
Positive Self-Image
Scottish philosopher and economist Adam Smith said in The Theory of Moral Sentiments:
The opinion which we entertain of our own character depends entirely on our judgments concerning our past conduct. It is so disagreeable to think ill of ourselves, that we often purposely turn away our view from those circumstances which might render that judgment unfavorable.
Even when it acts against our best interest, our tendency is to be consistent with our prior commitments, ideas, thoughts, words, and actions. As a byproduct of confirmation bias, we rarely seek disconfirming evidence for what we believe. This, after all, makes it easier to maintain our positive self-image.
Part of the reason this happens is our desire to appear, and feel, right. We also want to show people our conviction. This shouldn't come as a surprise: society values consistency and conviction even when they are wrong.
We associate consistency with intellectual and personal strength, rationality, honesty, and stability. On the other hand, the person who is perceived as inconsistent is also seen as confused, two-faced, even mentally ill in certain extreme circumstances.
A politician who wavers, for example, gets labelled a flip-flopper and can lose an election over it (John Kerry). A CEO who bets everything on a conviction no one else shares, and wins, is hailed as a hero (Elon Musk).
But it’s not just our words and actions that nudge our subconscious, but also how other people see us. There is a profound truth behind Eminem’s lyrics: I am, whatever you say I am. If I wasn’t, then why would I say I am?
If you think I’m talented, I become more talented in your eyes — in part because you labelling me as talented filters the way you see me. You start seeing more of my genius and less of my normal-ness, simply by way of staying consistent with your own words.
In his book Outliers, Malcolm Gladwell talks about how teachers simply identifying students as smart not only affected how the teachers saw their work but, more importantly, affected the opportunities teachers gave the students. Smarter students received better opportunities, which, we can reason, offered them better experiences. This in turn made them better. It's almost a self-fulfilling prophecy.
And the more we invest in our beliefs about ourselves or others (think money, effort, or pain), the more sunk costs we have and the harder it becomes to change our minds. It doesn't matter if we're right. It doesn't matter if the Ikea bookshelf sucks; we're going to love it.
In Too Much Invested to Quit, psychologist Allan Teger says something similar of the Vietnam War:
The longer the war continued, the more difficult it was to justify the additional investments in terms of the value of possible victory. On the other hand, the longer the war continued, the more difficult it became to write off the tremendous losses without having anything to show for them.
As a consequence, there are few rules we abide by more strictly than "Don't make any promises that you can't keep." This, generally speaking, is a great rule that holds society together by ensuring that our commitments are, for the most part, real and reliable.
Aside from the benefits of preserving our public image, being consistent is simply easier and leads to a more predictable and consistent life. By being consistent in our habits and with previous decisions, we significantly reduce the need to think and can go on “auto-pilot” for most of our lives.
However beneficial these biases are, they too deserve deeper understanding and caution. Sometimes our drive to appear consistent can lure us into choices we otherwise would consider against our best interests. This is the essence of a harmful bias as opposed to a benign one: We are hurting ourselves and others by committing it.
A Slippery Slope
Part of why commitment can be so dangerous is that it works like a slippery slope: a single slip can send you sliding all the way down. Complying with even a tiny request that initially appears insignificant therefore has a good probability of leading to full commitment later.
People whose job it is to persuade us know this.
Among the more blunt techniques on the spectrum are those reported by a used-car sales manager in Robert Cialdini’s book Influence. The dealer knows the power of commitment and that if we comply a little now, we are likely to comply fully later on. His advice to other sellers goes as follows:
“Put ’em on paper. Get the customer’s OK on paper. Get the money up front. Control ’em. Control the deal. Ask ’em if they would buy the car right now if the price is right. Pin ’em down.”
This technique will be obvious to most of us. However, there are also more subtle ways to make us comply without us noticing.
A great example of a subtle compliance practitioner is Jo-Ellen Demitrius, the woman currently reputed to be the best consultant in the business of jury selection.
When screening potential jurors before a trial, she asks an artful question:
“If you were the only person who believed in my client’s innocence, could you withstand the pressure of the rest of the jury to change your mind?”
It’s unlikely that any self-respecting prospective juror would answer negatively. And, now that the juror has made the implicit promise, it is unlikely that once selected he will give in to the pressure exerted by the rest of the jury.
Innocent questions and requests like this can be a great springboard for initiating a cycle of compliance.
The Lenient Policy
A great case study in compliance is the set of tactics Chinese captors employed on American prisoners during the Korean War. The Chinese were particularly effective at getting Americans to inform on one another. In fact, nearly all American prisoners in the Chinese camps are said to have collaborated with the enemy in one way or another.
This was striking, since such behavior had rarely been observed among American prisoners of war during WWII. What tricks, then, accounted for the Chinese success?
Unlike the North Koreans, the Chinese did not treat the captives harshly. Instead, they engaged in what they called a "lenient policy" toward them, which was, in reality, a clever series of psychological assaults.
In these efforts the Chinese relied heavily on commitment and consistency tactics to secure the compliance they desired. At first, the Americans were not very cooperative, as they had been trained to provide only name, rank, and serial number, but the Chinese were patient.
They started with seemingly small but frequent requests to repeat statements like "The United States is not perfect" and "In a Communist country, unemployment is not a problem." Once these requests had been complied with, the requests grew weightier. Someone who had just agreed that the United States was not perfect would be encouraged to expand on his thoughts about specific imperfections. Later he might be asked to write up and read out a list of these imperfections in a discussion group with other prisoners. "After all, it's what you really believe, isn't it?" The Chinese would then broadcast the essay readings not only to the whole camp, but to other camps and even to the American forces in South Korea. Suddenly the soldier would find himself a "collaborator" with the enemy.
The awareness that the essays did not contradict his beliefs could even change his self-image to be consistent with the new “collaborator” label, often resulting in more cooperation with the enemy.
It is not surprising that very few American soldiers were able to avoid such “collaboration” altogether.
Foot in the Door
The small request that grows into bigger requests, as applied by the Chinese to American soldiers, is also called the foot-in-the-door technique. It was first demonstrated by two psychologists, Freedman and Fraser, in an experiment in which a fake volunteer worker asked homeowners to allow a public-service billboard to be installed on their front lawns.
To get a better idea of how it would look, the homeowners were even shown a photograph depicting an attractive house that was almost completely obscured by an ugly sign reading DRIVE CAREFULLY. While the request was quite understandably denied by 83 percent of residents, one particular group reacted favorably.
Two weeks earlier, a different "volunteer worker" had approached the residents of this group with a similar but much smaller request: to display a little sign that read BE A SAFE DRIVER. The request was so negligible that nearly all of them complied. Yet the downstream effect of that tiny request turned out to be enormous: 76 percent of this group complied with the bigger, far less reasonable request (the big ugly sign).
At first, even the researchers themselves were baffled by the results and repeated the experiment with similar setups. The effect persisted. Finally, they proposed that the subjects must have distorted their own views about themselves as a result of their initial actions:
What may occur is a change in the person’s feelings about getting involved or taking action. Once he has agreed to a request, his attitude may change, he may become, in his own eyes, the kind of person who does this sort of thing, who agrees to requests made by strangers, who takes action on things he believes in, who cooperates with good causes.
The rule goes that once someone has nudged our self-image to where they want it to be, we will naturally comply with requests that fit the new self-view. We must therefore be very careful about agreeing to even the smallest requests. Not only can they make us comply with larger requests later on; they can also make us more willing to do favors that are only remotely connected to the earlier ones.
Even Cialdini, someone who knows this bias inside-out, admits to his fear that his behavior will be affected by consistency bias:
It scares me enough that I am rarely willing to sign a petition anymore, even for a position I support. Such an action has the potential to influence not only my future behavior but also my self-image in ways I may not want.
Further, once a person’s self-image is altered, all sorts of subtle advantages become available to someone who wants to exploit that new image.
Give It, Take It Away Later
Have you ever been offered a deal that seemed a little too good to be true, only to be disappointed later? You had already made up your mind, gotten excited, and were ready to pay or sign, until a calculation error was discovered. Now, with the adjusted price, the offer did not look all that great.
It is likely that the error was no accident: this technique, also called low-balling, is often used by compliance professionals in sales. Cialdini, having observed the phenomenon among car dealers, tested its effects on his own students.
In an experiment with colleagues, he asked two groups of students to show up at 7:00 AM for a study on "thinking processes." One group was told the 7:00 AM start time immediately upon being called. Unsurprisingly, only 24 percent wanted to participate.
For the other group of students, however, the researchers threw a low-ball. They first asked only whether the students wanted to take part in a study of thinking processes; 56 percent replied positively. Only then, to those who had agreed, was the 7:00 AM meeting time revealed.
These students were given the opportunity to opt out, but none of them did. In fact, driven by their commitment, 95 percent of the low-balled students showed up to the Psychology Building at 7:00 AM as they had promised.
Do you recognize the similarities between the experiment and the sales situation?
The script of low-balling tends to be the same:
First, an advantage is offered that induces a favorable decision in the manipulator’s direction. Then, after the decision has been made, but before the bargain is sealed, the original advantage is deftly removed (i.e., the price is raised, the time is changed, etc.).
It would seem surprising that anyone would buy under these circumstances, yet many do. Often the self-created justifications provide so many new reasons for the decision that even when the dealer pulls away the original favorable rationale, like a low price, the decision is not changed. We stick with our old decision even in the face of new information!
Of course, not everyone complies, but that's not the point. The effect is strong enough to hold for a good number of buyers, students, or anyone else whose rate of compliance someone may want to raise.
The Way Out
The first real defense to consistency bias is awareness about the phenomenon and the harm a certain rigidity in our decisions can cause us.
Robert Cialdini suggests two approaches to recognizing when consistency biases are unduly creeping into our decision making. The first is to listen to our stomachs. Stomach signs appear when we realize that the request being pushed is something we don't want to do.
He recalls a time when a beautiful young woman tried to sell him a membership he most certainly did not need by using the tactics displayed above. He writes:
I remember quite well feeling my stomach tighten as I stammered my agreement. It was a clear call to my brain, “Hey, you’re being taken here!” But I couldn’t see a way out. I had been cornered by my own words. To decline her offer at that point would have meant facing a pair of distasteful alternatives: If I tried to back out by protesting that I was not actually the man-about-town I had claimed to be during the interview, I would come off a liar; trying to refuse without that protest would make me come off a fool for not wanting to save $1,200. I bought the entertainment package, even though I knew I had been set up. The need to be consistent with what I had already said snared me.
But eventually he came up with the perfect counterattack, one that allowed him to get out of such situations gracefully in later episodes.
Whenever my stomach tells me I would be a sucker to comply with a request merely because doing so would be consistent with some prior commitment I was tricked into, I relay that message to the requester. I don’t try to deny the importance of consistency; I just point out the absurdity of foolish consistency. Whether, in response, the requester shrinks away guiltily or retreats in bewilderment, I am content. I have won; an exploiter has lost.
The second approach concerns the signs felt in our heart of hearts and is best used when it is not really clear whether the initial commitment was wrongheaded.
Imagine you have recognized that your initial assumptions about a particular deal were not correct. The car is not extraordinarily cheap and the experiment is not as fun if you have to wake up at 6 AM to make it. Here it helps to ask one simple question:
“Knowing what I know, if I could go back in time, would I make the same commitment?”
Ask it frequently enough and the answer might surprise you.