Tag: Dan Ariely

Dan Ariely on How and Why We Cheat

Three years ago, Dan Ariely, a psychology and behavioral economics professor at Duke, put out a book called The (Honest) Truth About Dishonesty: How We Lie to Everyone–Especially Ourselves. I read the book back closer to when it was released, and I recently revisited it to see how it held up to my initial impressions.

It was even better. In fact, this is one of the most useful books I have ever come across, and my copy is now marked, flagged, and underlined. Let’s get in deep.

Why We Cheat

We’re Cheaters All

Dan is both an astute researcher and a good writer; he knows how to get to the point, and his points matter. His books, which include Predictably Irrational and The Upside of Irrationality, are not filled with fluff. We’ve mentioned his demonstrations of pluralistic ignorance here before.

In The Honest Truth, Ariely doesn’t just explore where cheating comes from; he digs into which situations make us more likely to cheat than others. Those discussions are what make the book eminently practical rather than just a meditation on cheating. It’s a how-to guide on our own dishonesty.

Ariely was led down that path because of a friend of his who had worked with Enron:

It was, of course, possible that John and everyone else involved with Enron was deeply corrupt, but I began to think that there may have been a different type of dishonesty at work–one that relates more to wishful blindness and is practiced by people like John, you, and me. I started wondering if the problem of dishonesty goes deeper than just a few bad apples and if this kind of wishful blindness takes place in other companies as well. I also wondered if my friends and I would have behaved similarly if we had been the ones consulting for Enron.

This is a beautiful setup that led him to a lot of interesting conclusions in his years of subsequent research. Here’s (some of) what Dan found.

  1. Cheating was standard, but only a little. Ariely and his co-researchers ran the same experiment in many different variations, and with many different topics to investigate. Nearly every time, he found evidence of a standard level of cheating. In other experiments, the outcome was the same. A little cheating was everywhere. People generally did not grab all they could, but only as much as they could justify psychologically.
  2. Increasing the cheating reward or moderately altering the risk of being caught didn’t affect the outcomes much. In Ariely’s experiments, the cheating stayed steady: a little bit of stretching every time.
  3. The more abstracted from the cheating we are, the more we cheat. This was an interesting one–it turns out the less “connected” we feel to our dishonesty, the more we’re willing to do it. This ranges from being more willing to cheat to earn tokens exchangeable for real money than to earn actual money, to being more willing to “tap” a golf ball to improve its lie than actually pick it up and move it with our hands.
  4. A nudge not to cheat works better before we cheat than after. In other words, we need to strengthen our morals just before we’re tempted to cheat, not after. And even more interesting, when Ariely took his findings to the IRS and other organizations who could benefit from being cheated less, they barely let him in the door! The incentives in organizations are interesting.
  5. We think we’re more honest than everyone else. Ariely showed this pretty conclusively by studying golfers and asking them how much they thought others cheated and how much they thought they cheated themselves. It was a rout: They consistently underestimated their own dishonesty versus others’. I wasn’t surprised by this finding.
  6. We underestimate how blinded we can become to incentives. In a brilliant chapter called “Blinded by our Motivations,” Ariely discusses how incentives skew our judgment and our moral compass. He shows how pharma reps are masters of this game–and yet we allow it to continue. If we take Ariely seriously, the laws against conflicts of interest need to be stronger.
  7. Related to (6), disclosure does not seem to decrease incentive-caused bias. This reminds me of Charlie Munger’s statement, “I think I’ve been in the top 5% of my age cohort all my life in understanding the power of incentives, and all my life I’ve underestimated it. Never a year passes that I don’t get some surprise that pushes my limit a little farther.” Ariely has discussed incentive-caused bias in teacher evaluation before.
  8. We cheat more when our willpower is depleted. This doesn’t come as a total surprise: Ariely found that when we’re tired and have exerted a lot of mental or physical energy, especially in resisting other temptations, we tend to increase our cheating. (Or perhaps more accurately, decrease our non-cheating.)
  9. We cheat ourselves, even if we have direct incentive not to. Ariely was able to demonstrate that even with a strong financial incentive to honestly assess our own abilities, we still think we cheat less than we do, and we hurt ourselves in the process.
  10. Related to (9), we can delude ourselves into believing we were honest all along. This goes to show the degree to which our cheating can damage us as much as it damages others. Ariely also discusses how good we are at pounding our own conclusions into our brains even if no one else is being persuaded, as Munger has mentioned before.
  11. We cheat more when we believe the world “owes us one.” This section of the book should feel disturbingly familiar to anyone. When we feel like we’ve been cheated or wronged “over here,” we let the universe make it up to us “over there.” (By cheating, of course.) Think about the last time you got cut off in traffic, stiffed on proper change, and then unloaded on by your boss. Didn’t you feel more comfortable reaching for what wasn’t yours afterwards? Only fair, right?
  12. Unsurprisingly, cheating has a social contagion aspect. If we see someone who we identify with and whose group we feel we belong to cheating, it makes us (much) more likely to cheat. This has wide-ranging social implications.
  13. Finally, nudging helps us cheat less. If we’re made more aware of our moral compass through specific types of reminders and nudges, we can decrease our own cheating. Perhaps most important is to keep ourselves out of situations where we’ll be tempted to cheat or act dishonestly, and to take pre-emptive action if it’s unavoidable.

There’s much more in the book, and we highly recommend you read it for that as well as Dan’s general theory on cheating. The final chapter on the steps that old religions have taken to decrease dishonesty among their followers is a fascinating bonus. (Reminded me of Nassim Taleb’s retort that heavy critics of religion, like Dawkins, take it too literally and under-appreciate the social value of its rules and customs. It’s also been argued that religion has an evolutionary basis.)

Check out the book, and while you’re at it, pick up his other two: Predictably Irrational and The Upside of Irrationality.

Pluralistic Ignorance: You’re Not Alone

“If everyone is thinking alike, then somebody isn’t thinking.”

— George S. Patton

Imagine you’re in a meeting with a lot of important people. The boss comes in, takes a seat, and starts talking “strategic market knowledge” this and “leveraging competitive advantages” that.

It all sounds like gibberish to you. It doesn’t mean anything.

For a second you wonder if you’re in the right meeting. Surely someone else must feel as confused as you?

So you take a quick sanity check. You look around the room at your colleagues and … what??

They are paying attention and nodding their heads in total agreement? How can this be?

They must know something you don’t know.

You quickly determine the best option is to keep your mouth shut and say nothing, hiding what you think is your own ignorance. But you’re not alone. Everyone is thinking the same thing.

Pluralistic Ignorance

The word for this is pluralistic ignorance, a psychological state characterized by the belief that one’s private thoughts are different from those of others. The term was coined in 1932 by psychologists Daniel Katz and Floyd Allport and describes the common group situation where we privately believe one thing, but feel everyone else in the group believes something else.

In the case above, pluralistic ignorance means that rather than interrupting the meeting to ask for clarification, we’ll sit tight and nod like everyone else. It’s a real-life version of The Emperor’s New Clothes, the fairy tale in which everyone pretends the emperor is wearing clothes until a child points out that he isn’t.

When You Think You’re Alone

If you scratch below the surface, though, what’s really happening with pluralistic ignorance is that you’re not accurately assessing how others in your group are thinking. This happens most when you feel you hold a view that isn’t shared by a large percentage of other people.

In this short video, Dan Ariely explains and demonstrates pluralistic ignorance better than I can. Make sure you watch the whole thing; the kicker is at the end.

Basically we look toward others for cues about how to act when we really should take a page out of Richard Feynman’s book: What Do You Care What Other People Think?

The (Honest) Truth About Dishonesty: How We Lie to Everyone—Especially Ourselves

In his book, The (Honest) Truth About Dishonesty: How We Lie to Everyone—Especially Ourselves, Dan Ariely attempts to answer the question: “is dishonesty largely restricted to a few bad apples or is it a more widespread problem?”

He concludes that we’re mostly honest as long as the conditions are right:

We are going to take things from each other if we have a chance … many people need controls around them for them to do the right thing. … [T]he locksmith told Peter that locks are on doors only to keep honest people honest. “One percent of people will always be honest and never steal,” the locksmith said. “Another one percent will always be dishonest and always try to pick your lock and steal your television. And the rest will be honest as long as the conditions are right—but if they are tempted enough, they’ll be dishonest too. Locks won’t protect you from the thieves, who can get in your house if they really want to. They will only protect you from the mostly honest people who might be tempted to try your door if it had no lock.”

We’re okay with cheating, as long as it’s just a little and goes unnoticed.

as long as we cheat by only a little bit, we can benefit from cheating and still view ourselves as marvelous human beings. This balancing act is the process of rationalization, and it is the basis of what we’ll call the fudge factor theory.

Something that stood out for me was the chapter on the relationship between creativity and dishonesty. According to Ariely, the link between creativity and dishonesty is not as straightforward as we might think — the more creative we are, the better we are at rationalising dishonest behavior.

We may not always know exactly why we do what we do, choose what we choose, or feel what we feel. But the obscurity of our real motivations doesn’t stop us from creating perfectly logical-sounding reasons for our actions, decisions, and feelings.

… We all want explanations for why we behave as we do and for the ways the world around us functions. Even when our feeble explanations have little to do with reality. We’re storytelling creatures by nature, and we tell ourselves story after story until we come up with an explanation that we like and that sounds reasonable enough to believe. And when the story portrays us in a more glowing and positive light, so much the better.

We don’t make rational decisions. Our choices are (mostly) not based on explicit, thought-through preferences. Rather, we follow our intuition and use “mental gymnastics” to justify our actions. Conveniently, this allows us to get what we want and maintain our ego. We tell ourselves that we are acting rationally. The real difference Ariely found between more and less creative people is the creativity of their justifications. “The more creative we are,” he writes, “the more we are able to come up with good stories that help us justify our selfish interests.”


The idea that worries Ariely the most is the trend toward cashless payments. “From all the research I have done over the years,” he writes, “the idea that worries me the most is that the more cashless our society becomes, the more our moral compass slips.”

One factor that Ariely didn’t contemplate, and that I think is important, is how our environment — whether one of abundance or scarcity — affects our moral compass. Intuitively, I think it’s a lot easier to rationalise moral transgressions in an environment of scarcity than in one of abundance.

“Essentially, we cheat up to the level that allows us to retain our self-image as reasonably honest individuals.”

— Dan Ariely

The (Honest) Truth About Dishonesty: How We Lie to Everyone—Especially Ourselves is worth reading in its entirety.

The Difference Between Persuade, Convince, and Coerce

The difference is worth understanding.

In a recent Slate article, K.C. Cole writes:

Persuasion requires understanding. Coercion requires only power. We usually equate coercion with obvious force, but sometimes it’s far more subtle. If you want people to stop smoking, for example, you don’t need to make it illegal; you can simply make smoking expensive (raise taxes) or offer bribes (lower health insurance premiums). Both are still coercive in that the power to give or take away resides entirely in the hands of the “coercer.”

Persuasion is fundamentally different because it relies on understanding what smoking does to the human body. Someone who’s persuaded of its dangers has an incentive to stop that’s entirely independent of anyone else’s actions.

I agree that coercion involves the use of (or the threat of) force.

Where I disagree — and where this gets slightly murky — is that I don’t think you need to fully understand something (at least at a conscious level) to be persuaded to act. That assumes persuasion is rational.

I think you are persuaded by appeals to the irrational — emotions, psychology, and imagination.

Understanding something (e.g., what smoking does to the human body) largely comes from facts or arguments that appeal to intellect. When I get you to do something based on facts and reason, I’m convincing you to act, which is different from persuading you to act.

Seth Godin devised an interesting heuristic to think about this — “Engineers convince. Marketers persuade.”

Cole continues:

It’s a distinction I think about often in teaching. If I get students to do things a certain way for fear of getting an F or hopes of getting an A, it means I’ve influenced their behavior for the duration of the class. If I’ve managed to persuade them that my method has merit, I’ve likely made converts for life.

Cole argues that you can be coerced into doing something for the duration of a class, yet persuaded by merit to do it for life. That’s an appealing argument, but it’s flawed.

If you’re getting someone to do something by appealing to merit, then you’re appealing to intellect and reason, not emotions or imagination — that’s not persuading them; it’s convincing them.

While Cole’s appeal to reason is morally better than coercion, I doubt it alone would create a lifelong change in the students. Such a successful outcome (changing behavior for life) would likely be the result of a confluence of factors, not just one.

If I’m trying to get you to do something, there are a number of possible end states (for simplicity, I’ll set coercion aside). You can be (1) convinced of something but not take action (e.g., I convince you that smoking is bad for you, yet you fail to quit); (2) convinced of something and take action (e.g., I convince you smoking is bad and you quit); (3) convinced and persuaded (e.g., maybe you were in camp #1, but now I’ve persuaded you to act); (4) unpersuaded and unconvinced; or (5) unconvinced yet persuaded to act.

I think Cole convinced but didn’t persuade the students (#2).

I looked up ‘persuade/convince’ in my copy of Garner’s Modern American Usage. The entry reads:

persuade; convince. In the best usage, one persuades another to do something but convinces another of something.

Of course, coming from a usage dictionary, you also get usage instructions:

Avoid convince to—the phrasing *she convinced him to resign is traditionally viewed as less good than she persuaded him to resign.

But that means that you can never be convinced to do something – only persuaded. I don’t agree.

I think Seth Godin is closer to the mark. He points out:

Persuasion appeals to the emotions and to fear and to the imagination. Convincing requires a spreadsheet or some other rational device.

You can convince someone to do something based on reason. You can coerce someone to do something under threat. The way to persuade someone, however, is to appeal to their emotions.

The hardest thing to do is convince someone they’re wrong. If you find yourself in this circumstance, attempt to persuade them.

It’s easier to persuade someone if you’ve convinced them, and it’s easier to convince them if you’ve persuaded them.

Persuading > Convincing > Coercion

Ideally you want to convince and persuade.

Happy Holidays!

 

Everyone Lies. Dan Ariely Explains Why

Research shows that nearly everyone cheats a little if given the opportunity. Dan Ariely, author of the new book, “The (Honest) Truth About Dishonesty,” explains why.

Over the past decade or so, my colleagues and I have taken a close look at why people cheat, using a variety of experiments and looking at a panoply of unique data sets—from insurance claims to employment histories to the treatment records of doctors and dentists. What we have found, in a nutshell: Everybody has the capacity to be dishonest, and almost everybody cheats—just by a little. Except for a few outliers at the top and bottom, the behavior of almost everyone is driven by two opposing motivations. On the one hand, we want to benefit from cheating and get as much money and glory as possible; on the other hand, we want to view ourselves as honest, honorable people. Sadly, it is this kind of small-scale mass cheating, not the high-profile cases, that is most corrosive to society.

Knowing that most people cheat—but just by a little—the next logical question is what makes us cheat more or less.

One thing that increased cheating in our experiments was making the prospect of a monetary payoff more “distant,” in psychological terms.

and

Another thing that boosted cheating: Having another student in the room who was clearly cheating. … Other factors that increased the dishonesty of our test subjects included knowingly wearing knockoff fashions, being drained from the demands of a mentally difficult task and thinking that “teammates” would benefit from one’s cheating in a group version of the matrix task. These factors have little to do with cost-benefit analysis and everything to do with the balancing act that we are constantly performing in our heads. If I am already wearing fake Gucci sunglasses, then maybe I am more comfortable pushing some other ethical limits (we call this the “What the hell” effect). If I am mentally depleted from sticking to a tough diet, how can you expect me to be scrupulously honest? (It’s a lot of effort!) If it is my teammates who benefit from my fudging the numbers, surely that makes me a virtuous person!

What, then—if anything—pushes people toward greater honesty?

… simply being reminded of moral codes has a significant effect on how we view our own behavior.

Inspired by the thought, my colleagues and I ran an experiment at the University of California, Los Angeles. We took a group of 450 participants, split them into two groups and set them loose on our usual matrix task. We asked half of them to recall the Ten Commandments and the other half to recall 10 books that they had read in high school. Among the group who recalled the 10 books, we saw the typical widespread but moderate cheating. But in the group that was asked to recall the Ten Commandments, we observed no cheating whatsoever. We reran the experiment, reminding students of their schools’ honor codes instead of the Ten Commandments, and we got the same result. We even reran the experiment on a group of self-declared atheists, asking them to swear on a Bible, and got the same no-cheating results yet again.

This experiment has obvious implications for the real world. While ethics lectures and training seem to have little to no effect on people, reminders of morality—right at the point where people are making a decision—appear to have an outsize effect on behavior.

One key takeaway:

All of this means that, although it is obviously important to pay attention to flagrant misbehaviors, it is probably even more important to discourage the small and more ubiquitous forms of dishonesty—the misbehavior that affects all of us, as both perpetrators and victims.

***

Still curious? This piece is adapted from his forthcoming book, The (Honest) Truth About Dishonesty: How We Lie to Everyone—Especially Ourselves.


Five Book Recommendations from Dan Ariely on Behavioural Economics

Dan Ariely, professor of psychology and behavioural economics, says we can all be more aware of our surroundings and our decision-making process. He suggests the following five books:

The Invisible Gorilla

We think we see with our eyes, but the reality is that we largely see with our brains. Our brain is a master at giving us what we expect to see. It’s all about expectation, and when things violate expectation we are just unaware of them. We go around the world with a sense that we pay attention to lots of things. The reality is that we notice much less than we think. And if we notice so much less than we think, what does that mean about our ability to figure out things around us, to learn and improve? It means we have a serious problem. I think this book has done a tremendous job in showing how even in vision, which is such a good system in general, we are poorly tooled to make good decisions.

Mindless Eating

This is one of my favourite books. He [Brian Wansink] takes many of these findings about decision-making and shows how they work in the domain of food. Food is tangible, so it helps us understand the principles.

The Person and the Situation

This is an oldie but a goodie. It’s a book that shows how when we make decisions, we think personality plays a big role. “I’m the kind of person who does this, or I’m the kind of person who does that.” The reality is that the environment in which we make decisions determines a lot of what we do. Mindless Eating is also about that – how the food environment affects us. Nudge is also about that – how we can actually design the environment or external influences to make better decisions. But The Person and the Situation was the first book to articulate how we think we are making decisions, when the reality is that the environment around us has a lot to do with it.

Influence

The Cialdini book is very important because it covers a range of ways in which we end up doing things, and how we don’t understand why we’re doing them. It also shows you how much other people have control, at the end of the day, over our actions. Both of these elements are crucial. The book is becoming even more important these days.

Nudge

One of the reasons Nudge is so important is because it’s taking these ideas and applying them to the policy domain. Here are the mistakes we make. Here are the ways marketers are trying to influence us. Here’s the way we might be able to fight back. If policymakers understood these principles, what could they do? The other important thing about the book is that it describes, in detail, small interventions. It’s basically a book about cheap persuasion.

Dan Ariely is the best-selling author of The Upside of Irrationality: The Unexpected Benefits of Defying Logic at Work and at Home and Predictably Irrational, Revised and Expanded Edition: The Hidden Forces That Shape Our Decisions.
