The Availability Bias: How to Overcome a Common Cognitive Distortion

“The attention which we lend to an experience is proportional to its vivid or interesting character, and it is a notorious fact that what interests us most vividly at the time is, other things equal, what we remember best.” —William James

The availability heuristic explains why winning an award makes you more likely to win another award. It explains why we sometimes avoid one thing out of fear and end up doing something else that’s objectively riskier. It explains why governments spend enormous amounts of money mitigating risks we’ve already faced. It explains why the five people closest to you have a big impact on your worldview. It explains why mountains of data indicating something is harmful don’t necessarily convince everyone to avoid it. It explains why it can seem as if everything is going well when the stock market is up. And it explains why bad publicity can still be beneficial in the long run.

Here’s how the availability heuristic works, how to overcome it, and how to use it to your advantage.

***

How the availability heuristic works

Before we explain the availability heuristic, let’s quickly recap the field it comes from.

Behavioral economics is a field of study bringing together knowledge from psychology and economics to reveal how real people behave in the real world. This is in contrast to the traditional economic view of human behavior, which assumed people always behave in accordance with rational, stable interests. The field largely began in the 1960s and 1970s with the work of psychologists Amos Tversky and Daniel Kahneman.

Behavioral economics posits that people often make decisions and judgments under uncertainty using imperfect heuristics, rather than by weighing up all of the relevant factors. Quick heuristics enable us to make rapid decisions without taking the time and mental energy to think through all the details.

Most of the time, they lead to satisfactory outcomes. However, they can bias us towards certain consistently irrational decisions that contradict what economics would tell us is the best choice. We usually don’t realize we’re using heuristics, and they’re hard to change even if we’re actively trying to be more rational.

One such cognitive shortcut is the availability heuristic, first studied by Tversky and Kahneman in 1973. We tend to judge the likelihood and significance of things based on how easily they come to mind. The more “available” a piece of information is to us, the more important it seems. The result is that we give greater weight to information we learned recently: a news article read last night comes to mind more easily than a science class taken years ago. It’s simply too much work to comb through every piece of information that might be in our heads.

We also give greater weight to information that is shocking or unusual. Shark attacks and plane crashes strike us more than accidental drownings or car accidents, so we overestimate their odds.

If we’re presented with a set of similar things in which one differs from the rest, we’ll find the outlier easier to remember. For example, in the sequence of characters “RTASDT9RTGS,” the character most commonly remembered is the “9” because it stands out from the letters.

In Behavioral Law and Economics, Timur Kuran and Cass Sunstein write:

“Additional examples from recent years include mass outcries over Agent Orange, asbestos in schools, breast implants, and automobile airbags that endanger children. Their common thread is that people tended to form their risk judgments largely, if not entirely, on the basis of information produced through a social process, rather than personal experience or investigation. In each case, a public upheaval occurred as vast numbers of players reacted to each other’s actions and statements. In each, moreover, the demand for swift, extensive, and costly government action came to be considered morally necessary and socially desirable—even though, in most or all cases, the resulting regulations may well have produced little good, and perhaps even relatively more harm.”

Narratives are more memorable than disjointed facts. There’s a reason why cultures around the world teach important life lessons and values through fables, fairy tales, myths, proverbs, and stories.

Personal experience can also make information more salient. If you’ve recently been in a car accident, you may well view car accidents as more common in general than you did before. The base rates haven’t changed; you just have an unpleasant, vivid memory coming to mind whenever you get in a car. We too easily assume that our recollections are representative and true and discount events that are outside of our immediate memory. To give another example, you may be more likely to buy insurance against a natural disaster if you’ve just been impacted by one than you are before it happens.

Anything that makes something easier to remember increases its impact on us. In an early study, Tversky and Kahneman asked subjects whether a random English word is more likely to begin with “K” or to have “K” as its third letter. Because it’s typically easier to recall words by their first letter than by their third, people tended to assume the former was more common. The opposite is true.

In Judgment Under Uncertainty: Heuristics and Biases, Tversky and Kahneman write:

“…one may estimate probability by assessing availability, or associative distance. Lifelong experience has taught us that instances of large classes are recalled better and faster than instances of less frequent classes, that likely occurrences are easier to imagine than unlikely ones, and that associative connections are strengthened when two events frequently co-occur.

…For example, one may assess the divorce rate in a given community by recalling divorces among one’s acquaintances; one may evaluate the probability that a politician will lose an election by considering various ways in which he may lose support; and one may estimate the probability that a violent person will ‘see’ beasts of prey in a Rorschach card by assessing the strength of association between violence and beasts of prey. In all of these cases, the assessment of the frequency of a class or the probability of an event is mediated by an assessment of availability.”

They go on to write:

“That associative bonds are strengthened by repetition is perhaps the oldest law of memory known to man. The availability heuristic exploits the inverse form of this law, that is, it uses strength of association as a basis for the judgment of frequency. In this theory, availability is a mediating variable, rather than a dependent variable as is typically the case in the study of memory.”

***

How the availability heuristic misleads us

“People tend to assess the relative importance of issues by the ease with which they are retrieved from memory—and this is largely determined by the extent of coverage in the media.” —Daniel Kahneman, Thinking, Fast and Slow

To go back to the points made in the introduction of this post, winning an award can make you more likely to win another award because it gives you visibility, making your name come to mind more easily in connection to that kind of accolade. We sometimes avoid one thing in favor of something objectively riskier, like driving instead of flying, because plane crashes are more memorable than car accidents. The five people closest to you can have a big impact on your worldview because you frequently encounter their attitudes and opinions, bringing them to mind when you make your own judgments. Mountains of data indicating something is harmful don’t always convince people to avoid it if those dangers aren’t salient, such as if they haven’t personally experienced them. It can seem as if things are going well when the stock market is up because it’s a simple, visible, and therefore memorable indicator. Bad publicity can be beneficial in the long run if it means something, such as a controversial book, gets mentioned often and is more likely to be recalled.

These aren’t empirical rules, but they’re logical consequences of the availability heuristic, in the absence of mitigating factors.

We are what we remember, and our memories have a significant impact on our perception of the world. What we end up remembering is influenced by factors such as the following:

  • Our foundational beliefs about the world
  • Our expectations
  • The emotions a piece of information inspires in us
  • How many times we’re exposed to a piece of information
  • The source of a piece of information

There is no real link between how memorable something is and how likely it is to happen. In fact, the opposite is often true. Unusual events stand out more and receive more attention than commonplace ones. As a result, the availability heuristic skews our perception of risks in two key ways:

We overestimate the likelihood of unlikely events. And we underestimate the likelihood of likely events.

Overestimating the risk of unlikely events leads us to stay awake at night, turning our hair grey, worrying about things that have almost no chance of happening. We can end up wasting enormous amounts of time, money, and other resources trying to mitigate things that have, on balance, a small impact. Sometimes those mitigation efforts end up backfiring, and sometimes they make us feel safer than they should.

On the flipside, we can overestimate the chance of unusually good things happening to us. Looking at everyone’s highlights on social media, we can end up expecting our own lives to also be a procession of grand achievements and joys. But most people’s lives are mundane most of the time, and the highlights we see tend to be exceptional ones, not routine ones.

Underestimating the risk of likely events leads us to fail to prepare for predictable problems and occurrences. We’re so worn out from worrying about unlikely events that we don’t have the energy to think about what’s in front of us. And if we’re stressed and anxious much of the time, we’ll have a hard time noticing the warning signs of predictable problems when they really matter.

All of this is not to say that you shouldn’t prepare for the worst, or that unlikely things never happen (as Littlewood’s Law states, you can expect a one-in-a-million event about once a month). Rather, we should be careful about preparing only for the extremes just because those extremes are more memorable.

***

How to overcome the availability heuristic

Knowing about a cognitive bias isn’t usually enough to overcome it. Even people like Kahneman who have studied behavioral economics for many years sometimes struggle with the same irrational patterns. But being aware of the availability heuristic is helpful for the times when you need to make an important decision and can step back to make sure it isn’t distorting your view. Here are five ways of mitigating the availability heuristic.

#1. Always consider base rates when making judgments about probability.
The base rate of something is the average prevalence of it within a particular population. For example, around 10% of the population are left-handed. If you had to guess the likelihood of a random person being left-handed, you would be correct to say 1 in 10 in the absence of other relevant information. When judging the probability of something, look at the base rate whenever possible.
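As a rough illustration of how a base rate anchors a probability judgment, here is a minimal Python sketch. The 10% figure comes from the left-handedness example above; the group sizes are arbitrary, purely illustrative choices.

```python
# A minimal sketch of base-rate reasoning, using the ~10% left-handedness
# figure from the example above. Group sizes are arbitrary illustrations.

BASE_RATE_LEFT_HANDED = 0.10  # prevalence in the general population

def p_at_least_one_left_handed(group_size: int,
                               base_rate: float = BASE_RATE_LEFT_HANDED) -> float:
    """Probability that at least one person in a random group is left-handed,
    assuming independence and no information beyond the base rate."""
    return 1 - (1 - base_rate) ** group_size

# With no other information, the best guess for any one person is the base rate.
print(f"One random person: {BASE_RATE_LEFT_HANDED:.0%}")
for n in (2, 5, 10):
    print(f"At least one left-hander among {n} people: "
          f"{p_at_least_one_left_handed(n):.0%}")
```

The point is not the arithmetic itself but the habit: start from the base rate and adjust only when you have genuinely relevant additional information.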

#2. Focus on trends and patterns.
The mental model of regression to the mean teaches us that extreme events tend to be followed by more moderate ones. Outlier events are often the result of luck and randomness. They’re not necessarily instructive. Whenever possible, base your judgments on trends and patterns—the longer term, the better. Track record is everything, even if outlier events are more memorable.

#3. Take the time to think before making a judgment.
The whole point of heuristics is that they save the time and effort needed to parse a ton of information and make a judgment. But, as we always say, you can’t make a good decision without taking time to think. There’s no shortcut for that. If you’re making an important decision, the only way to get around the availability heuristic is to stop and go through the relevant information, rather than assuming whatever comes to mind first is correct.

#4. Keep track of information you might need to use in a judgment far off in the future.
Don’t rely on memory. In Judgment in Managerial Decision Making, Max Bazerman and Don Moore present the example of annual workplace performance appraisals. Managers tend to base their evaluations more on the prior three months than on the nine months before that. It’s much easier than remembering what happened over the course of an entire year. Managers also tend to give substantial weight to unusual one-off behavior, such as a serious mistake or notable success, without considering the overall trend. In this case, noting down observations on someone’s performance throughout the entire year would lead to a more accurate appraisal.

#5. Go back and revisit old information.
Even if you think you can recall everything important, it’s a good idea to go back and refresh your memory of relevant information before making a decision.

The availability heuristic is part of Farnam Street’s latticework of mental models.

Stop Preparing For The Last Disaster

When something goes wrong, we often strive to be better prepared if the same thing happens again. But the same disasters tend not to happen twice in a row. A more effective approach is simply to prepare to be surprised by life, instead of expecting the past to repeat itself.

***

If we want to become less fragile, we need to stop preparing for the last disaster.

When disaster strikes, we learn a lot about ourselves. We learn whether we are resilient, whether we can adapt to challenges and come out stronger. We learn what has meaning for us, we discover core values, and we identify what we’re willing to fight for. Disaster, if it doesn’t kill us, can make us stronger. Maybe we discover abilities we didn’t know we had. Maybe we adapt to a new normal with more confidence. And often we make changes so we will be better prepared in the future.

But better prepared for what?

After a particularly trying event, most people prepare for a repeat of whatever challenge they just faced. From the micro level to the macro level, we succumb to the availability bias and get ready to fight a war we’ve already fought. We learn that one lesson, but we don’t generalize that knowledge or expand it to other areas. Nor do we necessarily let the fact that a disaster happened teach us that disasters do, as a rule, tend to happen. Because we focus on the particulars, we don’t extrapolate what we learn into how we can better prepare for adversity in general.

We tend to have the same reaction to challenges, regardless of the scale of their impact on our lives.

Sometimes the impact is strictly personal. For example, our partner cheats on us, so we vow never to have that happen again and make changes designed to catch the next cheater before they get a chance; in future relationships, we let jealousy cloud everything.

But other times, the consequences are far-reaching and affect the social, cultural, and national narratives we are a part of. When terrorists use an airplane to attack our city, for example, we immediately increase security at airports so that planes can never again be used to do so much damage and kill so many people.

The changes we make may keep us safe from a repeat of those scenarios that hurt us. The problem is, we’re still fragile. We haven’t done anything to increase our resilience—which means the next disaster is likely to knock us on our ass.

Why do we keep preparing for the last disaster?

Disasters cause pain. Whether it’s emotional or physical, the hurt causes vivid and strong reactions. We remember pain, and we want to avoid it in the future through whatever means possible. The availability of memories of our recent pain informs what we think we should do to stop it from happening again.

This tendency, called the availability bias, has significant implications for how we react in the aftermath of disaster. Writing in The Legal Analyst: A Toolkit for Thinking about the Law about the information cascades this bias sets off, Ward Farnsworth says they “also help explain why it’s politically so hard to take strong measures against disasters before they have happened at least once. Until they occur they aren’t available enough to the public imagination to seem important; after they occur their availability cascades and there is an exaggerated rush to prevent the identical thing from happening again. Thus after the terrorist attacks on the World Trade Center, cutlery was banned from airplanes and invasive security measures were imposed at airports. There wasn’t the political will to take drastic measures against the possibility of nuclear or other terrorist attacks of a type that hadn’t yet happened and so weren’t very available.”

In the aftermath of a disaster, we want to be reassured of future safety. We lived through it, and we don’t want to do so again. By focusing on the particulars of a single event, however, we miss identifying the changes that will improve our chances of better outcomes next time. Yes, we don’t want any more planes to fly into buildings. But preparing for the last disaster leaves us just as underprepared for the next one.

What might we do instead?

We rarely take a step back and go beyond the pain to look at what made us so vulnerable to it in the first place. However, that’s exactly where we need to start if we really want to better prepare ourselves for future disaster. Because really, what most of us want is to not be taken by surprise again, caught unprepared and vulnerable.

The reality is that the same disaster is unlikely to happen twice. Your next lover is unlikely to hurt you in the same way your former one did, just as the next terrorist is unlikely to attack in the same way as their predecessor. If you want to make yourself less fragile in the face of great challenge, the first step is to accept that you are never going to know what the next disaster will be. Then ask yourself: How can I prepare anyway? What changes can I make to better face the unknown?

As Andrew Zolli and Ann Marie Healy explain in Resilience: Why Things Bounce Back, “surprises are by definition inevitable and unforeseeable, but seeking out their potential sources is the first step toward adopting the open, ready stance on which resilient responses depend.”

Giving serious thought to the range of possible disasters immediately makes you aware that you can’t prepare for all of them. But what are the common threads? What safeguards can you put in place that will be useful in a variety of situations? A good place to start is increasing your adaptability. The easier you can adapt to change, the more flexibility you have. More flexibility means having more options to deal with, mitigate, and even capitalize on disaster.

Another important mental tool is to accept that disasters will happen. Expect them. It’s not about walking around every day with your adrenaline pumped in anticipation; it’s about making plans assuming that they will get derailed at some point. So you insert backup systems. You create a cushion, moving away from razor-thin margins. You give yourself the optionality to respond differently when the next disaster hits.

Finally, we can find ways to benefit from disaster. Author and economist Keisha Blair, in Holistic Wealth, suggests that “building our resilience muscles starts with the way we process the negative events in our lives. Mental toughness is a prerequisite for personal growth and success.” She further writes, “adversity allows us to become better rounded, richer in experience, and to strengthen our inner resources.” We can learn from the last disaster how to grow and leverage our experiences to better prepare for the next one.

Predicting the Future with Bayes’ Theorem

In a recent podcast, we talked with professional poker player Annie Duke about thinking in probabilities, something good poker players do all the time. At the poker table or in life, it’s useful to think in probabilities rather than absolutes, based on all the information available to you. Doing so can improve your decisions and lead to better outcomes.

Probabilistic thinking leads you to ask yourself, how confident am I in this prediction? What information would impact this confidence?

Bayes’ Theorem

Bayes’ theorem is an accessible way of integrating probabilistic thinking into our lives. Thomas Bayes was an 18th-century English minister whose most famous work, “An Essay towards Solving a Problem in the Doctrine of Chances,” was brought to the attention of the Royal Society in 1763—two years after his death—by his friend Richard Price. The essay did not contain the theorem as we now know it but had the seeds of the idea. It looked at how we should adjust our estimates of probabilities when we encounter new data that influence a situation. Later work by the French scholar Pierre-Simon Laplace and others codified the theorem and developed it into a useful tool for thinking.

Knowing the exact math of probability calculations is not the key to understanding Bayesian thinking. More critical is your ability and willingness to assign probabilities of truth and accuracy to anything you think you know, and then to update those probabilities when new information comes in. Here is a short example, found in Investing: The Last Liberal Art, of how it works:

Let’s imagine that you and a friend have spent the afternoon playing your favorite board game, and now, at the end of the game, you are chatting about this and that. Something your friend says leads you to make a friendly wager: that with one roll of the die from the game, you will get a 6. Straight odds are one in six, a 16 percent probability. But then suppose your friend rolls the die, quickly covers it with her hand, and takes a peek. “I can tell you this much,” she says; “it’s an even number.” Now you have new information and your odds change dramatically to one in three, a 33 percent probability. While you are considering whether to change your bet, your friend teasingly adds: “And it’s not a 4.” With this additional bit of information, your odds have changed again, to one in two, a 50 percent probability. With this very simple example, you have performed a Bayesian analysis. Each new piece of information affected the original probability, and that is Bayesian [updating].
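To make the updating in that passage concrete, here is a minimal Python sketch that simply enumerates the faces of the die and recomputes the chance of a 6 as each piece of information arrives. It reproduces the 1/6, 1/3, and 1/2 figures from the example above.

```python
from fractions import Fraction

# The six equally likely faces of a fair die.
faces = [1, 2, 3, 4, 5, 6]

def p_six(candidates):
    """Chance of a 6, given the faces still consistent with what we know."""
    return Fraction(sum(1 for f in candidates if f == 6), len(candidates))

print(p_six(faces))          # 1/6 -- no information yet
even = [f for f in faces if f % 2 == 0]
print(p_six(even))           # 1/3 -- "it's an even number"
even_not_four = [f for f in even if f != 4]
print(p_six(even_not_four))  # 1/2 -- "and it's not a 4"
```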

Both Nate Silver and Eliezer Yudkowsky have written about Bayes’ theorem in the context of medical testing, specifically mammograms. Imagine you live in a country with 100 million women under 40. Past trends have revealed that there is a 1.4% chance of a woman under 40 in this country getting breast cancer—so roughly 1.4 million women.

Mammograms will detect breast cancer 75% of the time. They will give false positives—saying a woman has breast cancer when she actually doesn’t—about 10% of the time. At first, you might focus just on the mammogram numbers and assume that a 75% detection rate means a positive result is very bad news. Let’s do the math.

If all the women under 40 get mammograms, the 10% false positive rate will give roughly 10 million women under 40 the news that they have breast cancer. But because you know the first statistic, that only about 1.4 million women under 40 actually get breast cancer, you know that at least 8.6 million of the women who tested positive are not actually going to have breast cancer!
That’s a lot of needless worrying, which leads to a lot of needless medical care. To remedy this poor understanding and make better decisions about using mammograms, we absolutely must consider prior knowledge when we look at the results, and try to update our beliefs with that knowledge in mind.
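Running the stated figures through Bayes’ theorem makes the point explicit. The sketch below assumes the numbers quoted above (1.4% prevalence, 75% detection rate, 10% false positive rate) and shows that a woman who tests positive still has only roughly a one-in-ten chance of actually having breast cancer.

```python
# A minimal Bayes' theorem sketch using the figures quoted above.
# These are the article's illustrative numbers, not clinical data.

prior = 0.014           # P(cancer): 1.4% of women under 40
sensitivity = 0.75      # P(positive | cancer): mammograms detect 75% of cases
false_positive = 0.10   # P(positive | no cancer): 10% false positive rate

# Total probability of a positive result, over both possibilities.
p_positive = prior * sensitivity + (1 - prior) * false_positive

# Bayes' theorem: P(cancer | positive) = P(positive | cancer) * P(cancer) / P(positive)
posterior = prior * sensitivity / p_positive

print(f"P(cancer | positive test) = {posterior:.1%}")  # about 9.6%
```

In a population of 100 million women, that works out to roughly 1 million true positives against nearly 10 million false positives, which is why ignoring the prior makes a positive result look far more alarming than it is.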

Weigh the Evidence

Often we ignore prior information, simply called “priors” in Bayesian-speak. We can blame this habit in part on the availability heuristic—we focus on what’s readily available. In this case, we focus on the newest information, and the bigger picture gets lost. We fail to adjust the probability of old information to reflect what we have learned.

The big idea behind Bayes’ theorem is that we must continuously update our probability estimates on an as-needed basis. In his book The Signal and the Noise, Nate Silver gives a contemporary example, reminding us that new information is often most useful when we put it in the larger context of what we already know:

Bayes’ theorem is an important reality check on our efforts to forecast the future. How, for instance, should we reconcile a large body of theory and evidence predicting global warming with the fact that there has been no warming trend over the last decade or so? Skeptics react with glee, while true believers dismiss the new information.

A better response is to use Bayes’ theorem: the lack of recent warming is evidence against recent global warming predictions, but it is weak evidence. This is because there is enough variability in global temperatures to make such an outcome unsurprising. The new information should reduce our confidence in our models of global warming—but only a little.

The same approach can be used in anything from an economic forecast to a hand of poker, and while Bayes’ theorem can be a formal affair, Bayesian reasoning also works as a rule of thumb. We tend to either dismiss new evidence, or embrace it as though nothing else matters. Bayesians try to weigh both the old hypothesis and the new evidence in a sensible way.

Limitations of the Bayesian Approach

Don’t walk away thinking the Bayesian approach will enable you to predict everything! In addition to seeing the world as an ever-shifting array of probabilities, we must also remember the limitations of inductive reasoning.

A high probability of something being true is not the same as saying it is true. Consider this example from Bertrand Russell’s The Problems of Philosophy:

A horse which has been often driven along a certain road resists the attempt to drive him in a different direction. Domestic animals expect food when they see the person who usually feeds them. We know that all these rather crude expectations of uniformity are liable to be misleading. The man who has fed the chicken every day throughout its life at last wrings its neck instead, showing that more refined views as to the uniformity of nature would have been useful to the chicken.

In the final analysis, though, picking up Bayesian reasoning can change your life, as observed in this Big Think video by Julia Galef of the Center for Applied Rationality:

After you’ve been steeped in Bayes’ rule for a little while, it starts to produce some fundamental changes to your thinking. For example, you become much more aware that your beliefs are grayscale. They’re not black and white and that you have levels of confidence in your beliefs about how the world works that are less than 100 percent but greater than zero percent and even more importantly as you go through the world and encounter new ideas and new evidence, that level of confidence fluctuates, as you encounter evidence for and against your beliefs.

So be okay with uncertainty, and use it to your advantage. Instead of holding on to outdated beliefs by rejecting new information, take in what comes your way through a system of evaluating probabilities.

Bayes’ Theorem is part of the Farnam Street latticework of mental models. Still Curious? Read Bayes and Deadweight: Using Statistics to Eject the Deadweight From Your Life next. 

The Art of Thinking Clearly

Rolf Dobelli’s book, The Art of Thinking Clearly, is a compendium of systematic errors in decision making. While the list of fallacies is not complete, it’s a great launching pad into the best of what others have already figured out.

To avoid frivolous gambles with the wealth I had accumulated over the course of my literary career, I began to put together a list of … systematic cognitive errors, complete with notes and personal anecdotes — with no intention of ever publishing them. The list was originally designed to be used by me alone. Some of these thinking errors have been known for centuries; others have been discovered in the last few years. Some came with two or three names attached to them. … Soon I realized that such a compilation of pitfalls was not only useful for making investing decisions but also for business and personal matters. Once I had prepared the list, I felt calmer and more levelheaded. I began to recognize my own errors sooner and was able to change course before any lasting damage was done. And, for the first time in my life, I was able to recognize when others might be in the thrall of these very same systematic errors. Armed with my list, I could now resist their pull — and perhaps even gain an upper hand in my dealings.

Dobelli’s goal is to learn to recognize and evade the biggest errors in thinking. In so doing, he believes we might “experience a leap in prosperity. We need no extra cunning, no new ideas, no unnecessary gadgets, no frantic hyperactivity—all we need is less irrationality.”

Let’s take a look at some of the content.

Guarding Against Survivorship Bias

People systematically overestimate their chances of success. Guard against it by frequently visiting the graves of once-promising projects, investments, and careers. It is a sad walk but one that should clear your mind.

Pattern Recognition

When it comes to pattern recognition, we are oversensitive. Regain your scepticism. If you think you have discovered a pattern, first consider it pure chance. If it seems too good to be true, find a mathematician and have the data tested statistically.

Fighting Against Confirmation Bias

[T]ry writing down your beliefs — whether in terms of worldview, investments, marriage, health care, diet, career strategies — and set out to find disconfirming evidence. Axing beliefs that feel like old friends is hard work but imperative.

Dating Advice and Contrast

If you are seeking a partner, never go out in the company of your supermodel friends. People will find you less attractive than you really are. Go alone or, better yet, take two ugly friends.

Think Different

Fend off the availability bias by spending time with people who think differently than you do—people whose experiences and expertise are different from yours.

Guard Against Chauffeur Knowledge

Be on the lookout for chauffeur knowledge. Do not confuse the company spokesperson, the ringmaster, the newscaster, the schmoozer, the verbiage vendor, or the cliche generator with those who possess true knowledge. How do you recognize the difference? There is a clear indicator: True experts recognize the limits of what they know and what they do not know.

The Swimmer’s Body Illusion

Professional swimmers don’t have perfect bodies because they train extensively. Rather, they are good swimmers because of their physiques. How their bodies are designed is a factor for selection and not the result of their activities. … Whenever we confuse selection factors with results, we fall prey to what Taleb calls the swimmer’s body illusion. Without this illusion, half of advertising campaigns would not work. But this bias has to do with more than just the pursuit of chiseled cheekbones and chests. For example, Harvard has the reputation of being a top university. Many highly successful people have studied there. Does this mean that Harvard is a good school? We don’t know. Perhaps the school is terrible, and it simply recruits the brightest students around.

Peer Pressure

A simple experiment, carried out in the 1950s by legendary psychologist Solomon Asch, shows how peer pressure can warp common sense. A subject is shown a line drawn on paper, and next to it three lines—numbered 1, 2, and 3—one shorter, one longer, and one the same length as the original one. He or she must indicate which of the three lines corresponds to the original one. If the person is alone in the room, he gives correct answers because the task is really quite simple. Now five other people enter the room; they are all actors, which the subject does not know. One after another, they give wrong answers, saying “number 1,” although it’s very clear that number 3 is the correct answer. Then it is the subject’s turn again. In one-third of cases, he will answer incorrectly to match the other people’s responses.

Rational Decision Making and The Sunk Cost Fallacy

The sunk cost fallacy is most dangerous when we have invested a lot of time, money, energy, or love in something. This investment becomes a reason to carry on, even if we are dealing with a lost cause. The more we invest, the greater the sunk costs are, and the greater the urge to continue becomes. … Rational decision making requires you to forget about the costs incurred to date. No matter how much you have already invested, only your assessment of the future costs and benefits counts.

Avoid Negative Black Swans

But even if you feel compelled to continue as such, avoid surroundings where negative Black Swans thrive. This means: Stay out of debt, invest your savings as conservatively as possible, and get used to a modest standard of living—no matter whether your big breakthrough comes or not.

Disconfirming Evidence

As Charlie Munger advises, actively work to destroy your own best-loved ideas.

The confirmation bias is alive and well in the business world. One example: An executive team decides on a new strategy. The team enthusiastically celebrates any sign that the strategy is a success. Everywhere the executives look, they see plenty of confirming evidence, while indications to the contrary remain unseen or are quickly dismissed as “exceptions” or “special cases.” They have become blind to disconfirming evidence.

Still curious?

Read The Art of Thinking Clearly.

How You Can Instantly Improve Your Relationship

Most of us see what we want to see.

If we’re arguing with a spouse, we’re going to start seeing all of their faults. After all, it’s not my fault; it’s your fault. Once we’ve labeled someone as, say, selfish, the label becomes self-reinforcing thanks to the availability and confirmation biases. Our views become so clouded that we can’t appreciate our partner’s positive attributes.

Not only do we search for information that agrees with us, but we also fail to notice anything to the contrary. “We see,” writes Aaron Beck in his book Love Is Never Enough, “each other through the bias of negative frames.”

There is a way for couples to fight the tendency to only notice what’s wrong: keep a “marriage diary,” with a list of all the things your partner does that you like.

In his book, Beck describes a couple, Karen and Ted, who are having marriage troubles. Beck suggested they keep a marriage diary.

After I proposed to Karen and Ted that each take notes of what the other did that was pleasing during the previous week, Karen reported the following:

Ted was great. I was really upset by some of my clients. They are a real pain. … Anyhow, I told Ted about it. He was very sympathetic. He didn’t try to tell me what to do. He said that if he was in my position, he would probably feel frustrated, too. He said that my clients are tough to deal with. I felt a lot better.

Each of Ted’s actions pleased Karen, who remarked, “They were like presents.” Although Ted had done similar things for Karen in the past, they had been erased from her memory because of the negative view of Ted.

The same was also true for Ted.

Psychologist Mark Kane Goldstein has used this method to help husbands and wives keep “track of their partner’s pleasant actions.”

Each spouse is given several sheets of graph paper on which to record whatever his or her partner does that is pleasing. The spouse rates these acts on a ten-point scale, indicating degree of satisfaction. Dr. Goldstein found that 70 percent of the couples who tried this simple method reported an improvement in their relationship.

Simply by shifting their focus away from the negative and onto little pleasures, couples became more aware of their satisfaction.

Psychologists call this “considering the opposite.”

The Evolutionary Roots of Human Behaviour

Anthony Gottlieb writing in the New Yorker:

Indeed, the guilty secret of psychology and of behavioral economics is that their experiments and surveys are conducted almost entirely with people from Western, industrialized countries, mostly of college age, and very often students of psychology at colleges in the United States. This is particularly unfortunate for evolutionary psychologists, who are trying to find universal features of our species. American college kids, whatever their charms, are a laughable proxy for Homo sapiens. The relatively few experiments conducted in non-Western cultures suggest that the minds of American students are highly unusual in many respects, including their spatial cognition, responses to optical illusions, styles of reasoning, coöperative behavior, ideas of fairness, and risk-taking strategies. Joseph Henrich and his colleagues at the University of British Columbia concluded recently that U.S. college kids are “one of the worst subpopulations one could study” when it comes to generalizing about human psychology. Their main appeal to evolutionary psychologists is that they’re readily available. Man’s closest relatives are all long extinct; breeding experiments on humans aren’t allowed (they would take far too long, anyway); and the mental life of our ancestors left few fossils.

He concludes:

Barash muses, at the end of his book, on the fact that our minds have a stubborn fondness for simple-sounding explanations that may be false. That’s true enough, and not only at bedtime. It complements a fondness for thinking that one has found the key to everything. Perhaps there’s an evolutionary explanation for such proclivities.

Still curious? Check out Barash’s book, Homo Mysterious: Evolutionary Puzzles of Human Nature, for yourself.