James March: The Ambiguities of Experience

In his book, The Ambiguities of Experience, James March explores the role of experience in creating intelligence.

Folk wisdom both trumpets the significance of experience and warns of its inadequacies.

On one hand, experience is thought to be the best teacher. On the other hand, experience is described as the teacher of fools, of those unable or unwilling to learn from accumulated knowledge. There is no need to learn everything yourself.

The disagreement between those aphorisms reflects profound questions about the human pursuit of intelligence through learning from experience.

“Since experience in organizations often suffers from weak signals, substantial noise, and small samples, it is quite likely that realized history will deviate considerably from underlying reality.”

— James March

March convincingly argues that although individuals and organizations are eager to derive intelligence from experience, the inferences stemming from that eagerness are often misguided.

The problems lie partly in errors in how people think, but even more so in properties of experience that confound learning from it. ‘Experience,’ March concludes, ‘may possibly be the best teacher, but it is not a particularly good teacher.’

Here are some of my notes from the book:

  • Intelligence normally entails two interrelated but somewhat different components. The first involves effective adaptation to an environment; the second, the elegance of interpretations of the experiences of life.
  • Agencies write standards because experience is a poor teacher.
  • Constant exposure to danger without its realization leaves human beings less concerned about what once terrified them. Experience can thus have the paradoxical effect of leading people to feel more immune than they should to the unlikely dangers that surround them.
  • Generating an explanation of history involves transforming the ambiguities and complexities of experience into a form that is elaborate enough to elicit interest, simple enough to be understood, and credible enough to be accepted. The art of storytelling involves a delicate balancing of those three criteria.
  • Humans have limited capabilities to store and recall history. They are sensitive to reconstructed memories that serve current beliefs and desires. They conserve belief by being less critical of evidence that seems to confirm prior beliefs than of evidence that seems to disconfirm them. They destroy both observations and beliefs in order to make them consistent. They prefer simple causalities, ideas that place causes and effects close to one another and that match big effects with big causes. 
  • The key effort is to link experience with a pre-existing, accepted storyline so as to achieve a subjective sense of understanding.
  • Experience is rooted in a complicated causal system that can be described adequately only by a description that is too complex for the human mind. The more accurately reality is reflected, the less comprehensible the story, and the more comprehensible the story, the less realistic it is.
  • Storytellers have their individual sources and biases, but they have to gain acceptance of their stories by catering to their audiences.
  • Despite the complications in extracting reality from experience, or perhaps because of them, there is a tendency for the themes of stories of management to converge over time.
  • Organizational stories and models are built particularly around four main mythic themes: rationality (the idea that the human spirit finds definitive expression through taking and justifying action in terms of its future consequences for prior values); hierarchy (the idea that problems and actions can be decomposed into nested sets of subproblems and sub-actions such that interactions among them can be organized within a hierarchy); individual leader significance (the idea that any story of history must be related to a human project in order to be meaningful and that organizational history is produced by the intentions of specific human leaders); and historical efficiency (the idea that history follows a path leading to a unique equilibrium defined by antecedent conditions and produced by competition).
  • Underlying many of these myths is a grand myth of human significance: the idea that humans can, through their individual and collective intelligent actions, influence the course of history to their advantage.
  • The myth of human significance produces the cruelties and generosities stemming from the human inclination to assign credit and blame for events to human intention.
  • There is an overwhelming tendency in American life to lionize or pillory the people who stand at the helms of our large institutions, to offer praise or level blame for outcomes over which they may have little control.
  • An experienced scholar is less inclined to claim originality than is a beginner.
  • …processes of adaptation can eliminate sources of error but are inefficient in doing so.
  • Knowledge is lost through turnover, forgetting, and misfiling, which assure that at any point there is considerable ignorance. Something that was once known is no longer known. In addition, knowledge is lost through its incomplete accessibility.
  • A history of success leads managers to a systematic overestimation of the prospects for success in novel endeavors. If managers attribute their successes to talent when those successes are, in fact, a joint consequence of talent and good fortune, successful managers will come to believe that they can beat the odds in the future as they apparently have in the past.
  • In a competitive world of promises, winning projects are systematically projects in which hopes exceed reality.
  • The history of organizations cycling between centralization and decentralization is a tribute, in part, to the engineering difficulty of finding an enduring balance between the short-run and local costs and the long-run and more global benefits of boundaries.
  • The vividness of direct experience leads learners to exaggerate the information content of personal experience relative to other information.
  • The ambiguities of experience take many forms but can be summarized in terms of five attributes: 1) the causal structure of experience is complex; 2) experience is noisy; 3) history includes numerous examples of endogeneity, cases in which the properties of the world are affected by actions adapting to it; 4) history as it is known is constructed by participants and observers; 5) history is miserly in providing experience. It offers only small samples and thus large sampling error in the inferences formed (see the short simulation after this list).
  • Experience often appears to increase significantly the confidence of successful managers in their capabilities without greatly expanding their understanding.
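
March’s point about small samples can be made concrete with a few lines of simulation. The sketch below is my own illustration rather than anything from the book; the 0.5 “true” success rate and the sample sizes are arbitrary assumptions. It draws several “histories” of different sizes from the same underlying reality and shows how wildly the small ones deviate from it.

```python
import random

random.seed(42)

# Hypothetical "underlying reality": the true success rate of some venture.
TRUE_RATE = 0.5

def observed_rate(n_trials: int) -> float:
    """Estimate the success rate from one sampled 'history' of n_trials outcomes."""
    successes = sum(random.random() < TRUE_RATE for _ in range(n_trials))
    return successes / n_trials

# History is "miserly": organizations often see only a handful of cases.
# Small histories scatter widely around the truth; large ones converge on it.
for n in (5, 20, 1000):
    estimates = [round(observed_rate(n), 2) for _ in range(3)]
    print(f"n = {n:4d} -> three sampled histories estimate: {estimates}")
```

With only five observations, the estimated rate routinely lands far from 0.5; with a thousand, it barely moves. Realized history deviating from underlying reality is, in small samples, the default outcome.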

Still interested? Want to know more? Buy the book. Read Does Experience Make You an Expert? next.

Max Bazerman — You Are Not As Ethical As You Think

Ethical infractions are rooted in the intricacies of human psychology rather than in a lack of integrity.

Max Bazerman’s book Blind Spots will certainly make you think about your own actions more objectively.

Briefly, here are some of my takeaways.

  • We engage in behavioral forecasting errors. We believe we will behave a certain way in a certain situation. Yet, when actually faced with that situation we behave differently.
  • We are experts at deflecting blame and rationalizing our behavior in a positive light. A used car salesman can view himself as ethical despite selling someone a car that leaks oil, by noting the buyer failed to ask the right questions (bias from self-interest).
  • People often judge the ethicality of actions based on the outcome (outcome bias). We tend to be far more concerned with and show more sympathy when the actions taken affect “identifiable victims”.
  • Motivated blindness (when one party has an interest in overlooking the unethical behavior of another party) explains the financial crisis (bias from self-interest).
  • Research finds that cognitively busy people are more likely to cheat on a task than those who are less overloaded. Why? Because it takes cognitive effort to be reflective enough to resist the impulse to cheat. Our brains are predisposed to make quick decisions, and in the process they can fail to consider outside influences (such as ethical concerns). We also behave differently when facing a loss than when facing a gain: we’re more willing to cheat when we’re trying to avoid a loss.
  • Snap decisions are especially prone to unconscious bias. The less time we have to think, the more likely we are to default to in-group preference (racial stereotypes). When instructed to shoot “criminals” and not unarmed citizens, participants in one study incorrectly shot more black men than white men.
  • Research shows that most people view their own input into a group, their division’s input to the overall organization, and their firm’s contributions to a strategic alliance as more important and substantial than reality can sustain. Over-claiming this credit is at least partly rooted in our bounded ethicality. That is, we exclude important and relevant information from our decisions by placing arbitrary and functional bounds around our definition of a problem (normally in a self-serving manner). This is part of the reason we fail to see eye to eye in disagreements — we pay attention to different data.
  • The difference in the way information is processed is often not intentional. Confirmation bias helps our minds absorb information that agrees with our beliefs and discount information that may contradict them. (We can’t remember our previous intentions either; see How Our Brains Make Memories.)
  • Egocentrism is dangerous when playing a Tragedy of the Commons game (social dilemma), such as the ones we’re currently playing with debt and the environment, because it encourages us to over-claim resources.
  • In the end the kindergarten rule of fairness applies: one person cuts the cookie and the other has first pick on which half to eat.
  • In social dilemmas the easiest strategy is to defect.
  • A whole host of societal problems result from our tendency to apply an extremely high discount rate to the future. One result is that we save far too little for retirement (a quick numerical illustration follows this list). Over-discounting the future can be immoral, too, as it robs future generations of opportunities and resources.
  • Compliance programs often include sanctioning systems that attempt to discourage unethical behavior, typically through punishment. Yet these programs often have the reverse effect, encouraging the behavior they are supposed to discourage. Why? In short, because they remove the ethical consideration and turn it into a business decision. (The number of late pick-ups at daycares increases when there is a fine.)
  • When your informal culture doesn’t line up with your formal culture, you have blind spots, and employees will follow the informal culture.
  • Of course, we’re overconfident, so informing us about our blind spots doesn’t seem to help us make better choices. We tend to believe that while others may fall prey to psychological biases, we don’t. Left to our own devices, we dramatically understate the degree to which our own behavior is affected by incentives and situational factors.
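
To see what an “extremely high discount rate” does to the retirement-savings point above, here is a minimal numerical sketch. It is my own illustration, not from the book; the dollar amount, the rates, and the 30-year horizon are arbitrary assumptions plugged into the standard present-value formula.

```python
# Present value of a future amount, discounted at annual rate r over t years:
#   PV = FV / (1 + r) ** t
def present_value(future_value: float, rate: float, years: int) -> float:
    return future_value / (1 + rate) ** years

# Hypothetical numbers: $100,000 available at retirement, 30 years away.
for rate in (0.03, 0.10, 0.30):
    pv = present_value(100_000, rate, 30)
    print(f"at a {rate:.0%} discount rate, $100,000 in 30 years "
          f"is worth about ${pv:,.0f} today")
```

At 3 percent, money 30 years out still feels substantial today; at 30 percent it feels worth almost nothing, which is roughly how an over-discounting mind treats its own retirement.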

***

Still curious? Check out Blind Spots. This book will help you see how your biases lead to your own immoral actions. And if you’re still curious try: Bounded Ethicality: The Perils of Loss Framing.

Lessons of the Past

The tendency to relate contemporary events to earlier events as a guide to understanding is a powerful one. The difficulty, of course, lies in being certain that two situations are truly comparable. That they are similar in some respects does not assure us that they are similar in all respects.

***

When we try to understand contemporary events, we often relate them to ones from the past. We do this within the context of our own lives and within the context of human history as a whole. We try to learn from our own mistakes and those of our ancestors.

The issue is that it can be hard to be sure if what is happening now and what happened before are truly comparable. They may be similar in some respects, but we cannot know if they are similar in every meaningful way.

The way we set policy is often flawed. The underrated historian Ernest May argues that we attempt to avoid the mistakes of previous generations by pursuing policies that would have made sense in the past but do not today. May wrote that lawmakers reason by analogy with history, but they tend to seize upon the first analogy that comes to mind without considering differences, and they reject disconfirming evidence. In his book Lessons of the Past, he traces the impact of historical analogy on US foreign policy.

He found that because of reasoning by analogy, US policymakers tend to be one generation behind, determined to avoid the mistakes of the previous generation. They pursue the policies that would have been most appropriate in the historical situation but are not necessarily well adapted to the current one.

Policymakers in the 1930s, for instance, viewed the international situation as analogous to that before World War I. Consequently, they followed a policy of isolation that would have been appropriate for preventing American involvement in the First World War but failed to prevent the Second. Communist aggression after World War II was likewise seen as analogous to Nazi aggression, leading to a policy of containment that might have prevented World War II had it been applied a decade earlier.

The Vietnam analogy has been used repeatedly over many years to argue against an activist US foreign policy. For example, some used it to argue against US participation in the Gulf War. That analogy was flawed: the terrain over which battles were fought in Kuwait and Iraq was completely different from Vietnam’s and far more favorable to US forces.

May argues that policymakers often perceive problems in terms of analogies with the past, but that they ordinarily use history badly: When resorting to an analogy, they tend to seize upon the first that comes to mind. They do not research more widely. Nor do they pause to analyze the case, test its fitness, or even ask in what ways it might be misleading.

Information Without Context

Information without context is falsely empowering and incredibly dangerous.

As an adult, have you ever picked up a child’s shape-sorter and tried to put the square item through the round hole? Of course not. Adults know better — or at least we’re supposed to. Yet we often take square solutions and cram them into round problems.

Consider, for example, a project that falls behind schedule. A project manager is apt to adopt whatever solution worked the last time a project was falling behind. If adding more people produced a successful outcome last time, why not do it again? Our tendency to stick with what has worked in the past, regardless of why it worked, creates a powerful illusion that we are solving the problem or doing the right thing.

When posed a difficult question by an informed reporter, politicians often answer something related but simpler. The politician treats what should be a complex topic as something black and white and portrays the topic as simpler than it really is (reductive bias). In the corporate world we do the same thing when we take something that worked previously (or somewhere else) and blindly apply it to the next problem without giving due consideration to why it worked.

Maybe we’re just becoming an intellectually lazy society, constantly looking for the next soundbite from “experts” on how to do something better. We like the easy solution.

In Think Twice, Michael Mauboussin writes: “Consultants, researchers, and practitioners often observe some success, seek common attributes among them and proclaim that those attributes can lead others to succeed. This simply does not work.”

Our brains may be adult, yet we demonstrate a very childlike level of consideration. Decision makers often fail to ask key questions, such as: What’s different about this project? Under which circumstances is adding more people likely to work? Am I doing this because someone else is doing it?

Adopting best practices has become the reason to do something in and of itself. It is, after all, hard to challenge the logic of best practices. But what do best practices mean? Whom are they best for? What makes them successful? Can we replicate them in our company? Our culture? Our circumstances? Do we have the necessary skills? What are the side effects? What are the incentives? … More often than not, we embrace a solution without understanding under which conditions it succeeds or fails.

I think there are some parallels between business decision making and medicine. In medicine, our understanding of the particulars can never be complete: misdiagnosing a patient is common, so doctors look at each patient as a new mystery.

A doctor applying the same thoughtlessness spewed by management consultants might reasonably determine that all people with a fever have a cold. However, we know people are more complex than this simple correlation suggests. Medical practitioners know the difference between correlation and cause. A fever by itself tells the doctor something, but not everything: it could indicate a cold, or it could be something more serious. Doctors, like good decision makers, check the context and seek out information that might disprove their diagnosis.