Tag: Complexity

Imitation Without Understanding Does Not Work

Don’t imitate large firms just because they are large.

Large, prestigious, and successful firms are chosen not only on the assumption that following them will produce better results but also as a way to bring a measure of legitimacy, something that is especially important during times of high uncertainty.

Everything is connected.

[C]ontextualizing is about viewing imitation opportunities not as isolated atoms but as the interrelated parts of the complex system within which they are embedded and that explains, or conditions, their form and outcome. … [I]mitation without understanding the context of the product or service does not work, because it does not take into account necessary adjustments to the key environmental peculiarities that vary between the model and the imitator.

Outcome-based imitation (copying whatever you think produces results) is bound to fail.

A focus on outcomes without an understanding of the means and process to produce them will lead to “blind imitation of the perceived survivors rather than a deliberate effort to construct causal theories about how to thrive”…

This means you can’t do the “grab bag” approach — you can’t just say we’re going to copy ideas from Google, Apple, and Facebook and assume that approach will lead to success. The processes or ideas you’re copying from Google likely have inherent conflicts with your company’s existing processes as well as the ones you’re trying to borrow from Apple.

Let’s say you’ve just been appointed CEO of Hewlett-Packard and the board has asked you to come up with a strategy. It would be easy to come back and say we’re going to focus on three things: (1) design our products better (e.g., Apple); (2) focus on R&D (e.g., IBM); and (3) become the low-cost producer (e.g., Dell). That strategy sounds great, but those things all have inherent conflicts with one another. (Still curious? Read a primer on strategy)

Trying to combine contradictory models/processes is a rudimentary form of imitation and a road towards failure. Part of the management dilemma when trying to imitate others is realizing that your internal systems likely can’t co-exist with the ones you’re trying to copy.

— Excerpts from Copycats: How Smart Companies Use Imitation to Gain a Strategic Edge

When Networks Network

A great article in Science News by Elizabeth Quill on the connectivity and complexity of networks. One striking example is the 2003 power station shutdown in Italy that led to the failure of a communications network, causing another power station to go down and “triggering an electrical blackout affecting much of Italy.”

“When we think about a single network in isolation, we are missing so much of the context,” says Raissa D’Souza, a physicist and engineer at the University of California, Davis. “We are going to make predictions that don’t match real systems.”

Like their single-network counterparts, networks of networks show up everywhere. By waking up in the morning, going to work and using your brain, you are connecting networks. Same when you introduce a family member to a friend or send a message on Facebook that you also broadcast via Twitter. In fact, anytime you access the Internet, which is supported by the power grid, which gets its instructions via communications networks, you are relying on interdependent systems. And if your 401(k) lost value during the recent recession, you’re feeling the effects of such systems gone awry.

Findings so far suggest that networks of networks pose risks of catastrophic danger that can exceed the risks in isolated systems. A seemingly benign disruption can generate rippling negative effects. Those effects can cost millions of dollars, or even billions, when stock markets crash, half of India loses power or an Icelandic volcano spews ash into the sky, shutting down air travel and overwhelming hotels and rental car companies. In other cases, failure within a network of networks can mean the difference between a minor disease outbreak or a pandemic, a foiled terrorist attack or one that kills thousands of people.

Understanding these life-and-death scenarios means abandoning some well-established ideas developed from single-network studies. Scientists now know that networks of networks don’t always behave the way single networks do.

Simplicity’s Best Friend: Small Groups of Smart People

More brains don’t necessarily lead to better ideas. When it came to leading meetings, Jobs had no qualms about tossing the least necessary person out of the room.

An excerpt from Insanely Simple: The Obsession That Drives Apple’s Success:

One particular day, there appeared in our midst a woman from Apple with whom I was unfamiliar. I don’t recall her name, as she never appeared in our world again, so for the purposes of this tale, I’ll call her Lorrie. She took her seat with the rest of us as Steve breezed into the boardroom, right on time. Steve was in a sociable mood, so we chatted it up for a few minutes, and then the meeting began. “Before we start, let me just update you on a few things,” said Steve, his eyes surveying the room. “First off, let’s talk about iMac–” He stopped cold. His eyes locked on to the one thing in the room that didn’t look right. Pointing to Lorrie, he said, “Who are you?”

Lorrie was a bit stunned to be called out like that, but she calmly explained that she’d been asked to attend because she was involved with some of the marketing projects we’d be discussing. Steve heard it. Processed it. Then he hit her with the Simple Stick. “I don’t think we need you in this meeting, Lorrie. Thanks,” he said. Then, as if that diversion had never occurred–and as if Lorrie never existed–he continued with his update. So, just as the meeting started, in front of eight or so people whom Steve did want to see at the table, poor Lorrie had to pack up her belongings, rise from her chair, and take the long walk across the room toward the door. Her crime: She had nothing to add.


Start with small groups of smart people–and keep them small. Every time the body count goes higher, you’re simply inviting complexity to take a seat at the table. The small-group principle is deeply woven into the religion of Simplicity. It’s key to Apple’s ongoing success and key to any organization that wants to nurture quality thinking. The idea is pretty basic: Everyone in the room should be there for a reason. There’s no such thing as a “mercy invitation.” Either you’re critical to the meeting or you’re not. It’s nothing personal, just business.

Steve Jobs actively resisted any behavior he believed representative of the way big companies think–even though Apple had been a big company for many years. He knew that small groups composed of the smartest and most creative people had propelled Apple to its amazing success, and he had no intention of ever changing that. When he called a meeting or reported to a meeting, his expectation was that everyone in the room would be an essential participant. Spectators were not welcome.

This was based on the somewhat obvious idea that a smaller group would be more focused and motivated than a large group, and smarter people will do higher quality work. For a principle that would seem to be common sense, it’s surprising how many organizations fail to observe it. How many overpopulated meetings do you sit through during the course of a year? How many of those meetings get sidetracked or lose focus in a way that would never occur if the group were half the size? The small-group rule requires enforcement, but it’s worth the cost.

Remember, complexity normally offers the easy way out. It’s easier to remain silent and let the Lorries of the world take their seats at the table, and most of us are too mannerly to perform a public ejection. But if you don’t act to keep the group small, you’re creating an exception to the rule–and Simplicity is never achieved through exceptions. Truthfully, you can do the brutal thing without being brutal. Just explain your reasons. Keep the group small.

Prior to working with Steve Jobs, I worked with a number of more traditional big companies. So it was a shock to my system (in a good way) when I entered Steve’s world of Simplicity. In Apple’s culture, progress was much easier to attain. It was also a shock to my system (in a bad way) when I left Steve’s world and found myself suffering through the same old issues with more traditional organizations again.

And

Out in the real world, when I talk about small groups of smart people, I rarely get any pushback. That’s because common sense tells us it’s the right way to go. Most people know from experience that the fastest way to lose focus, squander valuable time, and water down great ideas is to entrust them to a larger group. Just as we know that there is equal danger in putting ideas at the mercy of a large group of approvers.

One reason why large, unwieldy groups tend to be created in many companies is that the culture of a company is bigger than any one person. It’s hard to change “the way we do things here.” This is where the zealots of Simplicity need to step in and overcome the inertia. One must be judicious and realistic about applying the small-group principle. Simply making groups smaller will obviously not solve all problems, and “small” is a relative term. Only you know your business and the nature of your projects, so only you can draw the line between too few people and too many. You need to be the enforcer and be prepared to hit the process with the Simple Stick when the group is threatened with unnecessary expansion.

Still curious? Learn more about Apple’s culture by reading Insanely Simple: The Obsession That Drives Apple’s Success.


Susan Sontag: Aphorisms and the Commodification of Wisdom

A brilliant post from Brain Pickings drawing our attention to Susan Sontag and the commodification of wisdom.

As the interconnectedness and velocity of information continue to grow, these passages from Sontag’s As Consciousness Is Harnessed to Flesh speak to our desire to reduce complexity into soundbites. Soundbites, however, are designed to discourage critical thinking; we’re expected to get it and move on. But that’s not the way the world works. Simplifying complexity prevents informed conversations.

April 26, 1980

Aphorisms are rogue ideas.

Aphorism is aristocratic thinking: this is all the aristocrat is willing to tell you; he thinks you should get it fast, without spelling out all the details. Aphoristic thinking constructs thinking as an obstacle race: the reader is expected to get it fast, and move on. An aphorism is not an argument; it is too well-bred for that.

To write aphorisms is to assume a mask — a mask of scorn, of superiority. Which, in one great tradition, conceals (shapes) the aphorist’s secret pursuit of spiritual salvation. The paradoxes of salvation. We know at the end, when the aphorist’s amoral, light point-of-view self-destructs.

Ten days later she added:

One wonders why. Can it be that the literature of aphorisms teaches us the sameness of wisdom (as anthropology teaches us the diversity of culture)? The wisdom of pessimism. Or should we rather conclude that the form of the aphorism, of abbreviated or condensed or rogue thought, is a historically-colored voice which, when adopted, inevitably suggests certain attitudes; is the vehicle of a common thematics?

Aphoristic thinking is impatient thinking …

Follow your curiosity and check out three steps to refuting any argument and order a copy of As Consciousness Is Harnessed to Flesh: Journals and Notebooks, 1964-1980.

The Difference Between a Puzzle and a Mystery?

An eloquent explanation on the difference between mysteries and puzzles by Gregory Treverton:

There’s a reason millions of people try to solve crossword puzzles each day. Amid the well-ordered combat between a puzzler’s mind and the blank boxes waiting to be filled, there is satisfaction along with frustration. Even when you can’t find the right answer, you know it exists. Puzzles can be solved; they have answers.

But a mystery offers no such comfort. It poses a question that has no definitive answer because the answer is contingent; it depends on a future interaction of many factors, known and unknown. A mystery cannot be answered; it can only be framed, by identifying the critical factors and applying some sense of how they have interacted in the past and might interact in the future. A mystery is an attempt to define ambiguities.

We may like puzzles better, but the world increasingly offers mysteries.

In a 2007 New Yorker article, “Open Secrets,” Malcolm Gladwell expands on the distinction:

The problem of what would happen in Iraq after the toppling of Saddam Hussein was, by contrast, a mystery. It wasn’t a question that had a simple, factual answer. Mysteries require judgments and the assessment of uncertainty, and the hard part is not that we have too little information but that we have too much.

Gladwell goes on to show how understanding the difference between puzzles and mysteries can lead us to interpret the same facts differently. “If you sat through the trial of Jeffrey Skilling,” Gladwell writes, “you’d think that the Enron scandal was a puzzle.” But Enron wasn’t really a puzzle; it was a mystery.

In his book, Boombustology, Vikram Mansharamani sums up the Enron situation:

…the truth about Enron’s transactions was openly revealed in public filings and all it took was a diligent Wall Street Journal reporter to unveil the issues at hand. The needed capability was not the ability to find particular information, but rather the skill to assemble disparate data points into a clear image of the whole. The problem is not one of inadequate information, but instead one of too much information overwhelming the processing capabilities…

Mental Model: Conjunctive and Disjunctive Events Bias

Welcome to the conjunctive and disjunctive events bias.

Why are we so optimistic in our estimation of a project’s cost and schedule? Why are we so surprised when something inevitably goes wrong? Because of the human tendency to underestimate disjunctive events.

According to Daniel Kahneman and his long-time co-author Amos Tversky (1974): “A complex system, such as a nuclear reactor or the human body, will malfunction if any of its essential components fails.” They continue, “Even when the likelihood of failure in each component is slight, the probability of an overall failure can be high if many components are involved.”

In Seeking Wisdom: From Darwin to Munger, Peter Bevelin writes:

A project is composed of a series of steps where all must be achieved for success. Each individual step has some probability of failure. We often underestimate the large number of things that may happen in the future or all opportunities for failure that may cause a project to go wrong. Humans make mistakes, equipment fails, technologies don’t work as planned, unrealistic expectations, biases including sunk cost-syndrome, inexperience, wrong incentives, changing requirements, random events, ignoring early warning signals are reasons for delays, cost overruns and mistakes. Often we focus too much on the specific base project case and ignore what normally happens in similar situations (base rate frequency of outcomes—personal and others). Why should some project be any different from the long-term record of similar ones? George Bernard Shaw said: “We learn from history that man can never learn anything from history.”

The more independent steps that are involved in achieving a scenario, the more opportunities for failure and the less likely it is that the scenario will happen. We often underestimate the number of steps, people, and decisions involved.

Add to this that we often forget that the reliability of a system is a function of the whole system. The weakest link sets the upper limit for the whole chain.
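A back-of-the-envelope calculation shows how quickly this bites (a minimal sketch with illustrative step counts and success rates, not figures from the book):

```python
# Probability that a multi-step project succeeds when every independent step must succeed.
# Illustrative numbers only; the overall probability can never exceed that of the weakest step.

def project_success(p_step: float, n_steps: int) -> float:
    """Probability that all n independent steps succeed."""
    return p_step ** n_steps

for n in (5, 10, 20, 50):
    print(f"{n:3d} steps at 95% each -> {project_success(0.95, n):.0%} chance of overall success")
#   5 steps at 95% each -> 77% chance of overall success
#  10 steps at 95% each -> 60% chance of overall success
#  20 steps at 95% each -> 36% chance of overall success
#  50 steps at 95% each -> 8% chance of overall success
```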

* * *

In “Thinking, Fast and Slow,” Kahneman offers the following example:

Linda is thirty-one years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with the issues of discrimination and social justice, and also participated in antinuclear demonstrations.

Which alternative is most probable:
Linda is a bank teller.
Linda is a bank teller and is active in the feminist movement.

…About 85% to 95% of undergraduates at several major universities chose the second option, contrary to logic.

The word fallacy is used, in general, when people fail to apply a logical rule that is obviously relevant. Amos and I introduced the idea of a conjunction fallacy, which people commit when they judge a conjunction of two events to be more probable than one of the events in a direct comparison.

…The judgments of probability that our respondents offered (in the Linda problem) corresponded precisely to the judgments of representativeness (similarity to stereotypes). Representativeness belongs to a cluster of closely related basic assessments that are likely to be generated together. The most representative outcomes combine with the personality description to produce the most coherent stories. The most coherent stories are not necessarily the most probable, but they are plausible, and the notions of coherence, plausibility, and probability are easily confused by the unwary.
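The logic being violated is the conjunction rule: a conjunction can never be more probable than either of its parts, since P(A and B) = P(A) × P(B given A) ≤ P(A). A minimal sketch with hypothetical numbers (not from the study) makes the point:

```python
# Conjunction rule: P(A and B) can never exceed P(A), however plausible B sounds.
# The probabilities below are hypothetical, chosen only to illustrate the rule.
p_teller = 0.05                   # P(Linda is a bank teller)
p_feminist_given_teller = 0.60    # P(Linda is an active feminist, given she is a teller)

p_teller_and_feminist = p_teller * p_feminist_given_teller   # 0.03

assert p_teller_and_feminist <= p_teller
print(f"P(teller) = {p_teller:.2f}, P(teller and feminist) = {p_teller_and_feminist:.2f}")
```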

* * *

From Max Bazerman’s Judgment in Managerial Decision Making:

Which of the following instances appears most likely? Which appears second most likely?
A. Drawing a red marble from a bag containing 50 percent red marbles and 50 percent white marbles.
B. Drawing a red marble seven times in succession, with replacement (i.e., a selected marble is put back into the bag before the next marble is selected), from a bag containing 90 percent red marbles and 10 percent white marbles.
C. Drawing at least one red marble in seven tries, with replacement, from a bag containing 10 percent red marbles and 90 percent white marbles.

The most common ordering of preference is B-A-C. Interestingly, the correct order of likelihood is C (52%), A (50%), and B (48%). This result illustrates a general bias to overestimate the probability of conjunctive events, or events that must occur in conjunction with one another and to underestimate the probability of disjunctive events, or events that occur independently. Thus, when multiple events all need to occur (choice B) we overestimate the true likelihood of this happening, while if only one of many events needs to occur (choice C), we underestimate the true likelihood of this occurring.
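The arithmetic behind those figures is easy to verify (a quick check of the numbers above, not code from the book):

```python
# Verifying the three marble probabilities from Bazerman's example.
p_a = 0.5               # A: one draw from a 50% red / 50% white bag
p_b = 0.9 ** 7          # B: seven reds in a row from a 90%-red bag (conjunctive)
p_c = 1 - 0.9 ** 7      # C: at least one red in seven tries from a 10%-red bag (disjunctive)

print(f"A: {p_a:.0%}  B: {p_b:.0%}  C: {p_c:.0%}")   # A: 50%  B: 48%  C: 52%
```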

The overestimation of conjunctive events offers a powerful explanation for the problems that typically occur with projects that require multistage planning. Individuals, businesses, and governments frequently fall victim to the conjunctive-events bias in terms of timing and budgets. Home remodelling, new product ventures, and public works projects seldom finish on time.

* * *

Astronomy Professor Carl Sagan said in Carl Sagan: A Life in the Cosmos: “The Chernobyl and Challenger disasters remind us that the highly visible technological systems in which enormous national prestige had been invested can nevertheless experience catastrophic failure.”

Safety is a function of the total system – that is, the interaction of all of the components. If only one component fails (and it need not be a key one), the system may fail. Assume an airplane has 2,000 independent parts or systems, each designed to have a working probability of 99.9%. All the parts need to work together for the airplane to work; even so, the probability that at least one of the parts fails (causing the plane to malfunction) is about 86%.
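A quick check of that arithmetic, using the same illustrative part count and per-part reliability:

```python
# Failure probability of a system with many independent parts that all must work.
n_parts = 2000
p_part_works = 0.999                      # 99.9% reliability per part (illustrative)

p_system_works = p_part_works ** n_parts  # ~13.5%
p_any_failure = 1 - p_system_works        # ~86.5%

print(f"P(system works)         = {p_system_works:.1%}")
print(f"P(at least one failure) = {p_any_failure:.1%}")
```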

* * *

From “Judgment under Uncertainty: Heuristics and Biases,” by Amos Tversky and Daniel Kahneman (1974):

Studies of choice among gambles and of judgments of probability indicate that people tend to overestimate the probability of conjunctive events (Cohen, Chesnick, and Haran, 1972, 24) and to underestimate the probability of disjunctive events. These biases are readily explained as effects of anchoring.

The stated probability of the elementary event (success at any one stage) provides a natural starting point for the estimation of the probabilities of both conjunctive and disjunctive events. Since adjustment from the starting point is typically insufficient, the final estimates remain too close to the probabilities of the elementary events in both cases.

Note the overall probability of a conjunctive event is lower than the probability of each elementary event, whereas the overall probability of a disjunctive event is higher than the probability of each elementary event. As a consequence of anchoring, the overall probability will be overestimated in conjunctive problems and underestimated in disjunctive problems.

Biases in the evaluation of compound events are particularly significant in the context of planning. The successful completion of an undertaking, such as the development of a new product or thesis, typically has a conjunctive character: for the undertaking to succeed, each of a series of events must occur. Even when each of these events is very likely, the overall probability of success can be quite low if the number of events is large.

The general tendency to overestimate the probability of conjunctive events leads to unwarranted optimism in the evaluation of the likelihood that a plan will succeed or that a project will be completed on time. Conversely, disjunctive structures are typically encountered in the evaluation of risks. A complex system, such as a nuclear reactor or the human body, will malfunction if any of its essential components fails. Even when the likelihood of failure in each component is slight, the probability of an overall failure can be high if many components are involved. Because of anchoring, people will tend to underestimate the probabilities of failure in complex systems. Thus, the direction of the anchoring bias can sometimes be inferred from the structure of the event. The chain-like structure of conjunctions leads to overestimation; the funnel-like structure of disjunctions leads to underestimation.
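A small numeric illustration of that last point (illustrative figures, not from the paper): an estimate anchored near the elementary probability ends up too high for the conjunctive event and too low for the disjunctive one.

```python
# Direction of the anchoring bias for compound events (illustrative numbers only).
p_success = 0.95            # each of n stages succeeds with this probability
p_failure = 1 - p_success   # each of n components fails with this probability
n = 10

p_all_succeed = p_success ** n              # conjunctive: ~0.60, far below the 0.95 anchor
p_any_failure = 1 - (1 - p_failure) ** n    # disjunctive: ~0.40, far above the 0.05 anchor

print(f"P(all {n} stages succeed)      = {p_all_succeed:.2f}")   # anchoring near 0.95 overestimates this
print(f"P(any of {n} components fails) = {p_any_failure:.2f}")   # anchoring near 0.05 underestimates this
```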

* * *

Biases in the evaluation of compound events are particularly significant in the context of planning. Any complex undertaking has the character of a conjunctive event: lots of things have to click into place in order for the whole thing to work. Even when the probability of each individual event is very likely, the overall probability can be very low. People in general way overestimate the probability of the conjunctive event, leading to massive time and cost overruns in real projects.

Conversely, disjunctive structures are typically encountered in the evaluation of risks. A complex system, such as a nuclear reactor or a human body, will malfunction if just one key component fails. Even if the probability of failure of any one event is very low, the overall probability of some event going wrong is very high. People always underestimate the probability of complex systems, like the Challenger, going wrong. (source)

Conjunctive and Disjunctive Events Bias is part of the Farnam Street Latticework of Mental Models.