
Poaching Stars is a Terrible Idea to Improve Performance

In an effort to improve performance, we often turn to the simple answer of trying to hire a star from another organization. This sounds like a great idea, is hard to argue with, and offers the promise of an instant performance boost.

In practice, most of the benefits turn out to be illusory.

The question is why?

One reason is that we think of the person as an isolated system when in reality they are not. The surrounding team, culture, and environment can amplify their success.

In his wonderful book, Think Twice: Harnessing the Power of Counterintuition, Michael Mauboussin explains:

A star’s performance relied to some degree on the people, structure, and norms around him—the system. Analyzing results requires sorting the relative contributions of the individual versus the system, something we are not particularly good at. When we err, we tend to overstate the role of the individual.

This mistake is consequential because organizations routinely pay big bucks to lure high performers, only to be sorely disappointed. In one study, a trio of professors from Harvard Business School tracked more than one thousand acclaimed equity analysts over a decade and monitored how their performance changed as they switched firms. Their dour conclusion: “When a company hires a star, the star’s performance plunges, there is a sharp decline in the functioning of the group or team the person works with, and the company’s market value falls.” The hiring organization is let down because it failed to consider systems-based advantages that the prior employer supplied, including firm reputation and resources. Employers also underestimate the relationships that supported previous success, the quality of the other employees, and a familiarity with past processes.

What’s happening is a common mistake: we’re focusing on an isolated part of a complex adaptive system without understanding how that part contributes to the overall system dynamics.

For more information, read the Harvard Business Review article: The Risky Business of Hiring Stars.

Five Must-Reads for Tackling Complex Problems

Ted Cadsby writes “the following five books are a small sample from a longer list of must-reads, but they have two things in common. First, they forced me to confront how superficial and inadequate my thinking was in assessing different kinds of complex problems. Second, they took the important next step of introducing more sophisticated approaches to tackling complexity, which I have been using ever since.”

The Black Swan, by Nassim Nicholas Taleb

…Like any outstanding book, the scope and depth of its ideas cannot be fairly summarized, but his central argument is that we live in two worlds. The first world can be described by basic statistical analysis and a common-sense version of cause-effect relationships; it is a world in which we can make fairly accurate predictions. But the second world behaves in ways that cannot be described in the same straightforward manner, and is not amenable to reliable predictions. We are typically blind to this second world because we force-fit our basic intuitions onto it, based on the naïve assumption that we can understand it the way we understand the first world.

Expert Political Judgment, by Philip Tetlock

…While many of our day-to-day predictions are dependable, an increasing number are not, because they are pitted against increasing complexity in our lives. Tetlock has studied how poor our forecasts are when it comes to making predictions in the domain of economics and politics. His research reveals, in highly analytic and rigorous detail, the ineptitude of the “experts” — in fact he shows that the more expert someone is, the less reliable their predictions tend to be.

The Fifth Discipline, by Peter Senge

…Although Senge’s book was first published over 20 years ago, it remains one of the best explanations of the “systems thinking” approach to analyzing problems. Senge shows how the complex aspects of the world and our lives are much more productively described as systems than as linear cause-and-effect relationships — better as multiple causal factors that influence each other through intricate feedback loops that generate behaviors that are not straightforward.

Emotional Intelligence, by Daniel Goleman

Building on the work of the neuroscientists who pioneered this field, he uncovers one of our most significant cognitive frailties — poor management of emotion — and explores methods of mitigating it. Like Senge’s book, Goleman’s initial edition goes back a number of years; but also like Senge’s, it not only is still current, it is still one of the best overviews of this topic.

The Halo Effect, by Phil Rosenzweig

…brilliantly reveals the flaws in just about every best-selling strategy book of the past three decades. Second, and more importantly, it reveals just how skeptical and sharp-minded today’s business leaders must be in order to avoid falling victim to the latest and greatest guru thinking. Rosenzweig exposes how convincing but faulty the logic is of the brightest and most popular business consultants. Reading his deconstruction of their research and arguments is shocking but liberating — in much the same way that a child experiences the revelation that there is no Tooth Fairy or that magic tricks are just illusions. The book excels at revealing a lesson that cannot be repeated enough: The most persuasive and researched arguments are often the most specious.

Footnotes
  1. Source: http://blogs.hbr.org/cs/2011/10/five_must-reads_for_tackling_c.html


Suppressing Volatility Makes the World Less Predictable and More Dangerous

I recommend reading Nassim Taleb’s recent article (PDF) in Foreign Affairs. It’s the ultimate example of iatrogenics by the fragilista.

If you don’t have time, here are my notes:

  • Complex systems that have artificially suppressed volatility tend to become extremely fragile, while at the same time exhibiting no visible risks.
  • Seeking to restrict variability seems to be good policy (who does not prefer stability to chaos?), so it is with very good intentions that policymakers unwittingly increase the risk of major blowups.
  • Because policymakers believed it was better to do something than to do nothing, they felt obligated to heal the economy rather than wait and see if it healed on its own.
  • Those who seek to prevent volatility on the grounds that any and all bumps in the road must be avoided paradoxically increase the probability that a tail risk will cause a major explosion. Consider as a thought experiment a man placed in an artificially sterilized environment for a decade and then invited to take a ride on a crowded subway; he would be expected to die quickly.
  • But although these controls might work in some rare situations, the long-term effect of any such system is an eventual and extremely costly blowup whose cleanup costs can far exceed the benefits accrued.
  • … Government interventions are laden with unintended—and unforeseen—consequences, particularly in complex systems, so humans must work with nature by tolerating systems that absorb human imperfections rather than seek to change them.
  • Although it is morally satisfying, the film (Inside Job) naively overlooks the fact that humans have always been dishonest and regulators have always been behind the curve.
  • Humans must try to resist the illusion of control: just as foreign policy should be intelligence-proof (it should minimize its reliance on the competence of information-gathering organizations and the predictions of “experts” in what are inherently unpredictable domains), the economy should be regulator-proof, given that some regulations simply make the system itself more fragile.
  • The “turkey problem” occurs when a naive analysis of stability is derived from the absence of past variations.
  • Imagine someone who keeps adding sand to a sand pile without any visible consequence, until suddenly the entire pile crumbles. It would be foolish to blame the collapse on the last grain of sand rather than the structure of the pile, but that is what people do consistently, and that is the policy error. (A minimal simulation of this dynamic follows these notes.)
  • As with a crumbling sand pile, it would be foolish to attribute the collapse of a fragile bridge to the last truck that crossed it, and even more foolish to try to predict in advance which truck might bring it down.
  • Obama’s mistake illustrates the illusion of local causal chains—that is, confusing catalysts for causes and assuming that one can know which catalyst will produce which effect.
  • Governments are wasting billions of dollars on attempting to predict events that are produced by interdependent systems and are therefore not statistically understandable at the individual level.
  • Most explanations being offered for the current turmoil in the Middle East follow the “catalysts as causes” confusion. The riots in Tunisia and Egypt were initially attributed to rising commodity prices, not to stifling and unpopular dictatorships.
  • Again, the focus is wrong even if the logic is comforting. It is the system and its fragility, not events, that must be studied—what physicists call “percolation theory,” in which the properties of the terrain are studied rather than those of a single element of the terrain.
  • Humans fear randomness—a healthy ancestral trait inherited from a different environment. Whereas in the past, which was a more linear world, this trait enhanced fitness and increased chances of survival, it can have the reverse effect in today’s complex world, making volatility take the shape of nasty Black Swans hiding behind deceptive periods of “great moderation.”
  • But alongside the “catalysts as causes” confusion sit two mental biases: the illusion of control and the action bias (the illusion that doing something is always better than doing nothing). This leads to the desire to impose man-made solutions. Greenspan’s actions were harmful, but it would have been hard to justify inaction in a democracy where the incentive is to always promise a better outcome than the other guy, regardless of the actual delayed cost.
  • As Seneca wrote in De clementia, “repeated punishment, while it crushes the hatred of a few, stirs the hatred of all … just as trees that have been trimmed throw out again countless branches.”
  • The Romans were wise enough to know that only a free man under Roman law could be trusted to engage in a contract; by extension, only a free people can be trusted to abide by a treaty.
  • As Jean-Jacques Rousseau put it, “A little bit of agitation gives motivation to the soul, and what really makes the species prosper is not peace so much as freedom.” With freedom comes some unpredictable fluctuation. This is one of life’s packages: there is no freedom without noise—and no stability without volatility.
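
To make the sand-pile point concrete, here is a minimal sketch of the classic Bak–Tang–Wiesenfeld sandpile model (my own illustration in Python, not something from Taleb’s article; the grid size, threshold, and names are arbitrary choices). Grains are added one at a time; most additions change nothing visible, yet every so often a single grain sets off a cascade whose size is determined by the structure of the whole pile rather than by that last grain.

```python
# Minimal Bak-Tang-Wiesenfeld sandpile: an illustration of the sand-pile
# metaphor above, not anything from Taleb's article. Grid size, threshold,
# and names are arbitrary choices.
import random

SIZE = 20          # the pile lives on a SIZE x SIZE grid
THRESHOLD = 4      # a cell topples once it holds this many grains

grid = [[0] * SIZE for _ in range(SIZE)]

def add_grain():
    """Drop one grain at a random cell and return the size of the avalanche it causes."""
    r, c = random.randrange(SIZE), random.randrange(SIZE)
    grid[r][c] += 1
    toppled = 0
    unstable = [(r, c)]
    while unstable:
        r, c = unstable.pop()
        if grid[r][c] < THRESHOLD:
            continue
        grid[r][c] -= THRESHOLD                      # the cell topples...
        toppled += 1
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < SIZE and 0 <= nc < SIZE:    # grains pushed off the edge are lost
                grid[nr][nc] += 1                    # ...and passes grains to its neighbours,
                unstable.append((nr, nc))            # which may topple in turn
    return toppled

avalanches = [add_grain() for _ in range(20000)]
quiet = sum(1 for a in avalanches if a == 0)
print(f"{quiet} of {len(avalanches)} grains caused no toppling at all")
print(f"largest single avalanche: {max(avalanches)} topples")
```

Run long enough, the pile settles into a state where long quiet stretches and occasional large avalanches coexist, which is why the absence of past variation says little about future stability.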

***

Still curious? Nassim Taleb’s newest book is Antifragile: Things That Gain from Disorder. He is also the author of The Black Swan, Fooled By Randomness, and The Bed of Procrustes.

Tight Coupling and Complexity

From The London Review of Books comes an article on the rise of algorithmic trading:

Systems that are both tightly coupled and highly complex, Perrow argues in Normal Accidents, are inherently dangerous. Crudely put, high complexity in a system means that if something goes wrong it takes time to work out what has happened and to act appropriately. Tight coupling means that one doesn’t have that time. Moreover, he suggests, a tightly coupled system needs centralised management, but a highly complex system can’t be managed effectively in a centralised way because we simply don’t understand it well enough; therefore its organisation must be decentralised. Systems that combine tight coupling with high complexity are an organisational contradiction, Perrow argues: they are ‘a kind of Push-me-pull-you out of the Doctor Dolittle stories (a beast with heads at both ends that wanted to go in both directions at once)’.

Perrow’s theory is just that, a theory. It has never been tested very systematically, and certainly never proved conclusively, but it points us in a necessary direction. When thinking about automated trading, it’s easy to focus too narrowly, either pointing complacently to its undoubted benefits or invoking a sometimes exaggerated fear of out of control computers. Instead, we have to think about financial systems as a whole, desperately hard though that kind of thinking may be. The credit system that failed so spectacularly in 2007-8 is slowly recovering, but governments have not dealt with the systemic flaws that led to the crisis, such as the combination of banks that are too big to be allowed to fail and ‘shadow banks’ (institutions that perform bank-like functions but aren’t banks) that are regulated too weakly. Share trading is another such system: it is less tightly interconnected in Europe than in the United States, but it is drifting in that direction here as well. There has been no full-blown stock-market crisis since October 1987: last May’s events were not on that scale.[*] But as yet we have done little to ensure that there won’t be another.

Continue Reading

I highly recommend reading Normal Accidents and The Logic of Failure: Recognizing And Avoiding Error In Complex Situations.

Adapt: Why Success Always Starts with Failure

I’m half-way through reading Tim Harford’s new book Adapt: Why Success Always Starts with Failure. So far, the book is brilliant.

Anyone who’s wondered why it’s so difficult to find leaders who can provide us with solutions to today’s problems should read this book.

We badly need to believe in the potency of leaders. Our instinctive response, when faced with a complicated challenge, is to look for a leader who will solve it…every president is elected after promising to change the way politics works; and almost every president then slumps in the polls as reality starts to bite. This isn’t because we keep electing the wrong leaders. It is because we have an inflated sense of what leadership can achieve in the modern world.

Perhaps we have this instinct because we evolved to operate in small hunter-gatherer groups, solving small hunter-gatherer problems…. The challenges society faced, however formidable, were simple enough to have been solved by an intelligent, wise, brave leader.

Harford argues this is not a way to solve today’s problems because the world has become mind-bogglingly complicated.

The days of top-down design are coming to a close. Adapt walks us through how any problem, big or small, really gets solved in a world where even something as simple as building a toaster is too complex for one man to do on his own.

The toasting problem, after all, isn’t difficult:

don’t burn the toast; don’t electrocute the user; don’t start a fire. The bread itself is hardly an active protagonist. It doesn’t deliberately try to outwit you, as a team of investment bankers might; it doesn’t try to murder you, terrorise your country, and discredit everything you stand for…The toasting problem is laughably simple compared to the problem of transforming a poor country such as Bangladesh into the kind of economy where toasters are manufactured with ease and every household can afford one, along with the bread to put into it. It is dwarfed by the problem of climate change – the response to which will require much more than modifying a billion toasters.

The complexity of hunting for solutions when the challenges never stop shifting is the problem this book tackles. Harford makes a compelling argument for embracing risk, failure, and experimentation rather than relying on top-down solutions to today’s complex problems.

You can buy the book here.


Taleb: The Fooled by Randomness Effect and the Internet Diet?

In this brief article Nassim Taleb (of Black Swan fame) touches on information, complexity, the randomness effect, over-confidence, and signal and noise. A small numerical sketch of his “Extremistan” idea follows the excerpt.

THE DEGRADATION OF PREDICTABILITY — AND KNOWLEDGE

I used to think that the problem of information is that it turns homo sapiens into fools — we gain disproportionately in confidence, particularly in domains where information is wrapped in a high degree of noise (say, epidemiology, genetics, economics, etc.). So we end up thinking that we know more than we do, which, in economic life, causes foolish risk taking. When I started trading, I went on a news diet and I saw things with more clarity. I also saw how people built too many theories based on sterile news, the fooled by randomness effect. But things are a lot worse. Now I think that, in addition, the supply and spread of information turns the world into Extremistan (a world I describe as one in which random variables are dominated by extremes, with Black Swans playing a large role in them). The Internet, by spreading information, causes an increase in interdependence, the exacerbation of fads (bestsellers like Harry Potter and runs on the banks become planetary). Such world is more “complex”, more moody, much less predictable.

So consider the explosive situation: more information (particularly thanks to the Internet) causes more confidence and illusions of knowledge while degrading predictability.

Look at this current economic crisis that started in 2008: there are about a million persons on the planet who identify themselves in the field of economics. Yet just a handful realized the possibility and depth of what could have taken place and protected themselves from the consequences. At no time in the history of mankind have we lived under so much ignorance (easily measured in terms of forecast errors) coupled with so much intellectual hubris. At no point have we had central bankers missing elementary risk metrics, like debt levels, that even the Babylonians understood well.

I recently talked to a scholar of rare wisdom and erudition, Jon Elster, who upon exploring themes from social science, integrates insights from all authors in the corpus of the past 2500 years, from Cicero and Seneca, to Montaigne and Proust. He showed me how Seneca had a very sophisticated understanding of loss aversion. I felt guilty for the time I spent on the Internet. Upon getting home I found in my mail a volume of posthumous essays by bishop Pierre-Daniel Huet called Huetiana, put together by his admirers c. 1722. It is so saddening to realize that, being born close to four centuries after Huet, and having done most of my reading with material written after his death, I am not much more advanced in wisdom than he was — moderns at the upper end are no wiser than their equivalent among the ancients; if anything, much less refined.

So I am now on an Internet diet, in order to understand the world a bit better — and make another bet on horrendous mistakes by economic policy makers. I am not entirely deprived of the Internet; this is just a severe diet, with strict rationing. True, technologies are the greatest things in the world, but they have way too monstrous side effects — and ones rarely seen ahead of time. And since spending time in the silence of my library, with little informational pollution, I can feel harmony with my genes; I feel I am growing again.
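
As a rough numerical illustration of the “Extremistan” idea in the excerpt above (a sketch of my own, with arbitrary distributions and parameters, not anything from Taleb), the snippet below compares how much of a sample’s total the single largest observation accounts for under a thin-tailed distribution versus a fat-tailed one.

```python
# Compare a thin-tailed (Gaussian) sample with a fat-tailed (Pareto) one.
# Illustrative only: the distributions and parameters are arbitrary choices.
import random

random.seed(42)
N = 100_000

# Thin-tailed, "Mediocristan"-like data (e.g. heights): roughly Gaussian.
gaussian = [random.gauss(170, 10) for _ in range(N)]

# Fat-tailed, "Extremistan"-like data (e.g. wealth): Pareto with a heavy tail.
pareto = [random.paretovariate(1.2) for _ in range(N)]

for name, sample in [("Gaussian", gaussian), ("Pareto  ", pareto)]:
    share = max(sample) / sum(sample)
    print(f"{name}: largest single observation = {share:.4%} of the total")
```

In the Gaussian sample the largest observation is a vanishing fraction of the total; in the Pareto sample a single draw accounts for a share that is orders of magnitude larger. That gap is what “random variables dominated by extremes” means in practice, and it is why averages and past ranges tell you so little in Extremistan.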

Related: Noise Vs. Signal

Source