
The Man In The Arena: Citizenship In A Republic

It is not the critic who counts: not the man who points out how the strong man stumbles or where the doer of deeds could have done better. The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood, who strives valiantly, who errs and comes up short again and again, because there is no effort without error or shortcoming, but who knows the great enthusiasms, the great devotions, who spends himself for a worthy cause; who, at the best, knows, in the end, the triumph of high achievement, and who, at the worst, if he fails, at least he fails while daring greatly, so that his place shall never be with those cold and timid souls who knew neither victory nor defeat.
Theodore Roosevelt

There are those among us who dare to do more and in so doing draw attention to themselves. Sometimes they win, and sometimes they come up short, but what they really enjoy is the fight — the striving to do better that’s needed to accomplish great things.

In contrast, most adults play it safe — standing on the sidelines watching others struggle to do more. As such, they know neither victory nor defeat — they only know how to comment on the struggle of others.

Remember Roosevelt’s oration the next time you criticize.

***

In Rising Strong, Brené Brown comments on Roosevelt’s speech, focusing on one particular part: “The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood.” She writes —

Imagine the sound of a needle scratching across a record. Stop here. Before I hear anything else about triumph or achievement, this is where I want to slow down time so I can figure out exactly what happens next.

We’re facedown in the arena. Maybe the crowd has gone silent, the way it does at football games or my daughter’s field hockey matches when the players on the field take a knee because someone is hurt. Or maybe people have started booing and jeering. Or maybe you have tunnel vision and all you can hear is your parent screaming, “Get up! Shake it off!”

Our “facedown” moments can be big ones like getting fired or finding out about an affair, or they can be small ones like learning a child has lied about her report card or experiencing a disappointment at work. Arenas always conjure up grandeur, but an arena is any moment when or place where we have risked showing up and being seen. Risking being awkward and goofy at a new exercise class is an arena. Leading a team at work is an arena. A tough parenting moment puts us in the arena. Being in love is definitely an arena.

When I started thinking about this research, I went to the data and asked myself, What happens when we’re facedown? What’s going on in this moment? What do the women and men who have successfully staggered to their feet and found the courage to try again have in common? What is the process of rising strong?

I wasn’t positive that slowing down time to capture the process was possible, but I was inspired by Sherlock Holmes to give it a shot. …

In Season 3, there’s an episode where Sherlock is shot. Don’t worry, I won’t say by whom or why, but, wow, I did not see it coming. The moment he’s shot, time stops. Rather than immediately falling, Sherlock goes into his “mind palace”—that crazy cognitive space where he retrieves memories from cerebral filing cabinets, plots car routes, and makes impossible connections between random facts. Over the next ten minutes or so, many of the cast of recurring characters appear in his mind, each one working in his or her area of expertise and talking him through the best way to stay alive.

First, the London coroner who has a terrific crush on Sherlock shows up. She shakes her head at Sherlock, who seems completely taken aback by his inability to make sense of what’s happening, and comments, “It’s not like it is in the movies, is it, Sherlock?” Aided by a member of the forensics team at New Scotland Yard and Sherlock’s menacing brother, she explains the physics of how he should fall, how shock works, and what he can do to keep himself conscious. The three warn him when pain is coming and what he can expect. What probably takes three seconds in real time plays out for more than ten minutes on the screen. I thought the writing was genius, and it re-energized my efforts to keep at my own slow-motion project.

My goal for this book is to slow down the falling and rising processes: to bring into our awareness all the choices that unfurl in front of us during those moments of discomfort and hurt, and to explore the consequences of those choices.

[…]

On a cultural level, I think the absence of honest conversation about the hard work that takes us from lying facedown in the arena to rising strong has led to two dangerous outcomes: the propensity to gold-plate grit and a badassery deficit.


Taleb: The Risk Externalities of Too Big to Fail

“Too Big to Fail” is a dilemma that has plagued economists, policy makers and the public at large. In Nassim Taleb’s latest paper (with co-author Charles S. Tapiero), he takes a look.

Abstract

This paper examines the risk externalities stemming from the size of institutions. Assuming (conservatively) that a firm risk exposure is limited to its capital while its external (and random) losses are unbounded we establish a condition for a firm to be too big to fail. In particular, expected risk externalities’ losses conditions for positive first and second derivatives with respect to the firm capital are derived. Examples and analytical results are obtained based on firms’ random effects on their external losses (their risk externalities) and policy implications are drawn that assess both the effects of “too big to fail firms” and their regulation.
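
To make the abstract easier to parse, here is a minimal sketch of the asymmetry it describes. The notation is mine, not the paper’s: K is the firm’s capital, L_int its own (internal) loss, and L_ext the loss it imposes on others, assumed to follow an unbounded Pareto tail with scale x_m and shape alpha.

    % A firm with capital K can lose at most K itself, while the
    % external losses it imposes on the rest of the system are unbounded:
    \[ L_{\text{int}} \le K, \qquad P(L_{\text{ext}} > x) = \left(\tfrac{x_m}{x}\right)^{\alpha}, \quad x \ge x_m. \]

    % Under that Pareto tail the expected external loss is
    \[ E[L_{\text{ext}}] = \frac{\alpha\, x_m}{\alpha - 1} \ \ (\alpha > 1), \qquad E[L_{\text{ext}}] = \infty \ \ (\alpha \le 1). \]

    % The abstract's "too big to fail" condition is that expected
    % externalities grow, and grow convexly, with firm capital:
    \[ \frac{\partial E[L_{\text{ext}}(K)]}{\partial K} > 0 \quad\text{and}\quad \frac{\partial^{2} E[L_{\text{ext}}(K)]}{\partial K^{2}} > 0. \]

Capital caps what the firm itself can lose, but when alpha is close to or below 1 nothing caps what everyone else can lose; the paper derives the conditions under which those expected externalities rise, convexly, with the firm’s size.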

The conclusion is worth reading even if you don’t read the paper — a small tease:

However, the non-transparent bonuses that CEOs of large banks apply to themselves, while not a factor in banks’ failure, are a violation of the trust signaled by the incentives that banks have created to maintain the payments they distribute to themselves. For these reasons, too big to fail banks may entail too large to bear risk externalities. The results we have obtained indicate that this is a fact when banks’ internal risks have an extreme probability distribution (as is often the case in VaR studies) and when external risks are an unbounded Pareto distribution.

Paper

Niccolò Machiavelli and the Four Princes of Pragmatism

To top off the course The Moral Leader, Professor Badaracco’s students dissect Niccolò Machiavelli’s chilling classic The Prince.

“You may think that’s an odd place to end what is essentially a business ethics elective,” Badaracco acknowledged with a smile.

Students talk about what Machiavelli has to say on one crucial key to leadership: leading in the world as it is.

Four different takes on The Prince usually emerge in their discussion—though there are at least a hundred different readings of Machiavelli for scholars who truly delve into the literature, Badaracco points out.

Version 1: “This book is a mess. It was written by a guy who hoped to get to the center of things, was there briefly, offended some of the wrong Medicis, was exiled, was tortured, and wanted to get back in.” It’s “a scholar’s dream because you can find anything you want in it and play intellectual games. But just put it aside.”

Version 2: “Now wait a minute. There’s some good common sense in there. Machiavelli is basically saying that if you want to make an omelet you have to break some eggs. … To do some right things, you may have to not do some other right things.”

Version 3: Other students believe the book is still around because it’s so evil. Why is it evil? “If you look closely at The Prince,” he said, “it’s quite interesting what isn’t in the book. Nothing about religion. Nothing about the Church. Nothing about God. There’s nothing about spirituality. Almost nothing about the law. Almost nothing about traditions. You’re out there on your own doing what works for you in terms of naked ambition.”

Version 4: A fourth Prince that other students uncover is the most interesting one, in Badaracco’s mind. Students find that the book reveals a kind of worldview, he says, and it’s not an evil worldview. This version goes: “If you’re going to make progress in the world you’ve got to have a clear sense, a realistic sense, an unsentimental sense, of how things really work: the mixed motives that compel some people and the high motives that compel some others. And the low motives that unfortunately captivate other people.”

Students who claim the fourth Prince, he said, believe that if they’re going to make a difference, it’s got to be in that world, “not in some ideal world that you would really like to live in.”
Link

How Con Artists Exploit Human Behaviour

Fascinating (read the PDF; a summary is below).

The seven principles of human behaviour that con artists exploit, according to the article:


  1. The distraction principle: While you are distracted by what retains your interest, hustlers can do anything to you and you won’t notice.
  2. The social compliance principle: Society trains people not to question authority. Hustlers exploit this “suspension of suspiciousness” to make you do what they want.
  3. The herd principle: Even suspicious marks will let their guard down when everyone next to them appears to share the same risks. Safety in numbers? Not if they’re all conspiring against you.
  4. The dishonesty principle: Anything illegal you do will be used against you by the fraudster, making it harder for you to seek help once you realize you’ve been had.
  5. The deception principle: Things and people are not what they seem. Hustlers know how to manipulate you to make you believe that they are.
  6. The need and greed principle: Your needs and desires make you vulnerable. Once hustlers know what you really want, they can easily manipulate you.
  7. The time principle: When you are under time pressure to make an important choice, you use a different decision strategy. Hustlers steer you towards a strategy involving less reasoning.

See the PDF

The High Cost of Distractions

We tend to think that other people get distracted but not us. We’re different. We’re better than average. We can do more than one thing at a time and still be amazing.

Not so.

The always-on world of 24/7 bits and bytes is leaving an impact. While we cling to the illusion that we’re more productive, in reality we’re not. Distractions eat time. More importantly, they create an environment that encourages shallow thinking.

Here is an excerpt from Your Brain at Work: Strategies for Overcoming Distraction, Regaining Focus, and Working Smarter All Day Long, where author David Rock discusses this in more detail.

Distractions are everywhere. And with the always-on technologies of today, they take a heavy toll on productivity. One study found that office distractions eat an average 2.1 hours a day. Another study, published in October 2005, found that employees spent an average of 11 minutes on a project before being distracted. After an interruption it takes them 25 minutes to return to the original task, if they do at all. People switch activities every three minutes, either making a call, speaking with someone in their cubicle, or working on a document.

But that’s not all. Distractions also erode our ability to focus. And focus is what makes second-order thinking possible. Rock writes:

Distractions are not just frustrating; they can be exhausting. By the time you get back to where you were, your ability to stay focused goes down even further as you have even less glucose available now. Change focus ten times an hour (one study showed people in offices did so as much as 20 times an hour), and your productive thinking time is only a fraction of what’s possible. Less energy equals less capacity to understand, decide, recall, memorize, and inhibit. The result could be mistakes on important tasks. Or distractions can cause you to forget good ideas and lose valuable insights. Having a great idea and not being able to remember it can be frustrating, like an itch you can’t scratch, yet another distraction to manage.

Maybe open-plan offices are not such a good idea after all. Not only do we do more work when we’re distraction-free, we also do our best work.

***

If you enjoyed this article you’ll also like:

In Praise of Slowness: Challenging the Cult of Speed — This article explores our cultural desire for speed and its consequences. Slow, it turns out, is the best way to increase understanding and avoid problems.

How to Survive in an Open Office — The author of Quiet: The Power of Introverts in a World That Can’t Stop Talking, Susan Cain, offers advice on how to survive in an open office.


18 Truths: The Long Fail of Complexity

The Eighteen Truths

The first few items explain that catastrophic failure only occurs when multiple components break down simultaneously:

1. Complex systems are intrinsically hazardous systems.

The frequency of hazard exposure can sometimes be changed but the processes involved in the system are themselves intrinsically and irreducibly hazardous. It is the presence of these hazards that drives the creation of defenses against hazard that characterize these systems.

2. Complex systems are heavily and successfully defended against failure.

The high consequences of failure lead over time to the construction of multiple layers of defense against failure. The effect of these measures is to provide a series of shields that normally divert operations away from accidents.

3. Catastrophe requires multiple failures – single point failures are not enough.

Overt catastrophic failure occurs when small, apparently innocuous failures join to create opportunity for a systemic accident. Each of these small failures is necessary to cause catastrophe but only the combination is sufficient to permit failure.

4. Complex systems contain changing mixtures of failures latent within them.

The complexity of these systems makes it impossible for them to run without multiple flaws being present. Because these are individually insufficient to cause failure they are regarded as minor factors during operations.

5. Complex systems run in degraded mode.

A corollary to the preceding point is that complex systems run as broken systems. The system continues to function because it contains so many redundancies and because people can make it function, despite the presence of many flaws.
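
A toy simulation makes points 2 through 5 concrete. It is not from the paper; it is a minimal sketch in Python with invented probabilities, showing that when each defensive layer is independently and rarely broken, the system spends many days running with some latent flaw, while a catastrophic alignment of every layer is vanishingly rare.

    import random

    # A toy model (not from the source paper): N independent defensive
    # layers, each broken on any given day with small probability p.
    N_LAYERS = 5
    P_FAIL = 0.05

    # Analytic view: latent flaws are common, total alignment is rare.
    p_some_flaw = 1 - (1 - P_FAIL) ** N_LAYERS   # at least one layer broken
    p_catastrophe = P_FAIL ** N_LAYERS           # every layer broken at once
    print(f"P(at least one latent flaw present) = {p_some_flaw:.1%}")    # ~22.6%
    print(f"P(all layers broken simultaneously) = {p_catastrophe:.7%}")  # ~0.0000313%

    # Simulated view: the system runs in "degraded mode" (some flaw
    # present) on many days without ever tipping into catastrophe.
    random.seed(42)
    DAYS = 100_000
    degraded_days = sum(
        any(random.random() < P_FAIL for _ in range(N_LAYERS)) for _ in range(DAYS)
    )
    print(f"simulated days running with a latent flaw: {degraded_days / DAYS:.1%}")

With these made-up numbers the system carries at least one latent flaw roughly one day in four (point 4), keeps functioning anyway (point 5), and sees all five layers fail together only about three times in ten million days (point 3).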

Point six is important because it clearly states that the potential for failure is inherent in complex systems. For large-scale enterprise systems, the implication is profound: system planners must accept the potential for failure and build in safeguards. It sounds obvious, but too often we ignore this reality:

6. Catastrophe is always just around the corner.

The potential for catastrophic outcome is a hallmark of complex systems. It is impossible to eliminate the potential for such catastrophic failure; the potential for such failure is always present by the system’s own nature.

Given the inherent potential for failure, the next point describes the difficulty in assigning simple blame when something goes wrong. For analytic convenience (or laziness), we may prefer to distill narrow causes for failure, but that can lead to incorrect conclusions:

7. Post-accident attribution of an accident to a ‘root cause’ is fundamentally wrong.

Because overt failure requires multiple faults, there is no isolated ‘cause’ of an accident. There are multiple contributors to accidents. Each of these is necessary but insufficient in itself to create an accident. Only jointly are these causes sufficient to create an accident.

The next group goes beyond the nature of complex systems and discusses the all-important human element in causing failure:

8. Hindsight biases post-accident assessments of human performance.

Knowledge of the outcome makes it seem that events leading to the outcome should have appeared more salient to practitioners at the time than was actually the case. Hindsight bias remains the primary obstacle to accident investigation, especially when expert human performance is involved.

9. Human operators have dual roles: as producers & as defenders against failure.

The system practitioners operate the system in order to produce its desired product and also work to forestall accidents. This dynamic quality of system operation, the balancing of demands for production against the possibility of incipient failure, is unavoidable.

10. All practitioner actions are gambles.

After accidents, the overt failure often appears to have been inevitable and the practitioner’s actions seem to be blunders or deliberate, willful disregard of certain impending failure. But all practitioner actions are actually gambles, that is, acts that take place in the face of uncertain outcomes. That practitioner actions are gambles appears clear after accidents; in general, post hoc analysis regards these gambles as poor ones. But the converse, that successful outcomes are also the result of gambles, is not widely appreciated.

11. Actions at the sharp end resolve all ambiguity.

Organizations are ambiguous, often intentionally, about the relationship between production targets, efficient use of resources, economy and costs of operations, and acceptable risks of low and high consequence accidents. All ambiguity is resolved by actions of practitioners at the sharp end of the system. After an accident, practitioner actions may be regarded as ‘errors’ or ‘violations’ but these evaluations are heavily biased by hindsight and ignore the other driving forces, especially production pressure.

Starting with the nature of complex systems and then discussing the human element, the paper argues that sensitivity to preventing failure must be built into ongoing operations.

In my experience, this is true and has substantial implications for the organizational culture of project teams:

12. Human practitioners are the adaptable element of complex systems.

Practitioners and first line management actively adapt the system to maximize production and minimize accidents. These adaptations often occur on a moment by moment basis.

13. Human expertise in complex systems is constantly changing.

Complex systems require substantial human expertise in their operation and management. Critical issues related to expertise arise from (1) the need to use scarce expertise as a resource for the most difficult or demanding production needs and (2) the need to develop expertise for future use.

14. Change introduces new forms of failure.

The low rate of overt accidents in reliable systems may encourage changes, especially the use of new technology, to decrease the number of low consequence but high frequency failures. These changes may actually create opportunities for new, low frequency but high consequence failures. Because these new, high consequence accidents occur at a low rate, multiple system changes may occur before an accident, making it hard to see the contribution of technology to the failure.
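
A back-of-the-envelope comparison, with figures invented purely for illustration, shows why this trade is so easy to miss: swapping many visible small failures for a rare large one can leave the expected loss unchanged while hiding it from view for years.

    # Hypothetical figures, not from the paper: two failure regimes with
    # identical expected annual loss but very different visibility.
    small_rate, small_cost = 50, 2_000       # 50 minor incidents a year, $2,000 each
    big_rate, big_cost = 0.01, 10_000_000    # 1 catastrophe per 100 years, $10M

    # Both regimes have the same expected annual loss of $100,000,
    # but the first is felt constantly and the second almost never.
    print(small_rate * small_cost)   # frequent, visible, gets engineered away
    print(big_rate * big_cost)       # same expectation, invisible for decades

A change that eliminates the fifty visible incidents but introduces the rare catastrophe looks like a pure improvement right up until the low-probability event arrives, which is exactly the masking effect the point describes.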

15. Views of ‘cause’ limit the effectiveness of defenses against future events.

Post-accident remedies for “human error” are usually predicated on obstructing activities that can “cause” accidents. These end-of-the-chain measures do little to reduce the likelihood of further accidents.

16. Safety is a characteristic of systems and not of their components.

Safety is an emergent property of systems; it does not reside in a person, device or department of an organization or system. Safety cannot be purchased or manufactured; it is not a feature that is separate from the other components of the system. The state of safety in any system is always dynamic; continuous systemic change ensures that hazard and its management are constantly changing.

17. People continuously create safety.

Failure free operations are the result of activities of people who work to keep the system within the boundaries of tolerable performance. These activities are, for the most part, part of normal operations and superficially straightforward. But because system operations are never trouble free, human practitioner adaptations to changing conditions actually create safety from moment to moment.

The paper concludes with a ray of hope for those who have been through the wars:

18. Failure free operations require experience with failure.

Recognizing hazard and successfully manipulating system operations to remain inside the tolerable performance boundaries requires intimate contact with failure. More robust system performance is likely to arise in systems where operators can discern the “edge of the envelope”. It also depends on providing calibration about how their actions move system performance towards or away from the edge of the envelope.

Source