Random Posts

The (Really) Invisible Gorilla

Inattentional blindness is the phenomenon of not being able to see things that are actually there. The concept was popularized in the 2010 book The Invisible Gorilla: How Our Intuitions Deceive Us by Daniel Simons and Christopher Chabris.

The best-known experiment demonstrating inattentional blindness involves a basketball. Simons and Chabris gave subjects a very simple task: keep track of how many times the basketball was passed between players. While the ball was being tossed, someone in a gorilla suit walked across the screen, in plain view. What they discovered was unexpected: few people noticed the gorilla, because they were so focused on counting the passes. When we pay close attention to one thing, we fail to notice other things.

A recent article by Wray Herbert covers new research showing that this perceptual quirk can have serious, even life-threatening, implications.

Three psychological scientists at Brigham and Women’s Hospital in Boston — Trafton Drew, Melissa Vo and Jeremy Wolfe — wondered if expert observers are also subject to this perceptual blindness. The subjects in the classic study were “naïve” — untrained in any particular domain of expertise and performing a task nobody does in real life. But what about highly trained professionals who make their living doing specialized kinds of observations? The scientists set out to explore this in an area of great importance to many people — cancer diagnosis.

Radiologists are physicians with special advanced training in reading various pictures of the body — not just the one-shot X-rays of the past but complex MRI, CT and PET scans as well. In looking for signs of lung cancer, for example, radiologists examine hundreds of ultra-thin CT images of a single patient’s lungs, looking for tiny white nodules that warn of cancer. It’s these expert observers that the Brigham and Women’s scientists chose to study.

They recruited 24 experienced and credentialed radiologists — and a comparable group of naïve volunteers. They tracked their eye movements as they examined five patients’ CT scans, each made up of hundreds of images of lung tissue. Each case had about ten nodules hiding somewhere in the scans, and the radiologists were instructed to click on these nodules with a mouse. On the final case, the scientists inserted a tiny image of a gorilla (an homage to the original work) into the lung. They wanted to see if the radiologists, focused on the telltale nodules, would be blind to the easily detectable and highly anomalous gorilla.

The gorilla was minuscule, but huge compared to the nodules. It was about the size of a box of matches — or 48 times the size of a typical nodule. It faded in and out — becoming more, then less opaque — over a sequence of five images. There was no mistaking the gorilla: If someone pointed it out on the lung scan and asked, What is that? — everyone would answer: That’s a gorilla.

After the radiologists finished scrolling through the images as much as they wanted, the scientists asked them: Did that last trial seem any different? Did you notice anything unusual on the final trial? And finally: Did you see a gorilla on the final trial? Twenty of the 24 radiologists failed to see the gorilla, despite scrolling past it more than four times on average. And this was not because it was difficult to see: When shown the image again after the experiment, all of them saw the gorilla. What’s more, the eye-tracking data showed clearly that most of those who did not see the gorilla did in fact look right at it.

The Learning Paradox: Why Struggling to Learn is a Good Thing

The more you struggle to master new information, the better you’ll understand and apply it later.

Annie Murphy Paul explores in Time:

The learning paradox is at the heart of “productive failure,” a phenomenon identified by Manu Kapur, a researcher at the Learning Sciences Lab at the National Institute of Education of Singapore. Kapur points out that while the model adopted by many teachers and employers when introducing others to new knowledge — providing lots of structure and guidance early on, until the students or workers show that they can do it on their own — makes intuitive sense, it may not be the best way to promote learning. Rather, it’s better to let the neophytes wrestle with the material on their own for a while, refraining from giving them any assistance at the start.

Earlier this year, in a paper published in the Journal of the Learning Sciences, Kapur applied the principle of productive failure to mathematical problem-solving.

With one group of students, the teacher provided strong “scaffolding” — instructional support — and feedback. With the teacher’s help, these pupils were able to find the answers to their set of problems. Meanwhile, a second group was directed to solve the same problems by collaborating with one another, absent any prompts from their instructor. These students weren’t able to complete the problems correctly. But in the course of trying to do so, they generated a lot of ideas about the nature of the problems and about what potential solutions would look like. And when the two groups were tested on what they’d learned, the second group “significantly outperformed” the first.

The apparent struggles of the floundering group have what Kapur calls a “hidden efficacy”: they lead people to understand the deep structure of problems, not simply their correct solutions. When these students encounter a new problem of the same type on a test, they’re able to transfer the knowledge they’ve gathered more effectively than those who were the passive recipients of someone else’s expertise.

Kapur argues we need to “design for productive failure” by building it into the learning process.

In the process of his work he’s identified three conditions that promote a beneficial struggle:

1. Choose problems that “challenge but do not frustrate.”
2. Allow students to explain and elaborate on what they’re doing.
3. Compare and contrast both good and bad solutions to the problems.

Still curious? Use the Feynman technique to learn anything better and faster.

Dan Gilbert: Why do we make decisions our future selves regret?

“Human beings are works in progress that mistakenly think they’re finished.”

In the 7-minute TED talk (below), Harvard psychologist Dan Gilbert illuminates some recent research on a phenomenon he calls the “end of history illusion,” where we imagine that the person we are today is the person we’ll be until we die. But that’s not the case.

The bottom line is, time is a powerful force. It transforms our preferences. It reshapes our values. It alters our personalities. We seem to appreciate this fact, but only in retrospect. Only when we look backwards do we realize how much change happens in a decade. It’s as if, for most of us, the present is a magic time. It’s a watershed on the timeline. It’s the moment at which we finally become ourselves. Human beings are works in progress that mistakenly think they’re finished. The person you are right now is as transient, as fleeting and as temporary as all the people you’ve ever been. The one constant in our life is change.

Still curious? He develops the concept further in his book Stumbling on Happiness.

The Power of Your Subconscious Mind

We think that we’re in control. We believe that our conscious mind directs our thoughts and somehow controls our subconscious mind. We’re wrong.

In Richard Restak’s The Brain Has a Mind of Its Own:

At the moment of decision we all feel we are acting freely, selecting at will from an infinity of choices. Yet research suggests this sense of freedom may be merely an illusory by-product of the way the human brain operates.

Restak gives the example of reading this essay. You scan the title and a few sentences here and there and eventually decide to stop reading or to read on. You might then go back to the beginning and start reading, or you might pick up wherever you were in the article when you decided to stop skimming.

“The internal sequence,” Restak writes, “was always thought to be: 1. you make a conscious decision to read; 2. that decision triggers your brain into action; 3. your brain then signals the hands to stop turning pages, focuses the eyes on the paragraph, and so on.”

But this isn’t what happens at all. “An inexplicable but plainly measurable burst of activity occurs in your brain prior to your conscious desire to act.”

The subconscious mind controls a lot of what we think and the connections we make. And, of course, our thoughts influence what we do.

In The Thinker’s Toolkit, Morgan Jones recalls the story found in David Kahn’s The Codebreakers.

Breaking codes in World War II was perhaps the largest big-data project the world had seen up to that point. The conscious mind could only do so much. One German cryptanalyst recalled, “You must concentrate almost in a nervous trance when working on a code. It is not often done by conscious effort. The solution often seems to crop up from the subconscious.”

Believing that the conscious mind calls the shots prevents us from understanding ourselves and others, and from making better decisions, to name but a few things.

In Plain Talk, Ken Iverson offers some insight on how to turn these thoughts into practical utility.

“Every manager,” he writes, “should be something of a psychologist — aware of what makes people tick, what they want, what they need. And much of what people want and need resides in the subconscious. The job of a manager is to help people accomplish extraordinary things. And that means shaping a work environment that stimulates people to explore their own potential.”

We place too much emphasis on the conscious mind and not enough on the subconscious one.

Unless you manage your environment, it will manage you. The old question — would you rather be the poorest person in a wealthy neighborhood or the richest in a poor one? — turns on how the environment controls our subconscious, and how our subconscious controls our happiness.

The Science of Obesity

One thing that has always baffled me is how we get fat.

Why We Get Fat by Gary Taubes unearths the biological truth around why we’re getting fat. In the process, Taubes dispels many accepted ideas on weight-loss and nutrition.

While it’s easy to believe that we remain lean because we’re virtuous and we get fat because we lack self-control or discipline, the evidence clearly says otherwise. Taubes methodically tackles conventional (and governmental) wisdom and why it is wrong.

This is a biology book, not a diet book. It’s about the science of what’s happening in our body that makes us fat. Let’s explore Taubes’s argument.

Is this a simple calories-in calories-out problem?

Do low-calorie diets work? In the short term, yes; overall, no.

“The two researchers who may have had the best track record in the world treating obesity in an academic setting are George Blackburn and Bruce Bistrian of Harvard Medical School. In the 1970s, they began treating obese patients with a six-hundred-calorie-a-day diet of only lean meat, fish, and fowl. They treated thousands of patients, said Bistrian. Half of them lost more than forty pounds.”

They concluded, “This is an extraordinarily effective and safe way to get large amounts of weight loss.” Yet, shortly after, Taubes says “Bistrian and Blackburn gave up on the therapy because they didn’t know what to tell their patients to do after the weight was lost. The patients couldn’t be expected to live on six hundred calories a day forever, and if they returned to eating normally, they’d gain all the weight back.”

So, even if you lose weight on a low-calorie diet, you’re stuck with the “what now?” problem.

What if I just exercise more?

What happens when we increase our energy expenditure by upping our physical activity? Taubes says, “Considering the ubiquity of the message, the hold it has on our lives, and the elegant simplicity of the notion — burn calories, lose weight, prevent disease — wouldn’t it be nice if it were true?”

Alas, believing doesn’t make it so. While there are many reasons to exercise regularly, losing weight isn’t one of them.

Taubes looks at the evidence and walks us through a chain of reasoning. The evidence says obesity is associated with poverty. In most modern parts of the world, the poorer people are, the fatter they are likely to be. Yet it’s the poor and disadvantaged who sweat out a living doing physical labor. This is one reason to doubt the assertion that regularly expending large amounts of energy keeps us from getting fat.

Another reason to doubt the calories-out hypothesis is the obesity epidemic itself. If burning too few calories made us fat, our decades of getting fatter would imply we’ve been getting ever more sedentary. Yet until the 1970s — that is, before the obesity problem — Americans were not believers in the need to spend leisure time sweating; the exercise boom and the obesity epidemic arrived together.

In addition, it turns out there is very little hard evidence to support the belief that the number of calories we burn has any meaningful impact on how fat we become. The American Heart Association even calls the data supporting this claim “not particularly compelling.”

A study by Paul Williams and Peter Wood collected detailed information on almost 13,000 runners and compared their weekly mileage with how much they weighed year to year. As you would expect, those who ran the most tended to weigh the least. But, perhaps unexpectedly, all these runners tended to get fatter with each passing year — even those running more than 40 miles a week.

According to Taubes, the belief in exercising more to weigh less is “based ultimately on one observation and one assumption. The observation is that people who are lean tend to be more physically active than those of us who aren’t. This is undisputed. … But this observation tells us nothing about whether runners would be fatter if they didn’t run or if the pursuit of distance running as a full-time hobby will turn a fat man or woman into a lean marathoner. We base our belief in the fat-burning properties of exercise on the assumption that we can increase our energy expenditure (calories-out) without being compelled to increase our energy intake (calories-in).”

This assumption is wrong. We ended up buying into the exercise-more, eat-less story because it feels intuitively correct and reinforces our beliefs. We didn’t ask for evidence, and none has been forthcoming in the intervening years.

Is it a matter of balancing calories?

No. Weight gain is a gradual process. So once you notice your jeans getting tight, you can just make some smart decisions — cut calories, increase physical activity — right? “If it were true that our adiposity is determined by calories-in/calories-out, then this is one implication: you only need to overeat, on average, by twenty calories a day to gain fifty extra pounds in 20 years.” Now think of all the food decisions you make in a day and how impossible it would be, without scientific instrumentation, to balance your intake that precisely.
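That twenty-calorie figure is easy to check with back-of-the-envelope arithmetic. The sketch below uses the common rule of thumb that a pound of body fat stores roughly 3,500 kilocalories — a textbook approximation, not a number from Taubes’s book:

```python
KCAL_PER_POUND_FAT = 3500  # standard rough approximation (not from the book)

def pounds_gained(excess_kcal_per_day: float, years: float) -> float:
    """Pounds gained from a steady daily calorie surplus, assuming
    every excess kilocalorie ends up stored as body fat."""
    return excess_kcal_per_day * 365 * years / KCAL_PER_POUND_FAT

print(round(pounds_gained(20, 20), 1))  # about 41.7 pounds -- the right ballpark
```

Twenty calories is less than a single bite of food, which is the point: nobody could consciously balance intake against expenditure to that precision.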

Thermodynamics

Wait — what about thermodynamics, the law that says energy can be transformed from one form to another but neither created nor destroyed?

“The very notion that we get fat because we consume more calories than we expend would not exist without the misapplied belief that the laws of thermodynamics make it true. When experts write that obesity is a disorder of energy balance—a declaration that can be found in one form or another in much of the technical writing on the subject—it is shorthand for saying that the laws of thermodynamics dictate this to be true. And yet they don’t.”

All the first law of thermodynamics says is that “if something gets more or less massive, then more energy or less energy has to enter it than leave it. It says nothing about why this happens. It says nothing about cause and effect. It doesn’t tell us why anything happens.”

Experts think the first law is relevant because it fits neatly with our existing theories about why we get fat—those who consume more calories than they burn will gain weight. Thermodynamics tells us that if we get fatter and heavier, more energy enters our body than leaves it. But the important question, at least from an obesity perspective, is why do we consume more calories than we expend?

One of the other problems with the thermodynamics argument is the assumption that the energy we consume and the energy we expend have little influence on each other—that we can change one without affecting the other.

The literature says that animals whose food is suddenly restricted tend to reduce energy expenditure, both by being less active and by slowing energy use in their cells, thereby limiting weight loss. They also experience hunger, so that once the restriction ends, they eat more than their prior norm until the earlier weight is regained. (This is the same problem Bistrian and Blackburn encountered earlier.)

Another problem with thermodynamics is that it doesn’t address why men and women fatten differently. This suggests that, at least at some level, bodily functions and possibly genetics play a role.

When we believe, as we do, that people get fat because they overeat, we’re putting the ultimate blame on a weakness of character and leaving biology out of it. This implies that we can generally tell, just by looking at the waistline, which people have strong self-control.

Adiposity

In the early 1970s, George Wade studied the relationship between sex hormones, weight, and appetite by removing the ovaries from rats. The impact was dramatic: the previously skinny rats ate voraciously and became obese. “The rat eats too much, the excess calories find their way to the fat tissue, and the animal becomes obese,” offers Taubes. He continues, “this would confirm our preconception that overeating is responsible for obesity in humans as well. But Wade did a revealing second experiment, removing the ovaries from the rats and putting them on a strict postsurgical diet. Even if these rats were ravenously hungry after the surgery, even if they desperately wanted to be gluttons, they couldn’t satisfy their urge.” The rats still got just as fat, just as quickly. And that is the start of our understanding of why we actually get fat.

The animal doesn’t get fat because it overeats, it overeats because it’s getting fat. The animal is unable to regulate its fat tissue.

A follow-on experiment, where the rats were injected with estrogen after the surgery, resulted in normal behavior. That is, they did not become slothful or obese. Biologically, one of the things that estrogen does is influence an enzyme called lipoprotein lipase (LPL). When cells want fat, they signal their interest by “expressing” LPL. If the LPL comes from a fat cell, the fat is stored and we get fatter. If the LPL comes from a muscle cell, the fat gets pulled into the muscle and burned as fuel. LPL, according to the Williams Textbook of Endocrinology, “is a key factor in partitioning triglycerides (i.e., fat) among different body tissues.”

One of estrogen’s roles is to inhibit the activity of the LPL “expressed” by fat cells. The rats in Wade’s experiments overate because they were losing calories into fat cells—calories that were needed elsewhere. The fatter the rat got, the more it had to eat to feed its non-fat cells. When this regulation breaks down, the body enters a cycle of getting fatter and fatter.

This, as Taubes says, “reverses our perception of the cause and effect of obesity. It tells us that two behaviors—gluttony and sloth—that seem to be the reasons we get fat can in fact be the effects of getting fat.” It also tells us that influencing LPL (either positively or negatively) has a dramatic effect on how fat we get.

LPL also explains why men and women get fat in different spots and why exercise doesn’t work. In men, LPL activity is higher in the gut and lower below the waist. In women, LPL activity is highest below the waist. (Bad news, though: after menopause, LPL activity in a woman’s abdomen catches up to a man’s.) As for exercise: while we’re working out, LPL activity decreases on our fat cells and increases on our muscle cells—so far, so good—because this prompts the release of fat from our fat tissue so that the muscles can use it as energy. When we stop exercising, however, the situation reverses: LPL activity on the muscle cells shuts down, LPL activity on the fat cells picks up, and the fat cells’ natural tendency is to return to their previous state.

So what regulates all of this?

Insulin. The LPL on fat cells is regulated by the presence of insulin. The more insulin our body secretes, the more active the LPL becomes on the fat cells, and the more fat that, rather than being consumed as fuel by the muscle cells, gets stored in fat cells. As if designed to ensure we get fatter, insulin also reduces the LPL expressed by the muscle cells (to ensure there is lots of fat floating around for the fat cells). That is, it tells the muscle cells not to burn fat as a fuel.

Insulin also influences an enzyme called hormone-sensitive lipase, or HSL. And this, says Taubes, “may be even more critical to how insulin regulates the amount of fat we store. Just as LPL works to make fat cells (and us) fatter, HSL works to make fat cells (and us) leaner. It does so by working inside the fat cells to break down triglycerides into their component fatty acids so that those fatty acids can then escape into the circulation. The more active this HSL, the more fat we liberate and can burn for fuel and the less, obviously, we store. Insulin also suppresses this enzyme HSL, and so it keeps the breakdown of triglycerides inside the fat cells to a minimum.” This also helps explain why diabetics often get fatter on insulin therapy.

Carbohydrates primarily determine the insulin level in the blood, and here both quantity and quality matter. Carbs, ultimately, determine how fat we get. But most people eat carbs—so why are some fatter than others? We all naturally secrete different levels of insulin: given the same food, different people will secrete different amounts. Another factor is how sensitive your cells are to insulin, and how quickly they become insensitive. The more insulin you secrete—naturally or in response to carbohydrate-rich foods—the more likely it is that your body becomes insulin resistant. The result is a vicious circle.

Not all foods containing carbs are equally fattening. The most fattening foods are those that have the greatest impact on our insulin and blood sugar levels. These are the easily digestible carbs: anything made of refined flour (bread, cereals, and pasta), starches (potatoes, rice, and corn), and liquids (beer, pop, fruit juice). “These foods,” says Taubes, “flood the bloodstream quickly with glucose. Blood sugar shoots up; insulin shoots up; we get fatter.”

Here is Taubes in a 70-minute video explaining more.

If you want to learn more about the science behind why we get fat, I recommend brushing up on your biology a little and reading Why We Get Fat. Taubes also wrote Good Calories, Bad Calories.

A Cascade of Sand: Complex Systems in a Complex Time

We live in a world filled with rapid change: governments topple, people rise and fall, and technology has created a connectedness the world has never experienced before. Joshua Cooper Ramo believes this environment has created an “avalanche of ceaseless change.”

In his book The Age of the Unthinkable: Why the New World Disorder Constantly Surprises Us and What We Can Do About It, he outlines what this new world looks like and offers prescriptions for how best to deal with the disorder around us.

Ramo believes that we are entering a revolutionary age that will render seemingly fortified institutions weak, and weak movements strong. He feels we aren’t well prepared for these radical shifts, as those in positions of power tend to bring antiquated ideologies to the issues they face. Generally, they treat anything complex as one-dimensional.

Unfortunately, whether they are running corporations or foreign ministries or central banks, some of the best minds of our era are still in thrall to an older way of seeing and thinking. They are making repeated misjudgments about the world. In a way, it’s hard to blame them. Mostly they grew up at a time when the global order could largely be understood in simpler terms, when only nations really mattered, when you could think there was a predictable relationship between what you wanted and what you got. They came of age as part of a tradition that believed all international crises had beginnings and, if managed well, ends.

This is one of the main flaws of traditional thinking about managing conflict/change: we identify a problem, decide on a path forward, and implement that solution. We think in linear terms and see a finish line once the specific problem we have discovered is ‘solved.’

In this day and age (and probably in all days and ages, whether they realized it or not) we have to accept that the finish line is constantly moving and that, in fact, there never will be a finish line. Solving one problem may fix an issue for a time but it tends to also illuminate a litany of new problems. (Many of which were likely already present but hiding under the old problem you just “fixed”.)

In fact, our actions in trying to solve X will sometimes have a cascade effect because the world is actually a series of complex and interconnected systems.

Some great thinkers have spoken about these problems in the past. Ramo highlights some interesting quotes from the Nobel Prize speech that Austrian economist Friedrich August von Hayek gave in 1974, entitled The Pretence of Knowledge.

To treat complex phenomena as if they were simple, to pretend that you could hold the unknowable in the cleverly crafted structure of your ideas — he could think of nothing that was more dangerous. “There is much reason,” Hayek said, “to be apprehensive about the long-run dangers created in a much wider field by the uncritical acceptance of assertions which have the appearance of being scientific.”

Concluding his Nobel speech, Hayek warned, “If man is not to do more harm than good in his efforts to improve the social order, he will have to learn that in this, as in all other fields where essential complexity of an organized kind prevails, he cannot acquire the full knowledge which would make mastery of the events possible.” Politicians and thinkers would be wise not to try to bend history as “the craftsman shapes his handiwork, but rather to cultivate growth by providing the appropriate environment, in the manner a gardener does for his plants.”

This is an important distinction: the idea that we need to be gardeners instead of craftsmen. When we are merely creating something we have a sense of control; we have a plan and an end state. When the shelf is built, it’s built.

Being a gardener is different. You have to prepare the environment; you have to nurture the plants and know when to leave them alone. You have to make sure the environment is hospitable to everything you want to grow (different plants have different needs), and after the harvest you aren’t done. You need to turn the earth and, in essence, start again. There is no end state if you want something to grow.

* * *

So, if most of the threats we face today are so multifaceted and complex that we can’t use the majority of the strategies that have worked historically, how do we approach the problem? A Danish theoretical physicist named Per Bak had an interesting view of this, which he termed self-organized criticality, and it comes with an excellent experiment-turned-metaphor that helps explain the concept.

Bak’s research focused on answering the following question: if you created a cone of sand grain by grain, at what point would you create a little sand avalanche? This breakdown of the cone was inevitable but he wanted to know if he could somehow predict at what point this would happen.

Much like there is a precise temperature that water starts to boil, Bak hypothesized there was a specific point where the stack became unstable, and at this point adding a single grain of sand could trigger the avalanche.

In his work, Bak came to realize that the sandpile was inherently unpredictable. He discovered that there were times, even when the pile had reached a critical state, that an additional grain of sand would have no effect:

“Complex behavior in nature,” Bak explained, “reflects the tendency of large systems to evolve into a poised ‘critical’ state, way out of balance, where minor disturbances may lead to events, called avalanches, of all sizes.” What Bak was trying to study wasn’t simply stacks of sand, but rather the underlying physics of the world. And this was where the sandpile got interesting. He believed that sandpile energy, the energy of systems constantly poised on the edge of unpredictable change, was one of the fundamental forces of nature. He saw it everywhere, from physics (in the way tiny particles amassed and released energy) to the weather (in the assembly of clouds and the hard-to-predict onset of rainstorms) to biology (in the stutter-step evolution of mammals). Bak’s sandpile universe was violent — and history-making. It wasn’t that he didn’t see stability in the world, but that he saw stability as a passing phase, as a pause in a system of incredible — and unmappable — dynamism. Bak’s world was like a constantly spinning revolver in a game of Russian roulette, one random trigger-pull away from explosion.
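Bak’s pile is easy to simulate. Below is a minimal sketch of the Bak–Tang–Wiesenfeld sandpile model — the idealized mathematical version of his experiment. The lattice size, number of drops, and toppling threshold of four are standard conventions of the model, not details from Ramo’s book:

```python
import random

N = 20  # lattice size (an illustrative choice)

def drop_grain(grid, row, col):
    """Drop one grain at (row, col); return the avalanche size
    (the number of topplings it triggers)."""
    grid[row][col] += 1
    topplings = 0
    unstable = [(row, col)]
    while unstable:
        r, c = unstable.pop()
        if grid[r][c] < 4:
            continue  # already relaxed by an earlier toppling
        grid[r][c] -= 4  # topple: shed one grain to each neighbor
        topplings += 1
        if grid[r][c] >= 4:  # still over threshold: topple again later
            unstable.append((r, c))
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < N and 0 <= nc < N:
                grid[nr][nc] += 1
                if grid[nr][nc] >= 4:
                    unstable.append((nr, nc))
            # grains pushed past the edge simply fall off the pile
    return topplings

if __name__ == "__main__":
    random.seed(0)
    grid = [[0] * N for _ in range(N)]
    sizes = [drop_grain(grid, random.randrange(N), random.randrange(N))
             for _ in range(20_000)]
    late = sizes[10_000:]  # after the pile has reached its critical state
    print("largest avalanche:", max(late))
    print("median avalanche:", sorted(late)[len(late) // 2])
```

Run it and the point of the metaphor shows up in the numbers: identical grain drops on the critical pile produce mostly tiny avalanches plus a few enormous ones, with no way to tell in advance which kind the next grain will cause.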

Traditionally our thinking is very linear. If we start thinking of systems as more like sandpiles, we shift into second-order thinking: we can no longer assume that a given action will produce a given reaction — it may or may not, depending on the precise initial conditions.

This dynamic sandpile energy demands that we accept the basic unpredictability of the global order — one of those intellectual leaps that sounds simple but that immediately junks a great deal of traditional thinking. It also produces (or should produce) a profound psychological shift in what we can and can’t expect from the world. Constant surprise and new ideas? Yes. Stable political order, less complexity, the survival of institutions built for an older world? No.

Ramo isn’t arguing that complex systems are incomprehensible and fundamentally flawed. These systems are manageable; they just require a divergence from the old ways of thinking — the linear way that didn’t account for all the invisible connections in the sand.

Look at something like the Internet; it’s a perfect example of a complex system with a seemingly infinite number of connections, yet it thrives. The system is constantly bombarded with unexpected risk, but it is so malleable that it has yet to feel the force of an avalanche. The Internet was designed to thrive in a hostile environment, and its complexity was embraced. Unfortunately, for every adaptive system like the Internet there seems to be a maladaptive one — a system so rigid it will surely break in a world of complexity.

The Age of the Unthinkable goes on to show us historical examples of systems that did indeed break; this helps to frame where we have been particularly fragile in the past and where the mistakes in our thinking may have been. In the back half of the book, Ramo outlines strategies he believes will help us become more antifragile — a set of ideas he calls “Deep Security.”

Implementing these strategies will likely meet considerable resistance; many people in positions of power benefit from the systems staying as they are. Revolutions are never easy, but, as we’ve seen, even one grain of sand can have a huge impact.

What I’ve been reading

Consumer.ology
I enjoyed the first part of the book, which explores the fallacy of market research and the complex reality of consumers and the psychology of shopping. A summary paragraph:

“The unconscious mind is the real driver of consumer behavior. Understanding consumers is largely a matter of understanding how the unconscious mind operates; the first obstacle to this is recognizing how we frequently react without conscious awareness. As long as we protect the illusion that we ourselves are primarily conscious agents, we pander to the belief that we can ask people what they think and trust what we hear in response. After all, we like to tell ourselves we know why we do what we do, so everyone else must be capable of doing the same, mustn’t they?”

***

Selfish Reasons to Have More Kids: Why Being a Great Parent is Less Work and More Fun Than You Think
Our sacrifices and fears stem from deep misconceptions about nature and nurture. According to economist Bryan Caplan, research shows that the long-run effect of parenting is surprisingly small. If you want to rationalize having kids or are interested in becoming a “free range parent” you might find this book interesting.

As you might expect from an economist, the book draws on a lot of research. One interesting example concerns parenthood and age: parents under 30 are less happy than their child-free peers, but once parents hit 40 the relationship reverses. More kids also means happier parents, once you hit 40.

My favorite book in this category remains the Tiger Mom Book.

***

Kafka’s The Trial
The tale of Josef K., a responsible bank officer who is suddenly and inexplicably arrested and must defend himself against a charge about which he can get no information. To me this was a chilling tale about the excesses of bureaucracy, and it will resonate with anyone who deals with a large one. (If you work in a large bureaucracy, you might also enjoy The Pale King.)

***

The True Believer: Thoughts on the Nature of Mass Movements
A great primer for anyone wishing to understand mass movements, be they religious movements, social revolutions or nationalist movements. A sample:

Discontent by itself does not invariably create a desire for change. Other factors have to be present before discontent turns into disaffection. One of these is a sense of power.

Those who are awed by their surroundings do not think of change, no matter how miserable their condition. When our mode of life is so precarious as to make it patent that we cannot control the circumstance of our existence, we tend to stick to the proven and the familiar. We counteract a deep feeling of insecurity by making of our existence a fixed routine.

and this telling passage:

For men to plunge headlong into an undertaking of vast change, they must be intensely discontented yet not destitute, and they must have the feeling that by the possession of some potent doctrine, infallible leader or some new technique they have access to a source of irresistible power. They must also have an extravagant conception of the prospects and potentialities of the future. Finally, they must be wholly ignorant of the difficulties involved in their vast undertaking. Experience is a handicap. The men who started the French Revolution were wholly without political experience. The same is true of the Bolsheviks, Nazis, and the revolutionaries in Asia.

Are People Thinking Less Than They Used To?

Some insightful comments by Steve Jobs on the information economy. Jobs argues that television is causing us to think less than we used to.

We live in an information economy, but I don’t believe we live in an information society. People are thinking less than they used to. It’s primarily because of television. People are reading less and they’re certainly thinking less. So, I don’t see most people using the Web to get more information. We’re already in information overload. No matter how much information the Web can dish out, most people get far more information than they can assimilate anyway.

Optimistic about people but not about groups

I’m an optimist in the sense that I believe humans are noble and honorable, and some of them are really smart. I have a very optimistic view of individuals. As individuals, people are inherently good. I have a somewhat more pessimistic view of people in groups. And I remain extremely concerned when I see what’s happening in our country, which is in many ways the luckiest place in the world. We don’t seem to be excited about making our country a better place for our kids.

Still curious? Read Walter Isaacson’s authorized biography of Steve Jobs.

Source

Promoting People In Organizations

In their 1978 paper Performance Sampling in Social Matches, March and March discussed what performance sampling implies for understanding careers in organizations. They came to some interesting conclusions for those of us working in organizations.

Considerable evidence exists documenting that individuals confronted with problems requiring the estimation of proportions act as though sample size were substantially irrelevant to the reliability of their estimates. We do this in hiring all the time. Yet we know that sample size matters.

On how this cognitive bias affects hiring, March and March offer some good insights, including the false record effect, the hero effect, and the disappointment effect.

False Record Effect

A group of managers of identical (moderate) ability will show considerable variation in their performance records in the short run. Some will be found at one end of the distribution and will be viewed as outstanding; others will be at the other end and will be viewed as ineffective. The longer a manager stays in a job, the less the probable difference between the observed record of performance and actual ability. Time on the job increases the expected sample of observations, reduces expected sampling error, and thus reduces the chance that a manager of moderate ability will either be promoted or exit.
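You can see the false record effect in a few lines of simulation. This is a minimal sketch, not anything from the paper: the numbers (1,000 managers, a 50% true success rate, 10 vs. 200 observed decisions) are illustrative assumptions. Every manager has identical ability, yet the short records spread out far more widely than the long ones.

```python
import random
import statistics

random.seed(42)

def observed_records(n_managers, n_decisions, p_success=0.5):
    """Observed success rates for managers of identical underlying ability."""
    return [
        sum(random.random() < p_success for _ in range(n_decisions)) / n_decisions
        for _ in range(n_managers)
    ]

short = observed_records(1000, 10)       # short tenure: 10 observed decisions
long_run = observed_records(1000, 200)   # long tenure: 200 observed decisions

# Identical ability, but the short records vary far more widely,
# so short-tenure "stars" and "duds" are largely sampling noise.
print(f"short tenure spread (stdev): {statistics.stdev(short):.3f}")
print(f"long tenure spread (stdev):  {statistics.stdev(long_run):.3f}")
```

With ten observations, some of these identical managers will post 80–90% records and look outstanding; with two hundred, nearly everyone converges toward their true 50%.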

Hero Effect

Within a group of managers of varying abilities, the faster the rate of promotion, the less likely it is to be justified. Performance records are produced by a combination of underlying ability and sampling variation. Managers who have good records are more likely to have high ability than managers who have poor records, but the reliability of the differentiation is small when records are short.

Disappointment Effect

On the average, new managers will be a disappointment. The performance records by which managers are evaluated are subject to sampling error. Since a manager is promoted to a new job on the basis of a good previous record, the proportion of promoted managers whose past records are better than their abilities will be greater than the proportion whose past records are poorer. As a result, on the average, managers will do less well in their new jobs than they did in their old ones, and observers will come to believe that higher level jobs are more difficult than lower level ones, even if they are not.
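The disappointment effect is regression to the mean, which a small simulation makes concrete. Again this is an illustrative sketch under assumed parameters, not the authors’ model: give every manager identical ability, promote the top 10% by short-run record, then draw a fresh sample for their new job and watch the average fall back.

```python
import random
import statistics

random.seed(0)

P_TRUE = 0.5       # every manager's actual success rate (identical ability)
N_MANAGERS = 1000
RECORD_LEN = 10    # length of the short pre-promotion record

def rate(n, p=P_TRUE):
    """Observed success rate over n decisions."""
    return sum(random.random() < p for _ in range(n)) / n

pre = [rate(RECORD_LEN) for _ in range(N_MANAGERS)]

# Promote roughly the top 10% by observed record (ties may add a few extra).
cutoff = sorted(pre, reverse=True)[N_MANAGERS // 10 - 1]
promoted_pre = [r for r in pre if r >= cutoff]

# Performance in the new job is a fresh sample at the same true ability.
promoted_post = [rate(RECORD_LEN) for _ in promoted_pre]

print(f"promoted, old-job record: {statistics.mean(promoted_pre):.2f}")
print(f"promoted, new-job record: {statistics.mean(promoted_post):.2f}")
```

The promoted group was selected precisely because their records overshot their ability, so on average their new-job numbers regress toward the true 50%, and the new managers disappoint.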

…The present results reinforce the idea that indistinguishability among managers is a joint property of the individuals being evaluated and the process by which they are evaluated. Performance sampling models show how careers may be the consequences of erroneous interpretations of variations in performance produced by equivalent managers. But they also indicate that the same pattern of careers could be the consequence of unreliable evaluation of managers who do, in fact, differ, or of managers who do, in fact, learn over the course of their experience.

But hold on a second before you stop promoting new managers (who, by definition, have a limited sample size).

I’m not sure that sample size alone is the right way to think about this.

Consider two managers, A and B, who are up for promotion. Manager A has 10 years of experience and is an “all-star” (that is, great performance with little variation across observations). Manager B, on the other hand, has only 5 years of experience but has shown a lot of variance in performance.

If you had to hire someone, you’d likely pick A. But it’s important not to misinterpret March and March’s results, and to dig a little deeper.

What if we add one more variable to our two managers?

Manager A’s job has been “easy” whereas Manager B took a very “tough” assignment.

With this in mind, it seems reasonable to conclude that Manager B’s variance in performance could be explained by the difficulty of their task. This could also explain the lack of variance in Manager A’s performance.

Some jobs are tougher than others.

If you don’t factor in degree-of-difficulty you’re missing something big and sending a message to your workforce that discourages people from taking difficult assignments.

Measuring performance over a meaningful sample size is the key to distinguishing between luck and skill. When in doubt, go with the person who has excelled across a wider range of difficulty.