Category: Thought and Opinion

Opinion Warning Signs

Robin Hanson makes a list of “Signs that your opinions function more to signal loyalty and ability than to estimate truth:”

  1. You find it hard to be enthusiastic for something until you know that others oppose it.
  2. You have little interest in getting clear on what exactly is the position being argued.
  3. Realizing that a topic is important and neglected doesn’t make you much interested.
  4. You have little interest in digging to bigger topics behind commonly argued topics.
  5. You are less interested in a topic when you don’t foresee being able to talk about it.
  6. You are uncomfortable taking a position near the middle of the opinion distribution.
  7. You are uncomfortable taking a position of high uncertainty about who is right.
  8. You care far more about current nearby events than similar distant or past/future events.
  9. You find it easy to conclude that those who disagree with you are insincere or stupid.
  10. You are reluctant to change your publicly stated positions in response to new info.
  11. You are reluctant to agree with a rival’s claim, even if you had no prior opinion on the topic.
  12. You are reluctant to take a position that raises the status of rivals.
  13. You care more about consistency between your beliefs than about belief accuracy.
  14. You go easy on sloppy arguments by folks on “your side.”
  15. You have little interest in practical concrete implications of commonly argued topics.
  16. Your opinion doesn’t much change after talking with smart folks who know more.
  17. You are especially eager to drop names when explaining positions and arguments.
  18. You find it hard to list weak points and counter-arguments on your positions.
  19. You feel passionately about a topic, but haven’t sought out much evidence.
  20. You are reluctant to not have an opinion on commonly discussed topics.
  21. More?

Tyler Cowen adds: You feel uncomfortable taking a position which raises the status of the people you usually disagree with.

Hiring and the Mismatch Problem

“We want to cling to these incredibly outdated and simplistic measures of ability.”
Malcolm Gladwell

***

Hiring is difficult, and we tend to fall back on antiquated tools that give us a number (something, anything) to help us evaluate potential employees. This creates what Malcolm Gladwell calls “mismatch problems”: the criteria for evaluating job candidates are out of step with the actual demands of the job.

Of course, we never think our criteria are out of step.

The mismatch problem shows up all over the sports world. Although it dates from 2008, the video below illustrates a point Gladwell has long made: sports combines (events professional sports leagues hold so scouts can evaluate potential draftees with a battery of “tests”) don’t work.

Gladwell’s results echo what Michael Lewis describes in Moneyball: combines are a poor predictor of ultimate success. And mismatch problems transcend the sports world.

Teachers are another example. We tend to evaluate teachers on test scores, degrees, and other credentials, yet these make little difference in how well people actually teach.

Some companies, like Google, are trying to attack this problem. Google looked for traits that its ‘great’ existing employees share. When it finds a correlation (say, most people who score 9/10 on performance reviews own a dog), it tries to work that into its hiring. By constantly evaluating the actual results of its hiring, rethinking how it hires, and removing questions and evaluations that have no bearing on actual performance, Google is taking steps to eliminate the mismatch problem.
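To make the idea concrete, here is a minimal sketch of what such an evaluation loop might look like. The data, the attribute names, and the dog-ownership signal are all hypothetical; the point is only that each hiring criterion gets tested against measured performance rather than tradition.

```python
# Toy sketch of testing hiring criteria against actual performance,
# in the spirit of the approach described above.
# All data and attribute names below are hypothetical.

def pearson(xs, ys):
    """Pearson correlation between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Each row: one employee's hiring-time signals and a later review score.
employees = [
    # (gpa, brainteaser_score, owns_dog, perf_review)
    (3.9, 9, 0, 5.1),
    (3.2, 4, 1, 8.7),
    (3.6, 8, 0, 4.9),
    (2.9, 3, 1, 9.0),
    (3.4, 6, 1, 7.5),
    (3.8, 7, 0, 5.2),
]

signals = {
    "gpa": [e[0] for e in employees],
    "brainteaser_score": [e[1] for e in employees],
    "owns_dog": [e[2] for e in employees],
}
performance = [e[3] for e in employees]

# Criteria that barely correlate with performance are candidates
# for removal from the hiring process.
for name, values in signals.items():
    print(f"{name}: r = {pearson(values, performance):+.2f}")
```

A real pipeline would need far more data, out-of-sample validation, and attention to confounders, but the loop is the same: measure, correlate, and drop the criteria that don’t predict.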

Google also knows hiring lacks certainty; it is simply trying to continuously improve and refine the process. Interestingly, very few workforces are so evidence-based. Rather, the argument becomes that hiring works because it has always ‘worked’…

So why do mismatch problems exist?

Because we desire certainty. We want to impose certainty on something that is not, by nature, certain. The increasing complexity of modern jobs doesn’t help either.

“The craving for that physics-style precision does nothing but get you in terrible trouble.”

See the video here.

Interested in learning more? Check out measurements that mislead.

Malcolm Gladwell is the New York Times bestselling author of Blink: The Power of Thinking Without Thinking; The Tipping Point: How Little Things Can Make a Big Difference; Outliers: The Story of Success; and What the Dog Saw: And Other Adventures.

Taleb: The Fooled by Randomness Effect and the Internet Diet?

In this brief article, Nassim Taleb (of Black Swan fame) touches on information, complexity, the fooled-by-randomness effect, overconfidence, and signal and noise.

THE DEGRADATION OF PREDICTABILITY — AND KNOWLEDGE

I used to think that the problem of information is that it turns Homo sapiens into fools — we gain disproportionately in confidence, particularly in domains where information is wrapped in a high degree of noise (say, epidemiology, genetics, economics, etc.). So we end up thinking that we know more than we do, which, in economic life, causes foolish risk taking. When I started trading, I went on a news diet and I saw things with more clarity. I also saw how people built too many theories based on sterile news, the fooled by randomness effect. But things are a lot worse. Now I think that, in addition, the supply and spread of information turns the world into Extremistan (a world I describe as one in which random variables are dominated by extremes, with Black Swans playing a large role in them). The Internet, by spreading information, causes an increase in interdependence, the exacerbation of fads (bestsellers like Harry Potter and runs on the banks become planetary). Such a world is more “complex”, more moody, much less predictable.
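Taleb’s Mediocristan/Extremistan distinction is easy to demonstrate numerically. The following toy simulation (my illustration, not Taleb’s; the distribution choices and parameters are arbitrary assumptions) compares a thin-tailed quantity like height with a fat-tailed one like book sales, asking what share of the total the single largest observation accounts for.

```python
# Toy contrast between thin-tailed (Gaussian) and fat-tailed (Pareto)
# samples: in the latter, a single extreme can dominate the total.
# Distribution choices and parameters are illustrative assumptions.

import random

random.seed(42)
N = 100_000

# "Mediocristan": e.g. human heights in cm.
heights = [random.gauss(170, 10) for _ in range(N)]

# "Extremistan": e.g. book sales, modeled here as a Pareto
# distribution with a heavy tail (tail index 1.1).
sales = [random.paretovariate(1.1) for _ in range(N)]

for name, xs in (("heights", heights), ("sales", sales)):
    share = max(xs) / sum(xs)
    print(f"{name}: largest single observation = {share:.3%} of the total")
```

With the Gaussian sample, the largest observation is a vanishing fraction of the total; with the Pareto sample, it can be a sizable share. That is exactly the sense in which Extremistan is “dominated by extremes” and hard to predict.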

So consider the explosive situation: more information (particularly thanks to the Internet) causes more confidence and illusions of knowledge while degrading predictability.

Look at this current economic crisis that started in 2008: there are about a million persons on the planet who identify themselves in the field of economics. Yet just a handful realized the possibility and depth of what could have taken place and protected themselves from the consequences. At no time in the history of mankind have we lived under so much ignorance (easily measured in terms of forecast errors) coupled with so much intellectual hubris. At no point have we had central bankers missing elementary risk metrics, like debt levels, that even the Babylonians understood well.

I recently talked to a scholar of rare wisdom and erudition, Jon Elster, who, upon exploring themes from social science, integrates insights from all authors in the corpus of the past 2500 years, from Cicero and Seneca to Montaigne and Proust. He showed me how Seneca had a very sophisticated understanding of loss aversion. I felt guilty for the time I spent on the Internet. Upon getting home I found in my mail a volume of posthumous essays by Bishop Pierre-Daniel Huet called Huetiana, put together by his admirers c. 1722. It is so saddening to realize that, being born close to four centuries after Huet, and having done most of my reading with material written after his death, I am not much more advanced in wisdom than he was — moderns at the upper end are no wiser than their equivalent among the ancients; if anything, much less refined.

So I am now on an Internet diet, in order to understand the world a bit better — and make another bet on horrendous mistakes by economic policy makers. I am not entirely deprived of the Internet; this is just a severe diet, with strict rationing. True, technologies are the greatest things in the world, but they have way too monstrous side effects — and ones rarely seen ahead of time. And since spending time in the silence of my library, with little informational pollution, I can feel harmony with my genes; I feel I am growing again.

Related: Noise Vs. Signal

Source

Taleb: The Risk Externalities of Too Big to Fail

“Too Big to Fail” is a dilemma that has plagued economists, policy makers, and the public at large. In Nassim Taleb’s latest paper (with co-author Charles S. Tapiero), he takes a look.

Abstract

This paper examines the risk externalities stemming from the size of institutions. Assuming (conservatively) that a firm risk exposure is limited to its capital while its external (and random) losses are unbounded we establish a condition for a firm to be too big to fail. In particular, expected risk externalities’ losses conditions for positive first and second derivatives with respect to the firm capital are derived. Examples and analytical results are obtained based on firms’ random effects on their external losses (their risk externalities) and policy implications are drawn that assess both the effects of “too big to fail firms” and their regulation.

The conclusion is worth reading even if you don’t read the paper. A small tease:

However, the non-transparent bonuses that CEOs of large banks award themselves, while not a factor in bank failures, are a violation of the trust signaled by the incentives that banks have created to maintain the payments they distribute to themselves. For these reasons, too big to fail banks may entail too large to bear risk externalities. The results we have obtained indicate that this is a fact when banks’ internal risks have an extreme probability distribution (as is often the case in VaR studies) and when external risks follow an unbounded Pareto distribution.
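To see why an unbounded Pareto distribution for external losses is so alarming, a standard textbook calculation (not taken from the paper itself) is enough:

```latex
% Expected loss for a Pareto tail with scale x_m and tail index \alpha.
% Density: f(l) = \alpha x_m^{\alpha} / l^{\alpha+1} for l \ge x_m.
\mathbb{E}[L] \;=\; \int_{x_m}^{\infty} l \cdot \frac{\alpha x_m^{\alpha}}{l^{\alpha+1}}\, dl
\;=\;
\begin{cases}
\dfrac{\alpha x_m}{\alpha - 1}, & \alpha > 1,\\[6pt]
\infty, & \alpha \le 1.
\end{cases}
```

The firm’s own downside is capped at its capital, but if external losses have a tail index at or below 1, their expected value is infinite, so no finite capital buffer can make society whole. That asymmetry is the core of the “too large to bear” externality.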

Paper

Unskilled and Unaware of It: How Difficulties in Recognizing One’s Own Incompetence Lead to Inflated Self-Assessments

We tend to feel we’re more able and smarter than we really are. We think we’re above-average drivers and above-average investors, and that we make better decisions than everyone else.

According to the study, by Justin Kruger and David Dunning, this occurs, in part, because the incompetent “suffer a dual burden: Not only do these people reach erroneous conclusions and make unfortunate choices, but their incompetence robs them of the metacognitive ability to realize it.”

“Ignorance more frequently begets confidence than does knowledge.” (Charles Darwin, quoted in the study)

The study goes on to make several key points:

  • In many domains in life, success and satisfaction depend on knowledge, wisdom, or savvy in knowing which rules to follow and which strategies to pursue.
  • People differ widely in the knowledge and strategies they apply in these domains with varying levels of success. Some of the knowledge and theories that people apply to their actions are sound and meet with favorable results.
  • When people are incompetent in the strategies they adopt to achieve success and satisfaction, they suffer a dual burden: Not only do they reach erroneous conclusions and make unfortunate choices, but their incompetence robs them of the ability to realize it.

The authors conclude that the skills needed to be competent in a domain are often the same skills needed to accurately evaluate competence in that domain. The better we are at something, the better we can judge our own ability at it. Because of this, incompetent individuals tend to exaggerate their ability more than competent ones do.
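A crude way to see the mechanism is a toy model in which metacognitive accuracy itself scales with skill. Everything below (the 70th-percentile “flattering default”, the linear insight weighting) is an invented assumption for illustration, not the study’s method, but it reproduces the study’s signature pattern: the bottom quartile overestimates wildly while the top quartile slightly underestimates.

```python
# Toy model of the "dual burden": self-assessment blends true skill
# with a flattering default, and the weight on the truth grows with
# skill. All modeling choices here are illustrative assumptions.

import random

random.seed(0)
people = [random.uniform(0, 100) for _ in range(10_000)]  # true percentile

def self_assessment(skill):
    insight = skill / 100                        # metacognition tracks competence
    return insight * skill + (1 - insight) * 70  # 70 = flattering default

quartile_errors = {q: [] for q in range(4)}
for skill in people:
    q = min(int(skill // 25), 3)
    quartile_errors[q].append(self_assessment(skill) - skill)

for q in range(4):
    errs = quartile_errors[q]
    print(f"quartile {q + 1}: mean self-assessment error = "
          f"{sum(errs) / len(errs):+.1f} percentile points")
```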

The Mismatch Problem

In this video, Malcolm Gladwell speaks on the challenge of hiring in the modern world.

One of those challenges, the mismatch problem, happens when the criteria we use to judge someone for a job are radically out of step with the actual demands of the job itself. Despite our best intentions, we do this all the time. Gladwell says, “We want to cling to these incredibly outdated and simplistic measures of ability.”

Why do mismatch problems exist?
1. Our desire for certainty: we want to impose certainty on something that is not certain.
2. The increasing complexity of professions.

“The craving for that physics-style precision does nothing but get you in terrible trouble.”

See more on the mismatch problem.

Malcolm Gladwell is a staff writer at The New Yorker and the author of The Tipping Point: How Little Things Can Make a Big Difference, Blink, Outliers, and most recently, What the Dog Saw.