
Learning How to Learn: My Conversation With Barbara Oakley

In this interview, Barbara Oakley, 8-time author and creator of Learning How to Learn, an online course with over a million enrolled students, shares the science and strategies to learn more quickly, overcome procrastination, and get better at practically anything.

***

Just when I start to think I’m using my time well and getting a lot done in my life, I meet someone like Barbara Oakley.

Barbara is a true polymath. She was a captain in the U.S. Army, a Russian translator on Soviet trawlers, a radio operator at the South Pole, an engineer, university professor, researcher and the author of 8 books.

Oh, and she is also the creator and instructor of Learning How to Learn, the most popular Massive Open Online Course (MOOC) ever(!), with over one million enrolled students.

In this fascinating interview, we cover many aspects of learning, including how to make it stick so we remember more and forget less, how to be more efficient so we learn more quickly, and how to remove the barriers that get in the way of effective learning.

Specifically, Barbara covers:

  • How she changed her brain from hating math and science to loving it so much she now teaches engineering to college students
  • What neuroscience can tell us about how to learn more effectively
  • The two modes of your brain and how that impacts what and how you learn
  • Why backing off can sometimes be the best thing you can do when learning something new
  • How to “chunk” your learning so new knowledge is woven into prior knowledge making it easily accessible
  • The best ways to develop new patterns of learning in our brains
  • How to practice a skill so you can blast through plateaus and improve more quickly
  • Her favorite tactic for dealing with procrastination so you can spend more time learning
  • The activities she recommends that rapidly increase neural connections like fertilizer on the brain
  • Whether memorization still has a place in learning, or is simply a barrier to true understanding
  • The truth about “learning types” and how identifying as a visual or auditory learner might be setting you up for failure.

…and a whole lot more.

If you want to be the most efficient learner you can be, and have more fun doing it, you won’t want to miss this discussion.

Listen

Transcript
An edited copy of this transcript is available to members of our learning community or for purchase separately ($7).

If you liked this, check out all the episodes of The Knowledge Project.

***

Members can discuss this on the Learning Community Forum.

Half Life: The Decay of Knowledge and What to Do About It

Understanding the concept of a half-life will change what you read and how you invest your time. It will explain why our careers are increasingly specialized and offer a look into how we can compete more effectively in a very crowded world.

The Basics

A half-life is the time taken for something to halve its quantity. The term is most often used in the context of radioactive decay, which occurs when unstable atomic particles lose energy. Twenty-nine elements are known to be capable of undergoing this process. Information also has a half-life, as do drugs, marketing campaigns, and all sorts of other things. We see the concept in any area where the quantity or strength of something decreases over time.

Radioactive decay is random, and measured half-lives are based on the most probable rate. We know that a nucleus will decay at some point; we just cannot predict when. It could be anywhere between instantaneous and the total age of the universe. Although scientists have defined half-lives for different elements, the exact moment at which any individual nucleus decays is random.

Half-lives of elements vary tremendously. For example, the most common isotope of carbon is stable and does not decay at all, which is part of why carbon can serve as a building block of living organisms, whereas carbon-14 has a half-life of about 5,730 years. Different isotopes of the same element can have very different half-lives.

Three main types of nuclear decay have been identified: alpha, beta, and gamma. Alpha decay occurs when a nucleus splits into two parts: a helium nucleus and the remainder of the original nucleus. Beta decay occurs when a neutron in the nucleus of an element changes into a proton. The result is that it turns into a different element, such as when potassium decays into calcium. Beta decay also releases a neutrino — a particle with virtually no mass. If a nucleus emits radiation without experiencing a change in its composition, it is subject to gamma decay. Gamma radiation contains an enormous amount of energy.

The Discovery of Half-Lives

The discovery of half-lives (and alpha and beta radiation) is credited to Ernest Rutherford, one of the most influential physicists of his time. Rutherford was at the forefront of this major discovery when he worked with physicist Joseph John Thomson on complementary experiments leading to the discovery of electrons. Rutherford recognized the potential of what he was observing and began researching radioactivity. Two years later, he identified the distinction between alpha and beta rays. This led to his discovery of half-lives, when he noticed that samples of radioactive materials took the same amount of time to decay by half. By 1902, Rutherford and his collaborators had a coherent theory of radioactive decay (which they called “atomic disintegration”). They demonstrated that radioactive decay enabled one element to turn into another — research which would earn Rutherford a Nobel Prize. A year later, he spotted the missing piece in the work of the chemist Paul Villard and named the third type of radiation gamma.

Half-lives are based on probabilistic thinking. If the half-life of an element is seven days, it is most probable that half of the atoms will have decayed in that time. For a large number of atoms, we can expect half-lives to be fairly consistent. It’s important to note that radioactive decay is based on the element itself, not the quantity of it. By contrast, in other situations, the half-life may vary depending on the amount of material. For example, the half-life of a chemical someone ingests might depend on the quantity.

In biology, a half-life is the time taken for a substance to lose half its effects. The most obvious instance is drugs; the half-life is the time it takes for their effect to halve, or for half of the substance to leave the body. The half-life of caffeine is around 6 hours, but (as with most biological half-lives) numerous factors can alter that number. People with compromised liver function or certain genes will take longer to metabolize caffeine. Consumption of grapefruit juice has been shown in some studies to slow caffeine metabolism. It takes around 24 hours for a dose of caffeine to fully leave the body.
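To make the arithmetic concrete, the fraction of a substance remaining after a given time follows the same exponential-decay rule used for radioactive elements. Here is a minimal sketch, using the rough six-hour caffeine half-life mentioned above (individual metabolism varies widely, so these numbers are only illustrative):

```python
def fraction_remaining(hours_elapsed, half_life_hours):
    """Fraction of a substance left after a given time, assuming simple exponential decay."""
    return 0.5 ** (hours_elapsed / half_life_hours)

# Roughly how much of a single dose of caffeine is still circulating later on,
# assuming a six-hour half-life.
for hours in (6, 12, 24):
    print(f"{hours:2d} h later: {fraction_remaining(hours, 6):.1%} remaining")
# 6 h later: 50.0% remaining
# 12 h later: 25.0% remaining
# 24 h later: 6.2% remaining
```

By this arithmetic, roughly 6 percent of a dose is still circulating after 24 hours, which is why “fully leave the body” is best read as “drop to a negligible level.”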

The half-lives of drugs vary from a few seconds to several weeks. To complicate matters, biological half-lives vary for different parts of the body. Lead has a half-life of around a month in the blood, but a decade in bone. Plutonium in bone has a half-life of a century — more than double the time for the liver.

Marketers refer to the half-life of a campaign — the time taken to receive half the total responses. Unsurprisingly, this time varies among media. A paper catalog may have a half-life of about three weeks, whereas a tweet might have a half-life of a few minutes. Calculating this time is important for establishing how frequently a message should be sent.
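If you log responses over time, estimating a campaign's half-life is simply a matter of finding when the running total crosses half of the eventual total. A rough sketch, with made-up daily figures for illustration:

```python
from itertools import accumulate

def campaign_half_life(responses_per_period):
    """Return the first period by which at least half of all responses had arrived."""
    total = sum(responses_per_period)
    for period, running_total in enumerate(accumulate(responses_per_period), start=1):
        if running_total >= total / 2:
            return period

# Hypothetical daily response counts for a paper mailing.
daily_responses = [120, 95, 70, 40, 25, 15, 10, 5]
print(campaign_half_life(daily_responses), "days")  # -> 2 days
```

The same approach works for a tweet or an email blast; only the size of the time buckets changes.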

“Every day that we read the news we have the possibility of being confronted with a fact about our world that is wildly different from what we thought we knew.”

— Samuel Arbesman

The Half-Life of Facts

In The Half-Life of Facts: Why Everything We Know Has an Expiration Date, Samuel Arbesman (see our Knowledge Project interview) posits that facts decay over time until they are no longer facts or perhaps no longer complete. According to Arbesman, information has a predictable half-life: the time taken for half of it to be replaced or disproved. Over time, one group of facts replaces another. As our tools and knowledge become more advanced, we can discover more — sometimes new things that contradict what we thought we knew, sometimes nuances about old things. Sometimes we discover a whole area that we didn’t know about.

The rate of these discoveries varies. Our body of engineering knowledge changes more slowly, for example, than does our body of psychological knowledge.

Arbesman studied the nature of facts. The field was born in 1947, when mathematician Derek J. de Solla Price was arranging his complete set of the Philosophical Transactions of the Royal Society. Price noted something surprising: the sizes of the volumes fit an exponential curve. His curiosity piqued, he began to see whether the same curve applied to science as a whole. Price established that the quantity of scientific data available was doubling every 15 years. This meant that some of the information had to be rendered obsolete with time.

Scientometrics shows us that facts are always changing, and much of what we know is (or soon will be) incorrect. Indeed, much of the available published research, however often it is cited, has never been reproduced and cannot be considered true. In a controversial paper entitled “Why Most Published Research Findings Are False,” John Ioannidis covers the rampant nature of poor science. Many researchers are incentivized to find results that will please those giving them funding. Intense competition makes it essential to find new information, even if it is found in a dubious manner. Yet we all have a tendency to turn a blind eye when beliefs we hold dear are disproved and to pay attention only to information confirming our existing opinions.

As an example, Arbesman points to the number of chromosomes in a human cell. Up until 1965, 48 was the accepted number that medical students were taught. (In 1953, it had been declared an established fact by a leading cytologist). Yet in 1956, two researchers, Joe Hin Tjio and Albert Levan, made a bold assertion. They declared the true number to be 46. During their research, Tjio and Levan could never find the number of chromosomes they expected. Discussing the problem with their peers, they discovered they were not alone. Plenty of other researchers found themselves two chromosomes short of the expected 48. Many researchers even abandoned their work because of this perceived error. But Tjio and Levan were right (for now, anyway). Although an extra two chromosomes seems like a minor mistake, we don’t know the opportunity costs of the time researchers invested in faulty hypotheses or the value of the work that was abandoned. It was an emperor’s-new-clothes situation, and anyone counting 46 chromosomes assumed they were the ones making the error.

As Arbesman puts it, facts change incessantly. Many of us have seen the ironic (in hindsight) doctor-endorsed cigarette ads from the past. A glance at a newspaper will doubtless reveal that meat or butter or sugar has gone from deadly to saintly, or vice versa. We forget that laughable, erroneous beliefs people once held are not necessarily any different from those we now hold. The people who believed that the earth was the center of the universe, or that some animals appeared out of nowhere or that the earth was flat, were not stupid. They just believed facts that have since decayed. Arbesman gives the example of a dermatology test that had the same question two years running, with a different answer each time. This is unsurprising considering the speed at which our world is changing.

As Arbesman points out, in the last century the world’s population has swelled from 2 billion to 7 billion, we have taken on space travel, and we have altered the very definition of science.

Our world seems to be in constant flux. With our knowledge changing all the time, even the most informed people can barely keep up. All this change may seem random and overwhelming (Dinosaurs have feathers? When did that happen?), but it turns out there is actually order within the shifting noise. This order is regular and systematic and is one that can be described by science and mathematics.

The order Arbesman describes mimics the decay of radioactive elements. Whenever new information is discovered, we can be sure it will break down and be proved wrong at some point. As with a radioactive atom, we don’t know precisely when that will happen, but we know it will occur at some point.

If we zoom out and look at a particular body of knowledge, the random decay becomes orderly. Through probabilistic thinking, we can predict the half-life of a group of facts with the same certainty with which we can predict the half-life of a radioactive atom. The problem is that we rarely consider the half-life of information. Many people assume that whatever they learned in school remains true years or decades later. Medical students who learned in university that cells have 48 chromosomes would not learn later in life that this is wrong unless they made an effort to do so.

OK, so we know that our knowledge will decay. What do we do with this information? Arbesman says,

… simply knowing that knowledge changes like this isn’t enough. We would end up going a little crazy as we frantically tried to keep up with the ever changing facts around us, forever living on some sort of informational treadmill. But it doesn’t have to be this way because there are patterns. Facts change in regular and mathematically understandable ways. And only by knowing the pattern of our knowledge evolution can we be better prepared for its change.

Recent initiatives have sought to calculate the half-life of an academic paper. Ironically, academic journals have largely neglected research into how people use them and how best to fund the efforts of researchers. Research by Philip Davis shows the time taken for a paper to receive half of its total downloads. Davis’s results are compelling. While most forms of media have a half-life measured in days or even hours, 97 percent of academic papers have a half-life longer than a year. Engineering papers have a slightly shorter half-life than other fields of research, with double the average (6 percent) having a half-life of under a year. This makes sense considering what we looked at earlier in this post. Health and medical publications have the shortest overall half-life: two to three years. Physics, mathematics, and humanities publications have the longest half-lives: two to four years.

The Half-Life of Secrets

According to Peter Swire, writing in “The Declining Half-Life of Secrets,” the half-life of secrets (by which Swire generally means classified information) is shrinking. In the past, a government secret could be kept for over 25 years. Nowadays, hacks and leaks have shrunk that time considerably. Swire writes:

During the Cold War, the United States developed the basic classification system that exists today. Under Executive Order 13526, an executive agency must declassify its documents after 25 years unless an exception applies, with stricter rules if documents stay classified for 50 years or longer. These time frames are significant, showing a basic mind-set of keeping secrets for a time measured in decades.

Swire notes that there are three main causes: “the continuing effects of Moore’s Law — or the idea that computing power doubles every two years, the sociology of information technologists, and the different source and methods for signals intelligence today compared with the Cold War.” One factor is that spreading leaked information is easier than ever. In the past, it was often difficult to get information published. Newspapers feared legal repercussions if they shared classified information. Anyone can now release secret information, often anonymously, as with WikiLeaks. Governments cannot as easily rely on media gatekeepers to cover up leaks.

Rapid changes in technology or geopolitics often reduce the value of classified information, so the value of some, but not all, classified information also has a half-life. Sometimes it’s days or weeks, and sometimes it’s years. For some secrets, it’s not worth investing the massive amount of computer time that would be needed to break them because by the time you crack the code, the information you wanted to know might have expired.

(As an aside, if you were to invert the problem of all these credit card and SSN leaks, you might conclude that reducing the value of possessing this information would be more effective than spending money to secure it.)

“Our policy (at Facebook) is literally to hire as many talented engineers as we can find. The whole limit in the system is that there are not enough people who are trained and have these skills today.”

— Mark Zuckerberg

The Half-Lives of Careers and Business Models

The issue with information having a half-life should be obvious. Many fields depend on individuals with specialized knowledge, learned through study or experience or both. But what if those individuals are failing to keep up with changes and clinging to outdated facts? What if your doctor is offering advice that has been rendered obsolete since they finished medical school? What if your own degree or qualifications are actually useless? These are real problems, and knowing about half-lives will help you make yourself more adaptable.

While figures for the half-lives of most knowledge-based careers are hard to find, we do know the half-life of an engineering career. A century ago, it would take 35 years for half of what an engineer learned when earning their degree to be disproved or replaced. By the 1960s, that time span shrank to a mere decade. Today that figure is probably even lower.

In a 1966 paper entitled “The Dollars and Sense of Continuing Education,” Thomas Jones calculated the effort that would be required for an engineer to stay up to date, assuming a 10-year half-life. According to Jones, an engineer would need to devote at least five hours per week, 48 weeks a year, to stay up to date with new advancements. A typical degree requires about 4800 hours of work. Within 10 years, the information learned during 2400 of those hours would be obsolete. The five-hour figure does not include the time necessary to revise forgotten information that is still relevant. A 40-year career as an engineer would require 9600 hours of independent study.

Keep in mind that Jones made his calculations in the 1960s. Modern estimates place the half-life of an engineering degree at between 2.5 and 5 years, requiring between 10 and 20 hours of study per week. Welcome to the treadmill, where you have to run faster and faster so that you don’t fall behind.
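Jones's numbers are easy to reproduce with a back-of-the-envelope calculation: if half of a roughly 4,800-hour degree decays each half-life, staying current means relearning about 2,400 hours' worth per half-life, spread over the available study weeks. A minimal sketch of that calculation (it ignores, as Jones noted, the time needed to revise still-relevant material):

```python
def weekly_study_hours(degree_hours=4800, half_life_years=10, study_weeks_per_year=48):
    """Hours per week needed to replace the half of one's training that decays each half-life."""
    hours_to_relearn = degree_hours / 2            # knowledge rendered obsolete per half-life
    hours_per_year = hours_to_relearn / half_life_years
    return hours_per_year / study_weeks_per_year

print(f"{weekly_study_hours():.1f} h/week at a 10-year half-life")                   # ~5.0
print(f"{weekly_study_hours(half_life_years=2.5):.1f} h/week at a 2.5-year half-life")  # ~20.0
```

At a 2.5-year half-life the same arithmetic lands at the top of the 10-to-20-hour range quoted above.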

Unsurprisingly, putting in this kind of time is simply impossible for most people. The result is an ever-shorter typical engineering career and a bias towards hiring recent graduates. A partial escape from this time-consuming treadmill that offers little progress is to recognize the continuous need for learning. Once you accept that, it becomes easier to devote time and attention to developing heuristics and systems that foster learning. The faster the pace of knowledge change, the more valuable the skill of learning becomes.

A study by PayScale found that the median age of workers in most successful technology companies is substantially lower than that of other industries. Of 32 companies, just six had a median worker age above 35, despite the average across all workers being just over 42. Eight of the top companies had a median worker age of 30 or below — 28 for Facebook, 29 for Google, and 26 for Epic Games. The upshot is that salaries are high for those who can stay current while gaining years of experience.

In a similar vein, business models have ever-shrinking half-lives. The nature of capitalism is that you have to be better this year than you were last year — not to gain market share but to maintain what you already have. If you want to get ahead, you need asymmetry; otherwise, you get lost in trench warfare. How long would it take for half of Uber's or Facebook's business model to become irrelevant? It's hard to imagine it being more than a couple of years or even months.

In The Business Model Innovation Factory: How to Stay Relevant When the World Is Changing, Saul Kaplan highlights the changing half-lives of business models. In the past, models could last for generations. The majority of CEOs oversaw a single business for their entire careers. Business schools taught little about agility or pivoting. Kaplan writes:

During the industrial era once the basic rules for how a company creates, delivers, and captures value were established[,] they became etched in stone, fortified by functional silos, and sustained by reinforcing company cultures. All of a company’s DNA, energy, and resources were focused on scaling the business model and beating back competition attempting to do a better job executing the same business model. Companies with nearly identical business models slugged it out for market share within well-defined industry sectors.

[…]

Those days are over. The industrial era is not coming back. The half-life of a business model is declining. Business models just don’t last as long as they used to. In the twenty-first century business leaders are unlikely to manage a single business for an entire career. Business leaders are unlikely to hand down their businesses to the next generation of leaders with the same business model they inherited from the generation before.

The Burden of Knowledge

The flip side of a half-life is the time it takes for something to double. A useful guideline for calculating doubling time is to divide 70 by the percentage rate of growth. This formula isn't perfect, but it gives a good indication. Known as the Rule of 70, it applies only to exponential growth when the relative growth rate remains consistent, such as with compound interest.

The higher the rate of growth, the shorter the doubling time. For example, if the population of a city is increasing by 2 percent per year, we divide 70 by 2 to get a doubling time of 35 years. The rule of 70 is a useful heuristic; population growth of 2 percent might seem low, but your perspective might change when you consider that the city’s population could double in just 35 years. The Rule of 70 can also be used to calculate the time for an investment to double in value; for example, $100 at 7 percent compound interest will double in just a decade and quadruple in 20 years. The average newborn baby doubles its birth weight in under four months. The average doubling time for a tumor is also four months.
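The Rule of 70 is just a convenient approximation of the exact doubling-time formula: a quantity growing at rate r per period doubles when (1 + r)^t = 2, so t = ln 2 / ln(1 + r), which is close to 70 divided by the percentage growth rate when r is small. A quick comparison of the rule against the exact figure:

```python
import math

def doubling_time_exact(rate):
    """Exact number of periods to double at a constant growth rate (e.g. 0.07 for 7%)."""
    return math.log(2) / math.log(1 + rate)

def doubling_time_rule_of_70(rate):
    """Rule-of-70 approximation, using the percentage growth rate."""
    return 70 / (rate * 100)

for pct in (2, 7, 10):
    r = pct / 100
    print(f"{pct}% growth: rule of 70 -> {doubling_time_rule_of_70(r):.1f}, "
          f"exact -> {doubling_time_exact(r):.1f} periods")
# 2% growth: rule of 70 -> 35.0, exact -> 35.0 periods
# 7% growth: rule of 70 -> 10.0, exact -> 10.2 periods
# 10% growth: rule of 70 -> 7.0, exact -> 7.3 periods
```

The approximation drifts only slightly even at double-digit growth rates, which is why the mental shortcut is good enough for most purposes.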

We can see how information changes in the figures for how long it takes for a body of knowledge to double in size. The figures quoted by Arbesman (drawn from Little Science, Big Science … and Beyond by Derek J. de Solla Price) are compelling, including:

  • Time for the number of entries in a dictionary of national biographies to double: 100 years
  • Time for the number of universities to double: 50 years
  • Time for the number of known chemical compounds to double: 15 years
  • Time for the number of known asteroids to double: 10 years

Arbesman also gives figures for the time taken for the available knowledge in a particular field to double, including:

  • Medicine: 87 years
  • Mathematics: 63 years
  • Chemistry: 35 years
  • Genetics: 32 years

The doubling of knowledge increases the learning load over time. As a body of knowledge doubles, so does the cost of wrapping your head around what we already know. This cost is the burden of knowledge. To be the best in a general field today requires that you know more than the person who was the best only 20 years ago. Not only do you have to be better to be the best, but you also have to be better just to stay in the game.

The corollary is that because there is so much to know, we specialize in very niche areas. This makes it easier to grasp the existing body of facts, keep up to date on changes, and rise to the level of expert. The problem is that specializing also makes it easier to see the world through the narrow focus of your specialty, makes it harder to work with other people (as niches are often dominated by jargon), and makes you prone to overvalue the new and novel.

Conclusion

As we have seen, understanding how half-lives work has numerous practical applications, from determining when radioactive materials will become safe to figuring out effective drug dosages. Half-lives also show us that if we spend time learning something that changes quickly, we might be wasting our time. Like Alice in Wonderland — and a perfect example of the Red Queen Effect — we have to run faster and faster just to keep up with where we are. So if we want our knowledge to compound, we’ll need to focus on the invariant general principles.

***

Members can discuss this post on the Learning Community Forum.

The Most Respectful Interpretation

Consider this situation: You email a colleague with a question expecting a prompt response, but hours or days later you’ve yet to hear from them. Perhaps you can’t move forward on your project without their input so you find yourself blocked. How do you imagine you feel in this situation?

For many of us, situations like this result in feelings of anger, frustration, or annoyance. Maybe we take it personally and conclude that our colleague is lazy or that they don’t value our time or our work. Perhaps we send off a terse reminder asking for an update.

If we’re feeling particularly vengeful, we alert the person’s manager or mention our grievance to another colleague, looking for validation that the offending colleague is in fact lazy and disrespectful – a form of confirmation bias.

Perhaps this colleague has been slow to respond to communications in the past, thus we extrapolate that to all of their communications, a case of the fundamental attribution error.

Of course, it’s natural to feel anger and frustration when faced with these situations. But is anger the appropriate response?

In the Nicomachean Ethics Aristotle wrote about The Virtue Concerned with Anger. He begins Book IV with a description of good temper:

The man who is angry at the right things and with the right people, and, further, as he ought, when he ought, and as long as he ought, is praised. This will be the good-tempered man, then, since good temper is praised.

Aristotle tells us that anger has a time and place and that when applied to the right people and for the right reason, is justified and even praiseworthy. But we have to use anger judiciously:

For the good-tempered man tends to be unperturbed and not to be led by passion, but to be angry in the manner, at the things, and for the length of time, that reason dictates; but he is thought to err rather in the direction of deficiency; for the good-tempered man is not revengeful, but rather tends to make allowances.

In Aristotle’s description of good temper, he encourages us to err in the direction of “making allowances”. But how can we do this in practice?

Let’s return to our example.

We take our colleague’s lack of response personally and assume they are lazy or disrespectful, but it is important for us to recognize that we are assuming. We often instinctively choose to assume the worst of people, because it slips easily into mind. But what if instead we chose to assume the best?

In her book Rising Strong, Brené Brown describes how she learned to assume that people are doing the best they can and shares a concept introduced to her by Dr. Jean Kantambu Latting, a professor at the University of Houston. Brown writes:

Whenever someone would bring up a conflict with a colleague, she would ask, ‘What is the hypothesis of generosity? What is the most generous assumption you can make about this person’s intentions or what this person said?’

By pausing to reflect on our anger we can recognize that we are making a negative assumption and challenge ourselves to invert the situation and consider the opposite: “What is the most generous assumption I can make?”

Perhaps our colleague has been given a higher priority project, or they don’t understand that we’re blocked without their input. Maybe they are dealing with some personal challenges outside of the office, or they need input from somebody else to reply to our message and thus they’re blocked as well. Perhaps they’ve decided to reduce their email frequency in order to focus on important work.

When we pause to look at the situation from another angle, not only do we entertain some explanations that frame our colleagues in a more positive light, but we also put ourselves in their shoes, which is the very definition of empathy.

We’ve all had competing priorities, distractions from personal issues outside of work, miscommunications regarding the urgent need of our response, etc. Do we think others judged us fairly or unfairly in those moments?

The point is not to make excuses or avoid addressing problems with our colleagues, but that if we recognize we are making negative assumptions by default, we might need to challenge ourselves to consider more generous alternatives. This may alter the way we approach our colleague to address the situation. It takes effort and a commitment to think about people differently.

Someone who knew this best was the late, great author David Foster Wallace.

***

In his beautiful commencement speech to the Kenyon graduating class of 2005, Wallace reminds the students that the old cliché of liberal arts education teaching you to think is truer than they might want to believe. He warns that one of the biggest challenges the graduates will face in life is to challenge their self-centered view of the world – a view that we all have by default.

Using some of life’s more mundane and annoying activities like shopping and commuting, Wallace writes:

The point here is that I think this is one part of what teaching me how to think is really supposed to mean. To be just a little less arrogant. To have just a little critical awareness about myself and my certainties. Because a huge percentage of the stuff that I tend to be automatically certain of is, it turns out, totally wrong and deluded. I have learned this the hard way, as I predict you graduates will, too.

Here is just one example of the total wrongness of something I tend to be automatically sure of: everything in my own immediate experience supports my deep belief that I am the absolute centre of the universe; the realest, most vivid and important person in existence. We rarely think about this sort of natural, basic self-centeredness because it’s so socially repulsive. But it’s pretty much the same for all of us. It is our default setting, hard-wired into our boards at birth. Think about it: there is no experience you have had that you are not the absolute centre of. The world as you experience it is there in front of YOU or behind YOU, to the left or right of YOU, on YOUR TV or YOUR monitor. And so on. Other people’s thoughts and feelings have to be communicated to you somehow, but your own are so immediate, urgent, real.

Please don’t worry that I’m getting ready to lecture you about compassion or other-directedness or all the so-called virtues. This is not a matter of virtue. It’s a matter of my choosing to do the work of somehow altering or getting free of my natural, hard-wired default setting which is to be deeply and literally self-centered and to see and interpret everything through this lens of self. People who can adjust their natural default setting this way are often described as being “well-adjusted”, which I suggest to you is not an accidental term.

The recognition that we are inherently self-centered and that this affects the way in which we interpret the world seems so obvious when pointed out, but how often do we stop to consider it? This is our hard-wired default setting, so it’s quite a challenge to become willing to think differently.

As an example, Wallace describes a situation where he is disgusted by the gas guzzling Hummer in front of him in traffic. The idea of these cars offends him and he starts making assumptions about the drivers: they’re wasteful, inconsiderate of the planet, and inconsiderate of future generations.

Look, if I choose to think this way in a store and on the freeway, fine. Lots of us do. Except thinking this way tends to be so easy and automatic that it doesn’t have to be a choice. It is my natural default setting. It’s the automatic way that I experience the boring, frustrating, crowded parts of adult life when I’m operating on the automatic, unconscious belief that I am the centre of the world, and that my immediate needs and feelings are what should determine the world’s priorities.

But then he challenges himself to consider alternative interpretations, something often described as making the Most Respectful Interpretation (MRI). Wallace decides to consider more respectful interpretations of the other drivers – maybe they have a legitimate need to be driving a large SUV or to be rushing through traffic.

In this traffic, all these vehicles stopped and idling in my way, it’s not impossible that some of these people in SUV’s have been in horrible auto accidents in the past, and now find driving so terrifying that their therapist has all but ordered them to get a huge, heavy SUV so they can feel safe enough to drive. Or that the Hummer that just cut me off is maybe being driven by a father whose little child is hurt or sick in the seat next to him, and he’s trying to get this kid to the hospital, and he’s in a bigger, more legitimate hurry than I am: it is actually I who am in HIS way.

Again, please don’t think that I’m giving you moral advice, or that I’m saying you’re “supposed to” think this way, or that anyone expects you to just automatically do it, because it’s hard, it takes will and mental effort, and if you’re like me, some days you won’t be able to do it, or you just flat out won’t want to. But most days, if you’re aware enough to give yourself a choice, you can choose to look differently at this fat, dead-eyed, over-made-up lady who just screamed at her kid in the checkout line. Maybe she’s not usually like this. Maybe she’s been up three straight nights holding the hand of a husband who is dying of bone cancer. Or maybe this very lady is the low-wage clerk at the motor vehicle department, who just yesterday helped your spouse resolve a horrific, infuriating, red-tape problem through some small act of bureaucratic kindness. Of course, none of this is likely, but it’s also not impossible. It just depends what you want to consider. If you’re automatically sure that you know what reality is, and you are operating on your default setting, then you, like me, probably won’t consider possibilities that aren’t annoying and miserable. But if you really learn how to pay attention, then you will know there are other options.

A big part of learning to think is recognizing our default reactions and responses to situations — the so-called “System 1” thinking espoused by Daniel Kahneman. Learning to be “good-tempered” and “well-adjusted” requires us to try to be more self-aware, situationally aware, and to acknowledge our self-centered nature; to put the brakes on and use System 2 instead.

So the next time you find yourself annoyed with your colleagues, angry at other drivers on the road, or judgmental about people standing in line at the store, use it as an opportunity to challenge your negative assumptions and try to interpret the situation in a more respectful and generous way. You might eventually realize that the broccoli tastes good.

A Short List of Books for Doing New Things

Andrew Ng has quite the modern resume.

He co-founded Coursera, a wonderful website that gives anyone with Internet access the ability to take high-level university courses on almost any topic. He founded the Google Brain project, Google’s deep learning research effort intended to help bring about better artificial intelligence. Now he’s the Chief Scientist at Baidu Research.

Ng is, unsurprisingly, devoted to reading and learning. As he puts it,

In my own life, I found that whenever I wasn’t sure what to do next, I would go and learn a lot, read a lot, talk to experts. I don’t know how the human brain works but it’s almost magical: when you read enough or talk to enough experts, when you have enough inputs, new ideas start appearing. This seems to happen for a lot of people that I know.

When you become sufficiently expert in the state of the art, you stop picking ideas at random. You are thoughtful in how to select ideas, and how to combine ideas. You are thoughtful about when you should be generating many ideas versus pruning down ideas.

[…]

I read a lot and I also spend time talking to people a fair amount. I think two of the most efficient ways to learn, to get information, are reading and talking to experts. So I spend quite a bit of time doing both of them. I think I have just shy of a thousand books on my Kindle. And I’ve probably read about two-thirds of them.

Ng thinks innovation and creativity can be learned — that they are pattern-recognition and combinatorial creativity exercises which can be performed by an intelligent and devoted practitioner with the right approach.

He also encourages the creation of new things; new businesses, new technologies. And on that topic, Ng has a few book recommendations. Given his list of accomplishments, the quality of his mind, and his admitted devotion to reading the printed word, it seems worth our time to check out the list.

***

Zero to One

The first is “Zero to One” by Peter Thiel, a very good book that gives an overview of entrepreneurship and innovation.

Crossing the Chasm / The Lean Startup

We often break down entrepreneurship into B2B (“business to business,” i.e., businesses whose customers are other businesses) and B2C (“business to consumer”).

For B2B, I recommend “Crossing the Chasm.” For B2C, one of my favorite books is “The Lean Startup,” which takes a narrower view but it gives one specific tactic for innovating quickly. It’s a little narrow but it’s very good in the area that it covers.

Talking to Humans

Then to break B2C down even further, two of my favorites are “Talking to Humans,” which is a very short book that teaches you how to develop empathy for users you want to serve by talking to them.

Rocket Surgery Made Easy

Also, “Rocket Surgery Made Easy.” If you want to build products that are important, that users care about, this teaches you different tactics for learning about users, either through user studies or by interviews.

The Hard Thing about Hard Things

Then finally there is “The Hard Thing about Hard Things.” It’s a bit dark but it does cover a lot of useful territory on what building an organization is like.

So Good They Can’t Ignore You

For people who are trying to figure out career decisions, there’s a very interesting one: “So Good They Can’t Ignore You.” That gives a valuable perspective on how to select a path for one’s career.

How To Mentally Overachieve — Charles Darwin’s Reflections On His Own Mind

We’ve written quite a bit about the marvelous British naturalist Charles Darwin, who with his Origin of Species created perhaps the most intense intellectual debate in human history, one which continues up to this day.

Darwin’s Origin was a courageous and detailed thought piece on the nature and development of biological species. It’s the starting point for nearly all of modern biology.

But, as we’ve noted before, Darwin was not a man of pure IQ. He was not Isaac Newton, or Richard Feynman, or Albert Einstein — breezing through complex mathematical physics at a young age.

Charlie Munger thinks Darwin would have placed somewhere in the middle of a good private high school class. He was also in notoriously bad health for most of his adult life and, by his son’s estimation, a terrible sleeper. He really only worked a few hours a day in the many years leading up to the Origin of Species.

Yet his “thinking work” outclassed almost everyone. An incredible story.

In his autobiography, Darwin reflected on this peculiar state of affairs. What was he good at that led to the result? What was he so weak at? Why did he achieve better thinking outcomes? As he put it, his goal was to:

“Try to analyse the mental qualities and the conditions on which my success has depended; though I am aware that no man can do this correctly.”

In studying Darwin ourselves, we hope to better appreciate our own strengths and weaknesses, not to mention understand the working methods of a “mental overachiever.”

Let’s explore what Darwin saw in himself.

***

1. He did not have a quick intellect or an ability to follow long, complex, or mathematical reasoning. He may have been a bit hard on himself, but Darwin realized that he wasn’t a “5 second insight” type of guy (and let’s face it, most of us aren’t). His life also proves how little that trait matters if you’re aware of it and counterbalance it with other methods.

I have no great quickness of apprehension or wit which is so remarkable in some clever men, for instance, Huxley. I am therefore a poor critic: a paper or book, when first read, generally excites my admiration, and it is only after considerable reflection that I perceive the weak points. My power to follow a long and purely abstract train of thought is very limited; and therefore I could never have succeeded with metaphysics or mathematics. My memory is extensive, yet hazy: it suffices to make me cautious by vaguely telling me that I have observed or read something opposed to the conclusion which I am drawing, or on the other hand in favour of it; and after a time I can generally recollect where to search for my authority. So poor in one sense is my memory, that I have never been able to remember for more than a few days a single date or a line of poetry.

2. He did not feel easily able to write clearly and concisely. He compensated by getting things down quickly and then coming back to them later, thinking them through again and again. Slow, methodical, and ridiculously effective: for those who haven’t read it, the Origin of Species is extremely readable and clear, even now, 150 years later.

I have as much difficulty as ever in expressing myself clearly and concisely; and this difficulty has caused me a very great loss of time; but it has had the compensating advantage of forcing me to think long and intently about every sentence, and thus I have been led to see errors in reasoning and in my own observations or those of others.

There seems to be a sort of fatality in my mind leading me to put at first my statement or proposition in a wrong or awkward form. Formerly I used to think about my sentences before writing them down; but for several years I have found that it saves time to scribble in a vile hand whole pages as quickly as I possibly can, contracting half the words; and then correct deliberately. Sentences thus scribbled down are often better ones than I could have written deliberately.

3. He forced himself to be an incredibly effective and organized collector of information. Darwin’s system of reading and indexing facts in large portfolios is worth emulating, as is the habit of taking down conflicting ideas immediately.

As in several of my books facts observed by others have been very extensively used, and as I have always had several quite distinct subjects in hand at the same time, I may mention that I keep from thirty to forty large portfolios, in cabinets with labelled shelves, into which I can at once put a detached reference or memorandum. I have bought many books, and at their ends I make an index of all the facts that concern my work; or, if the book is not my own, write out a separate abstract, and of such abstracts I have a large drawer full. Before beginning on any subject I look to all the short indexes and make a general and classified index, and by taking the one or more proper portfolios I have all the information collected during my life ready for use.

4. He had possibly the most valuable trait in any sort of thinker: a passionate interest in understanding reality and putting it in useful order in his head. This “Reality Orientation” is hard to measure and certainly does not show up on IQ tests, but it probably determines, to some extent, success in life.

On the favourable side of the balance, I think that I am superior to the common run of men in noticing things which easily escape attention, and in observing them carefully. My industry has been nearly as great as it could have been in the observation and collection of facts. What is far more important, my love of natural science has been steady and ardent.

This pure love has, however, been much aided by the ambition to be esteemed by my fellow naturalists. From my early youth I have had the strongest desire to understand or explain whatever I observed,–that is, to group all facts under some general laws. These causes combined have given me the patience to reflect or ponder for any number of years over any unexplained problem. As far as I can judge, I am not apt to follow blindly the lead of other men. I have steadily endeavoured to keep my mind free so as to give up any hypothesis, however much beloved (and I cannot resist forming one on every subject), as soon as facts are shown to be opposed to it.

Indeed, I have had no choice but to act in this manner, for with the exception of the Coral Reefs, I cannot remember a single first-formed hypothesis which had not after a time to be given up or greatly modified. This has naturally led me to distrust greatly deductive reasoning in the mixed sciences. On the other hand, I am not very sceptical—a frame of mind which I believe to be injurious to the progress of science. A good deal of scepticism in a scientific man is advisable to avoid much loss of time, but I have met with not a few men, who, I feel sure, have often thus been deterred from experiment or observations, which would have proved directly or indirectly serviceable.

[…]

Therefore my success as a man of science, whatever this may have amounted to, has been determined, as far as I can judge, by complex and diversified mental qualities and conditions. Of these, the most important have been—the love of science—unbounded patience in long reflecting over any subject—industry in observing and collecting facts—and a fair share of invention as well as of common sense.

5. Most inspirational to us of average intellect, he outperformed his own mental aptitude with these good habits, surprising even himself with the results.

With such moderate abilities as I possess, it is truly surprising that I should have influenced to a considerable extent the belief of scientific men on some important points.

***

Still Interested? Read his autobiography, his The Origin of Species, or check out David Quammen’s wonderful short biography of the most important period of Darwin’s life. Also, if you missed it, check out our prior post on Darwin’s Golden Rule.

Ask Farnam Street #1

Welcome to the first incarnation of Ask Farnam Street, where we’ll be taking and answering questions on anything you’re curious about that we feel we can answer competently and honestly. This first batch of questions comes straight from our Members.

If you’d like to submit a question for our next Q&A, please send it to us at [email protected] with the title “Ask Farnam Street.” We will choose a group of the most thoughtful questions and answer them right here on the site. 

***

How do we cultivate a good balance between thinking for ourselves and building our own systems to suit our unique personalities, and learning from what other people have already discovered about the world and the systems they’ve built and shared?

This is a pretty common question in a lot of fields. Almost anyone who goes deep on trying to study the success and advice of others eventually wonders if they’ll just become a clone of someone else. But the truth of the matter is that most do eventually “find their way” – where everything you’ve learned coalesces into a system of your own. Purely aping someone else doesn’t work very well and is harder than it sounds anyway.

Here’s an exercise for anyone who likes music: Pick a musical artist you like and find out who influenced them. Then listen to those influences. Does your favorite really sound like those influences? Like, really? Almost never.

You might hear an “echo” of Robert Johnson in the Rolling Stones, but the differences between the two are night and day – the difference between country blues and rock ‘n roll!

Yet if you were to ask Keith Richards, he’d tell you the Stones started out basically doing a poor imitation of old American blues artists. But what they really did was take the soul of that music (and, I might add, early rock and rollers like Elvis and Chuck Berry), added their own spice and reality, and created something entirely new. That’s how creativity works. You don’t just create new things out of the clear blue sky – you have to start with something. Making new connections and associations is creativity.

Even Sam Walton used to say that he basically stole all of the ideas that became Wal-Mart. But what other company was really anything like Wal-Mart? It was completely unique. And why should anyone else have been like Wal-Mart – they were missing the key ingredient…Walton himself!

In these stories lies your answer. Cultivating that balance will happen naturally if you simply break down what you learn to its essence and take what is useful from it. You don’t need to outright copy anyone else, and contrary to popular belief, success isn’t simple imitation. It’s learning the principles behind what made others successful, the underlying reality being demonstrated by that success, and incorporating that reality into your worldview.

Farnam Street is about pursuing an understanding of “the way the world works.” As long as you use those systems you learn from others as a way of getting at the underlying reality – going beyond pure imitation — you will have the opportunity to “make them your own.”

Two quotes sum this up:

Take what is useful, discard what is not, add what is specifically your own.
Bruce Lee

Any truth, I maintain, is my own property.
Seneca

When Charlie [Munger] talks about knowledge across a wide range of disciplines, what are those disciplines, and which does he appear to favor?

Charlie addresses this a little bit in a speech called “A Lesson on Elementary, Worldly Wisdom As It Relates To Investment Management & Business”.

He’s talking about the basic disciplines that would make up a really good broad undergraduate curriculum: Math/Statistics, Physics, Chemistry, Biology, Engineering, Complex Systems, Psychology, Business/Economics, Law, with the more fundamental ones being generally most reliable. (1+1 always seems to come out to 2.)

Charlie seems to have made use of models across all disciplines. He probably uses psychology and biology more than most, which is a great lesson. And clearly he and Buffett have made wise use of probabilistic thinking.

But remember, in his own words, “80 or 90 models carry most of the freight” – in other words, you’re looking for the Big Ideas. Something like compound interest from mathematics or incentives from psychology explain a large fraction of what you see around you. And you always have the ability to generate new models that you think are explanatory, accurate, and memorable — that’s part of the fun.

An accurate and fluent understanding of the big models of the world should be your “first principles” — the large trunk and branches on which all of the “leaves” of your knowledge will hang. Without a big solid trunk with big solid branches, what kind of tree do you expect to have?

From there, it’s about synthesizing across the disciplines — understanding where they overlap, conflict, and combine. What do the models in biology and business have in common? What does the concept of entropy have to do with practical life? Well, a great deal. But you have to reach a bit to figure it all out. And as we talk a lot about here, you eventually find that everything seems to be connected to everything else.

Remember, all models are abstractions of reality. As George Box put it, “All models are wrong, but some are useful.”

Reality itself is simply one continuous, flowing entity, but we as humans have to work with our natural apparatus to understand it. Dividing things into little sub-disciplines is one of the ways we go about doing that. Just remember that your end-goal is to understand reality as best as possible; unfiltered and unadulterated. Any way you decide to organize your search for reality must take into account the way humans learn, but always remember that you’re abstracting reality.

How do you choose what next to read? Do you randomly pick a book off the shelf or do you let what you just read pull you towards something that it referenced so you can go deeper into a topic? Do you just wake up in the morning and say I feel like learning about.. this! and go for it? 

It’s a combination of a lot of things, but basically the underlying principle is always to follow what interests you, right now. We discuss this a few times in our course on reading.

The thing about curiosity, in the words of Nassim Taleb, is that it’s “Antifragile, like an addiction, and is magnified by attempts to satisfy it.” When you go down the curious path on a particular topic, you have to keep letting it pull you down. Don’t just stop because you feel like you should — if you want to keep going, keep going! Learn! Go deep! Trust us on this one: Ride the wave when it’s taking you. It may be a while before you get back up there.

When you decide to get off the path is really going to be an individual judgment, based on how curious you are, how competent you feel you are, and what you plan to do with that information. If you’re going to be a doctor, you have to go “all the way down the path” on the current and most up-to-date understanding of how the human body works, in great detail. Lives depend on it.

But if you’re a lawyer, you might be (rightfully) content to simply try to understand at a high-level how all the main bodily systems work and interact, without being able to do a detailed dissection of the heart. The doctor and the lawyer need not pursue their understanding of human anatomy in anywhere near the same level of detail, but they should both know the Big Ideas. Make sense?

So, long story short, what we’re reading at any given time is simply what currently grabs our curiosity; and there are innumerable ways to get it grabbed. Sometimes we will see a book on the shelf and pull it down, but more frequently it’s connected to something else we’ve read recently and decided to pursue further. Recently we recommended a biography of Will Rogers in Brain Food. Why that one, and why now? Because someone I respect recommended studying his life, and when the book came in, the time “felt right” almost right then and there. (Which is actually unusual — most of our books sit for a while before we read them.)

Did we know much about memory before starting the four-part series? No. But we had studied human personality and social psychology quite a bit, and memory is a logical extension of that. In this case, the book we discussed came straight from the bibliography of another one.

Once your anti-library is sufficiently stocked, finding the next book to read will always be the last of your worries. We always have many “on deck” and recommend you do too.

For the mailbag, this isn’t really a question, maybe more of a post request, but I’d love to see a follow-up or update on how your media consumption habits have evolved or changed. The post from Shane a few years back is a personal favorite, and something I’ve found myself revisiting often.

I’m going to go in a slightly different direction than the question you asked, but hang with me.

We’ve been thinking a lot about this recently, with increasing concern that we’re filling our heads with junk. This, we believe, is not only a poor use of our time that causes more mistakes than necessary, but it also reduces our capacity to find the relevant variables in any given situation.

If you think of your mind as a library, three things should concern you.

  1. The information you store in there — its accuracy and relevance;
  2. Your ability to find/retrieve that information on demand; and
  3. Finally your ability to put that information to use when you need it – that is, you want to apply it.

There is no point having a repository of knowledge in your mind if you can’t find and apply its contents (see multiplicative systems).

Let’s talk about the first part today, which is the information you put into your mind.

We feel this is massively misunderstood, resulting in people failing to filter what they allow into the “library of the mind.”

If your library is full of crap and falsehoods, you’re going to struggle and spend a lot of time correcting mistakes. You won’t be very productive and you’ll generally muddle through things.

Our minds are like any other tool: they need to be optimized, and that means being deliberate about what we add to this library. Clickbait media is not the stuff we want to put into our mind library. However, this crap is like cocaine — it causes our brains to light up and feel good. The more of it we consume, the more of it we want. It’s a vicious flywheel, like eating sugar.

Our brain isn’t stupid. It doesn’t want this crap, so while it’s giving you a mild dopamine rush, it’s also working very hard to make sure this junk doesn’t make it into your library. This is one reason that people re-read an article and don’t remember having read it. Their brains determined it was trash and subsequently got rid of it rather than storing it. Sounds good, right?

Well, sort of. As hard as our brains work to ensure this crap doesn’t make it into our library, if we keep feeding it junk, we will overwhelm that natural filter. Over days and weeks this isn’t a big problem, but over years and decades it becomes a huge one.

Junk in the library messes with accuracy and relevance, and it gets in the way of the effective and efficient use of our brains – it causes issues with retrieving and applying. (Which is most often done by our subconscious. Ever had a great idea in the shower, as you were falling asleep, or while driving? Exactly.)

And while we probably agree that the quality of what enters our head matters, it’s easier said than done.

Consider the CEO with 6 layers of management below him. Something that happens “on the ground floor” of the business, say an interaction between a salesperson and a customer, usually goes through six filters. There is almost no way that information is as accurate as it should be for a good decision after all that filtering.
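To see why layered filtering is so corrosive, think of it as a multiplicative system: even if each layer preserves most of the nuance, the losses compound. The per-layer figure below is purely illustrative, not a measured value:

```python
def fidelity_after_filters(layers, fidelity_per_layer=0.9):
    """Fraction of the original signal surviving a chain of filters, assuming each layer
    independently preserves the same share of the nuance (an illustrative assumption)."""
    return fidelity_per_layer ** layers

# Six layers of management between the ground floor and the CEO.
print(f"{fidelity_after_filters(6):.0%} of the original detail survives")  # ~53%
```

Even a generous 90 percent per layer leaves the CEO with roughly half of the original picture, which is why going closer to the source matters.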

Now, the CEO might recognize this, but then they have to do something psychologically hard, which is basically say to their direct reports, “I’m not sure I got the right information from you.” They have to go out of their way to seek out more detailed, relevant, independent information from the people close to the problem. (A good assistant will do this for you, but in a political organization they will also be hung out to dry by all parties, CEO included.)

So not only do we need to filter, but we need to be aware of what filters our information has already been through.

Let’s hit on one more related thought.

In our search for wisdom and high-quality information to put into our library, we often turn to knowledge nuggets called sound bites. These deceptive fellows, also called surface knowledge, make us sound clever and feel good about ourselves. They are also easy to add to our “mind library.”

The problem is surface knowledge is blown away easily, like topsoil. However, we reason, most other people are operating on the same level of surface knowledge! So, in a twisted bout of game theory, we are rarely if ever called out on our bullshit.

The result is that this surface, illusory, knowledge is later retrieved and applied when we’re making decisions (again, often driven by the subconscious) in a variety of contexts, with terrible results. As the saying goes, “Garbage-in equals garbage-out.”

If you’re looking for a quick heuristic you can use for information you’re putting into your library, try the two-pronged approach of:

A. Time
B. Detail.

Time meaning – how relevant is this historically? How long will it be accurate — what will it look like in ten minutes, ten months, ten years? If it’s going to change that soon, you can probably filter it out right here.

One way to determine whether information will stand the test of time is to gauge its accuracy by examining the details. Details are so important that Elon Musk uses them to tell if people are lying during interviews. You want to learn from people with a deep, accurate fluency in their area of expertise: one of the ways you can assess that is through the details they provide. Surface-skimming articles are sometimes meant to be readable by the lay public, but more often they simply indicate that the author has only surface knowledge!

So be careful. We’d guess that 99.9% of click-bait articles fail both these filters. They’re neither detailed nor lasting in importance.

The good thing is that you can raise your standards over time. One major reason to read documents by people like Richard Feynman or Charlie Munger is that it gets you used to what really clear thought looks like. If you’re reading shallow, quickly irrelevant media all the time, when will you read Feynman?

For now let’s leave it at that – we’ll have more to say on this in the future. It’s important.

So many people always ask what’s the best book for word-for-word wisdom, or spend hours working out the most efficient means of doing something, which is all great, but in the spirit of a Munger-like avoiding of mistakes, I’d like to hear you and Shane answer what you’ve done in the sphere of learning about the world that’s been the biggest waste of time: the least bang for your mental-investment buck?

Interesting question. It’s hard to answer because everything seems to have some value or another – often it’s in the “what not to do” or “what doesn’t work” sphere, but that is still a useful sphere, so it’s not really a waste.

One thing that does come to mind is speed reading. That is a waste of time and totally counter-productive when you get down to it. If anything, we’ve tried to slow down our reading so we can savor and recall more of what we read. Speed reading is a snare and a delusion, and not worth the time.

Woody Allen had it right: “I took a course on speed reading…and was able to read War and Peace in 20 minutes. It’s about Russia.”

***

If you’d like to submit a question for our next Q&A, please send it to us at [email protected] with the title “Ask Farnam Street.” We will choose a group of the most thoughtful questions and answer them right here on the site. Enjoy!