
Farnam Street

Mastering the best of what other people have already figured out

Reading Time: 5 minutes

Daniel Kahneman Explains The Machinery of Thought


Israeli-American psychologist and Nobel Laureate Daniel Kahneman is the founding father of modern behavioral economics. His work has influenced how we see thinking, decisions, risk, and even happiness.

In Thinking, Fast and Slow, his “intellectual memoir,” he presents, in his own words, highlights from his enormous body of work.

Part of that body includes a description of the “machinery of … thought,” which divides the brain into two agents, called System 1 and System 2, which “respectively produce fast and slow thinking.” For our purposes, these can also be thought of as intuitive and deliberate thought.

The Two Systems of Thinking

Psychologists have been intensely interested for several decades in the two modes of thinking evoked by the picture of the angry woman and by the multiplication problem, and have offered many labels for them. I adopt terms originally proposed by the psychologists Keith Stanovich and Richard West, and will refer to two systems in the mind, System 1 and System 2.

  • System 1 operates automatically and quickly, with little or no effort and no sense of voluntary control.
  • System 2 allocates attention to the effortful mental activities that demand it, including complex computations. The operations of System 2 are often associated with the subjective experience of agency, choice, and concentration.

If asked to pick which thinker we are, we pick System 2. However, as Kahneman points out:

The automatic operations of System 1 generate surprisingly complex patterns of ideas, but only the slower System 2 can construct thoughts in an orderly series of steps. I also describe circumstances in which System 2 takes over, overruling the freewheeling impulses and associations of System 1. You will be invited to think of the two systems as agents with their individual abilities, limitations, and functions.

System One: Quick

The capabilities of System 1 vary by individual and are often “innate skills that we share with other animals.”

We are born prepared to perceive the world around us, recognize objects, orient attention, avoid losses, and fear spiders. Other mental activities become fast and automatic through prolonged practice. System 1 has learned associations between ideas (the capital of France?); it has also learned skills such as reading and understanding nuances of social situations. Some skills, such as finding strong chess moves, are acquired only by specialized experts. Others are widely shared. Detecting the similarity of a personality sketch to an occupational stereotype requires broad knowledge of the language and the culture, which most of us possess. The knowledge is stored in memory and accessed without intention and without effort.

System Two: Deliberate

This is when we do something that does not come naturally and requires some sort of continuous exertion.

In all these situations you must pay attention, and you will perform less well, or not at all, if you are not ready or if your attention is directed inappropriately.

Paying more attention is not really the answer, as attention is mentally expensive and can make people “effectively blind, even to stimuli that normally attract attention.” This is the point Christopher Chabris and Daniel Simons make in their book The Invisible Gorilla. Not only are we blind to what is plainly obvious until someone points it out, but we also fail to see that we are blind in the first place.

The Division of Labour Between Systems One and Two

Systems 1 and 2 are both active whenever we are awake. System 1 runs automatically and System 2 is normally in a comfortable low-effort mode, in which only a fraction of its capacity is engaged. System 1 continuously generates suggestions for System 2: impressions, intuitions, intentions, and feelings. If endorsed by System 2, impressions and intuitions turn into beliefs, and impulses turn into voluntary actions. When all goes smoothly, which is most of the time, System 2 adopts the suggestions of System 1 with little or no modification. You generally believe your impressions and act on your desires, and that is fine— usually.

When System 1 runs into difficulty, it calls on System 2 to support more detailed and specific processing that may solve the problem of the moment. System 2 is mobilized when a question arises for which System 1 does not offer an answer, as probably happened to you when you encountered the multiplication problem 17 × 24. You can also feel a surge of conscious attention whenever you are surprised. System 2 is activated when an event is detected that violates the model of the world that System 1 maintains. In that world, lamps do not jump, cats do not bark, and gorillas do not cross basketball courts. The gorilla experiment demonstrates that some attention is needed for the surprising stimulus to be detected. Surprise then activates and orients your attention: you will stare, and you will search your memory for a story that makes sense of the surprising event. System 2 is also credited with the continuous monitoring of your own behavior—the control that keeps you polite when you are angry, and alert when you are driving at night. System 2 is mobilized to increased effort when it detects an error about to be made. Remember a time when you almost blurted out an offensive remark and note how hard you worked to restore control. In summary, most of what you (your System 2) think and do originates in your System 1, but System 2 takes over when things get difficult, and it normally has the last word.

The division of labor between System 1 and System 2 is highly efficient: it minimizes effort and optimizes performance. The arrangement works well most of the time because System 1 is generally very good at what it does: its models of familiar situations are accurate, its short-term predictions are usually accurate as well, and its initial reactions to challenges are swift and generally appropriate. System 1 has biases, however, systematic errors that it is prone to make in specified circumstances. As we shall see, it sometimes answers easier questions than the one it was asked, and it has little understanding of logic and statistics. One further limitation of System 1 is that it cannot be turned off.

[…]

Conflict between an automatic reaction and an intention to control it is common in our lives. We are all familiar with the experience of trying not to stare at the oddly dressed couple at the neighboring table in a restaurant. We also know what it is like to force our attention on a boring book, when we constantly find ourselves returning to the point at which the reading lost its meaning. Where winters are hard, many drivers have memories of their car skidding out of control on the ice and of the struggle to follow well-rehearsed instructions that negate what they would naturally do: “Steer into the skid, and whatever you do, do not touch the brakes!” And every human being has had the experience of not telling someone to go to hell. One of the tasks of System 2 is to overcome the impulses of System 1. In other words, System 2 is in charge of self-control.

[…]

The question that is most often asked about cognitive illusions is whether they can be overcome. The message of these examples is not encouraging. Because System 1 operates automatically and cannot be turned off at will, errors of intuitive thought are often difficult to prevent. Biases cannot always be avoided, because System 2 may have no clue to the error. Even when cues to likely errors are available, errors can be prevented only by the enhanced monitoring and effortful activity of System 2. As a way to live your life, however, continuous vigilance is not necessarily good, and it is certainly impractical. Constantly questioning our own thinking would be impossibly tedious, and System 2 is much too slow and inefficient to serve as a substitute for System 1 in making routine decisions. The best we can do is a compromise: learn to recognize situations in which mistakes are likely and try harder to avoid significant mistakes when the stakes are high. The premise of this book is that it is easier to recognize other people’s mistakes than our own.

Still Curious? Thinking, Fast and Slow is a tour-de-force when it comes to thinking.



© 2023 Farnam Street Media Inc. All Rights Reserved.
