Daniel Kahneman on the Definition of Rationality and the Difference Between Information and Insight

Lance Workman interviews Daniel Kahneman, Nobel laureate, co-creator of behavioral economics, and author of Thinking, Fast and Slow. Among other things Kahneman discusses the definition of rationality, the dual process of thought, the difference between insight and information, and changing our minds.

You’ve demonstrated that people are rather poor at making decisions that involve some degree of uncertainty – and yet you don’t see people as irrational?

Well, I think the whole issue of whether people are rational or irrational depends on your definition of rationality. There is a definition of rationality accepted in economics – it is all about economic decision-making – and if you stick to that definition then people are definitely not rational. Of course that does not mean that they are crazy, as this is quite different from what being rational means in everyday language.

One way out of this ‘why do we make bad decisions under some circumstances?’ debate is by introducing a dual-process model. This is based on work I did with Shane Frederick, and the model assumes that there are two ways in which decisions are produced. System 1 is very rapid, automatic, effortless and intuitive. System 2 is slower, rule-governed, deliberate and effortful. System 2 sometimes intervenes to correct System 1, as it ‘knows’ the latter is prone to violate certain rules. This means that we are likely to make errors when System 2 fails to correct System 1. That’s when we appear to act irrationally at times. We have a very rational system available – but it isn’t always engaged.

You have done a lot of work on cognitive illusions. One of the terms you coined in this area is the ‘illusion of validity’. How did you come up with this concept?

I coined the term ‘the illusion of validity’ when I was 20, but it didn’t make it into the literature until 1973, when Amos Tversky and I published a paper on the psychology of prediction.

It’s really about the fact that when we are making predictions there is often no real connection between the statistical evidence and our subjective sense of insight. We feel there should be a close connection between these two forms of evidence – but often there isn’t.

One of your more recent areas of interest is something you call adversarial collaboration – it sounds intriguing. What does it involve?

I think that there is a lot of controversy in psychology – perhaps not as much as in some other fields – but in general, by the time you get into the cycle of ‘critique’ followed by ‘replies’ and ‘rejoinders’ in journals, you are really wasting your time and energy. People get intensely involved in these exchanges and become emotionally wound up, but usually you find they yield absolutely nothing, with neither side changing their minds about anything or agreeing that they got anything wrong. And so I would very much like to see the reply-and-rejoinder way of dealing with debates disappear.

I think that it could happen.