When I was in graduate school, I read a lot of what was then new research by Daniel Kahneman and Amos Tversky. I found their documentation of the systematic ways in which people deviate from rational decision-making fascinating, and I searched for a way to apply it to political science questions. In the end, I couldn’t figure out how to build a new theory based on systematic irrationality.
Reading Michael Lewis’ excellent new book, The Undoing Project, about the amazing and eventually problematic collaboration between Kahneman and Tversky brought back a flood of nostalgia but also reminded me of some problems with trying to extend their work. In particular, I was reminded of two things. First, while Kahneman and Tversky are remarkably persuasive in demonstrating how people regularly deviate from rationality, neither I nor others have had much success in building new theories based on systematic irrationality. Assuming rationality is clearly an inaccurate description of how people think, but it remains quite useful for building theories that yield accurate predictions. That is, Kahneman and Tversky may have revolutionized social science much less than Lewis suggests.
Second, much of the work that has tried to build on Kahneman and Tversky seems to violate their basic finding that expert judgement is unreliable. The development of behavioral economics and its application to a variety of fields, including education, mostly seems to consist of trying to devise ways to correct the systematic irrationality of others. If low-income students are accepted to college but do not enroll after failing to complete the FAFSA financial aid form, we assume they are behaving counter to their long-term interests and propose interventions to induce them to complete the form and enroll.
As I’ve written elsewhere, this approach has a variety of problems, chief among them that it assumes too much rationality on the part of the social scientist devising the solutions. How do we know that people would be better off if we could nudge them into doing something other than what they had originally decided to do? Just as other people may be systematically irrational, so may the social scientists devising plans for improving other people’s lives. I’m not saying that no interventions are helpful. I’m just saying that we should be extremely cautious and humble when developing plans for how other people should live their lives.
The need for humility among experts and social scientists was a central theme in Kahneman and Tversky’s work. Their approach was not, as one critic accused them, a psychology of stupid people; it was a psychology of all people, including experts and social scientists. In fact, one of their first experiments was to give statisticians problems to see if they would update their priors as if they were Bayesians. As it turns out, even statisticians, who you might think would be particularly familiar with Bayes’ Theorem, do not actually think like Bayesians. In subsequent experiments they found that even warning subjects of the systematic irrationality to which they might be prone does not prevent them from being systematically irrational. Greater knowledge and expertise do not prevent us from falling into the same intellectual potholes over and over again.
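For readers unfamiliar with the benchmark the statisticians were being measured against, Bayesian updating has a simple mechanical form: multiply each hypothesis’s prior probability by the likelihood of the observed evidence under that hypothesis, then renormalize. A minimal sketch in Python (a generic urn example for illustration, not the actual stimuli from Kahneman and Tversky’s experiments):

```python
from fractions import Fraction

def bayes_update(prior, likelihoods):
    """Return the posterior over hypotheses after observing evidence.

    prior:       dict mapping hypothesis -> P(H)
    likelihoods: dict mapping hypothesis -> P(evidence | H)
    """
    # Multiply prior by likelihood for each hypothesis...
    unnormalized = {h: prior[h] * likelihoods[h] for h in prior}
    # ...then renormalize so the posteriors sum to 1.
    total = sum(unnormalized.values())
    return {h: p / total for h, p in unnormalized.items()}

# Hypothetical example: urn A is 70% red balls, urn B is 30% red.
# Start with even odds on which urn we hold, then draw one red ball.
prior = {"A": Fraction(1, 2), "B": Fraction(1, 2)}
likelihoods = {"A": Fraction(7, 10), "B": Fraction(3, 10)}
posterior = bayes_update(prior, likelihoods)
print(posterior)  # {'A': Fraction(7, 10), 'B': Fraction(3, 10)}
```

A single red draw should shift a rational observer from 50/50 to 70/30 in favor of urn A; the recurring experimental finding was that people, statisticians included, update far more timidly than this arithmetic demands.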
So Kahneman and Tversky’s research demonstrates that there is no priestly class immune to the shortcomings of others, and even foreknowledge and confession of one’s sins of irrationality provide little protection against repeating common errors. And yet, much of behavioral economics seems to pay little heed to this central finding as its practitioners move full steam ahead devising solutions for other people’s irrationality. They seem to forget that devising solutions, building models, and testing them all require human judgements, which are also prone to systematic error.
In his seminal volume, Thinking, Fast and Slow, Kahneman admits there is no real solution to our tendency to deviate from rationality. Instead, he suggests some habits to check the errors, mostly involving slowing down, being more cautious and self-critical, and inviting the criticism of others. Let’s not correct for popular mistakes by installing a technocratic elite, because that elite is also prone to common errors.