I’d like to continue my review of Michael Lewis’s new book, The Undoing Project, on the collaboration between Daniel Kahneman and Amos Tversky, by speculating about what advice Kahneman and Tversky might offer education foundations.
Foundations have a particular challenge in detecting and correcting errors in their own thinking. Because many people want things from foundations, especially their money, foundations are frequently organized so as to limit communication with outsiders. They generally don’t open their doors, phone lines, and email to whoever might want to suggest something or ask something of them, for fear of being overwhelmed. Instead, they typically hide behind a series of locked doors, don’t make their phone numbers or email addresses readily available, and insist that all applications follow a specific format, be submitted at a particular time, and be designed to address pre-determined issues.
Insulating themselves from external influence is understandable, but it creates real problems if foundations ever hope to detect when they are mistaken and need to make changes. The limited communication that does reach them tends to reaffirm whatever they are already doing. Prospective grantees don’t like to tell foundations that they are mistaken and need to change course, because that makes getting funded unlikely. Instead, foundations tend to hear that their farts smell like roses. To make matters worse, many foundations are run in very top-down ways, which discourages questioning and self-criticism.
The Undoing Project presents a very similar situation involving airline pilot errors. Lewis describes a case in which a commercial airline was experiencing too many accidents caused by pilot error. The accidents were not fatal, but they were costly and dangerous: things like planes landing at the wrong airport. So the airline approached Amos Tversky and asked for help in improving its training so as to minimize these pilot errors. The airline wanted Tversky to design a pilot training method that would make sure pilots had the information and skills to avoid errors.
Tversky told the airline that it was pursuing the wrong goal. Pilots are going to make errors, and no amount of information or training would stop them from committing those mistakes. We often assume that our errors are caused by ignorance, but Tversky told them this was not true. The deeper problem is that once we have a mental model of the world, we tend to downplay or ignore information that is inconsistent with that model and bolster facts that support it. If a pilot believes he is landing at the right airport, he will distort the available information to confirm that he is landing in Fort Lauderdale rather than nearby Palm Beach, even when he is wrong. The problem is not a lack of information, but our tendency to fit information into our preconceived beliefs.
Tversky’s advice was to change the cockpit culture to facilitate questioning and self-criticism. At the time, cockpits were very hierarchical, based on the belief that co-pilots needed to implement pilot orders quickly and without question, lest the delay and doubt promote indecision and disorder. The airline implemented Tversky’s suggestions and changed its training to encourage co-pilots to doubt and question, and pilots to be more receptive to that criticism. The result was a marked decline in accidents caused by pilot error. Apparently we aren’t very good at detecting our own errors, but we are more likely to do so if others are encouraged to point them out.
So what might Tversky suggest to education foundations? I think he’d recognize that they have exceptional difficulty in detecting their own errors and need intentional, institutional arrangements to address that problem. In particular, he might suggest that they divide their staff into a Team A and a Team B. Each team would work on a different theory of change, theories that are not necessarily at odds with each other but are also not identical. For example, one team might focus on promoting school choice and the other on promoting test-based accountability. Or one team might promote tax credits and the other education savings accounts (ESAs). The idea behind dividing staff into somewhat competing teams is that each then has incentives to point out shortcomings in the other team’s approach. Dividing into Team A and Team B could be a useful check on the all-too-common problem of groupthink.
Another potential solution is to hire two or three internal devil’s advocates whose job is to question the assumptions and evidence relied upon by foundation staff. To protect those devil’s advocates, it is probably best to have them report directly to the board rather than to the people they are questioning.
Whatever the particular arrangements, the point is that education foundations should strive to promote an internal culture of doubt and self-criticism if they wish to catch and correct their own errors and avoid groupthink. One foundation that I think has taken steps in this direction is the Arnold Foundation. It actually holds internal seminars in which it invites outside speakers to come and potentially offer critiques of its work. Neerav Kingsland, who heads education efforts at Arnold, is also notably accessible on blogs and Twitter for critical discussion. I don’t always agree with Neerav, but I am impressed by his openness to dissent.
The collaboration between Kahneman and Tversky was itself an example of the importance of engaging in tough criticism within a shared effort. Like the airline pilots, they developed habits of challenging each other, which made their work together better than anything either could have produced individually.