The US Secretary of Defense, Donald Rumsfeld, has received general derision for the following rather convoluted statement:
Reports that say that something hasn’t happened are always interesting to me, because as we know, there are known knowns; there are things we know we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns – the ones we don’t know we don’t know.
As I’m giving two papers on this general topic in the next couple of days, I feel I should come to his defense on this. Although the language may be tortured, the basic point is both valid and important.
The standard planning procedures recommended in decision theory begin with the assumption that the decision-maker has foreseen every relevant contingency. Given this assumption, making the right decision is a simple matter of attaching probabilities (or, if you like my rank-dependent generalization of the standard model, decision weights) to each of the contingencies, attaching benefit numbers (utilities) to the contingent outcomes that will arise from a given course of action, then taking a weighted average. Whatever course of action yields the best average outcome is the right one to take. In this way, uncertainty about the future can be ‘domesticated’ and reduced to certainty equivalents.
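The weighted-average calculation described above can be sketched in a few lines. This is an illustrative toy only: the actions, contingencies, probabilities and utility numbers are all invented for the example, not drawn from any real decision problem.

```python
# Toy expected-utility calculation: each action gets a weighted average of
# outcome utilities, one utility per foreseen contingency. All numbers here
# are hypothetical, purely to illustrate the procedure.

def expected_utility(probabilities, utilities):
    """Weighted average of outcome utilities across the foreseen contingencies."""
    return sum(p * u for p, u in zip(probabilities, utilities))

# Probabilities attached to three foreseen contingencies (hypothetical).
probs = [0.5, 0.3, 0.2]

# Utilities of the outcome of each action under each contingency (hypothetical).
utilities_by_action = {
    "act_now": [10, 4, -8],
    "wait":    [6, 5, 2],
}

# The 'right' action, on the standard theory, is the one with the best average.
best = max(utilities_by_action,
           key=lambda a: expected_utility(probs, utilities_by_action[a]))
print(best)  # -> "wait" (5.0 + 1.2 - 1.6 = 4.6 versus 3.0 + 1.5 + 0.4 = 4.9)
```

The whole procedure presupposes that the three contingencies listed are the only ones that can arise, which is exactly the assumption the post goes on to question.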
The problem is that, in reality, you can’t foresee all possible contingencies; the ‘unknown unknowns’ Rumsfeld is talking about are precisely these unforeseen contingencies. Some of the time this doesn’t matter. If the unforeseen contingencies tend to cancel each other out, then the course of action recommended by standard decision theory will usually be a pretty good one. But in many contexts, surprises are almost certain to be unpleasant. In such contexts, it’s wise to avoid actions that are optimal for the contingencies under consideration, but are likely to be derailed by anything out of the ordinary. There’s a whole literature on robust decision theory that’s relevant here.
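One simple rule from that robustness literature is maximin: pick the action whose worst foreseen outcome is least bad, rather than the one with the best average. This is only a sketch of one such rule, not a summary of the whole literature, and the payoff numbers are again hypothetical.

```python
# Toy maximin ('robust') choice: rank actions by their worst-case payoff,
# not their weighted average. Actions and payoffs are hypothetical.

payoffs = {
    "aggressive": [12, 3, -20],  # high average, but a catastrophic worst case
    "cautious":   [5, 4, 1],     # lower average, but no disasters
}

# Maximin picks the action whose minimum payoff is largest.
maximin_choice = max(payoffs, key=lambda a: min(payoffs[a]))
print(maximin_choice)  # -> "cautious"
```

The contrast with the expected-utility rule is the point: an action that looks best on average over the foreseen contingencies can still be the one most exposed to a nasty surprise.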
Having defended Rumsfeld, I’d point out that the considerations he refers to provide the case for being very cautious in going to war. Experience shows that decisions to go to war, taken on the basis of careful calculation of the foreseeable consequences, have turned out badly more often than not, and disastrously badly on many occasions. The calculations of the German military leading up to World War I, including the formulation of the Schlieffen plan, provide an ideal example.
Finally, I should mention that I saw a link at the time to a post somewhere that seemed, from the one-sentence summary, to be making a similar point, but I was too busy to follow it, and can’t now locate it. Anyone who can find it for me gets a free mention in the update.
Update: At least one such post has come to my attention, at Language Log, along with a useful link to Sylvain Bromberger, who has, it seems, written extensively on the theory of ignorance. I will be keen to chase this up.