Wednesday, April 25, 2012
The Precautionary Principle and the Iraqi WMD Test
My emeritus colleague, John Perkins, asks a deep question about proposed justifications for a precautionary principle: would they, in early 2003, have provided a basis for the US invasion of Iraq despite, or even because of, uncertainty about the existence of Iraqi weapons of mass destruction? The stylized decision situation is this: the US suspects that horrible chemical or biological weapons are being stockpiled in Iraq, but there is no firm evidence. Indeed, the likelihood of WMDs is small, but the negative consequences if they actually exist are severe. Suppose further that decision-makers are honest (this is a purely hypothetical test) and want to act rationally so as to minimize the harm of either launching or not launching military action. In other words, this is a make-believe scenario, but one that nevertheless captures an important aspect of the meaning of precaution: if being precautionary in such a situation makes you more likely to want to invade Iraq, you have a problem.
So is there a version of the precautionary principle that passes this test? Intergenerational equity arguments (irreversibilities justify low or zero discount rates) are at best a wash, since the costs of under- or overestimating the likelihood of WMDs have approximately the same time profile. (The main problem with intergenerational equity is that, while it is a fine concept, it has little relevance to most situations that might require precaution; precaution is about coping with uncertainty, not valuing immediacy versus delay.) Fat-tail aversion à la Weitzman would seem to fail the test, since it would place greater value on insuring against catastrophic WMD risk. According to this principle, it's better to accept the certain devil we do know (invasion) than run the risk of the less likely but even worse devil we don't. You could argue for a different type of precaution: don't mess with nature. This would avoid WMD dilemmas by defining precaution as being about only environmental questions, but at the cost of being either grossly impractical or incoherent. Example: agriculture, even the most organic kind, is absolutely messing with nature, as are many of the other essential practices of the human race. Green-is-good is an attitude, not a rational basis for a decision principle.
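The fat-tail objection can be made concrete with a toy calculation. All of the numbers below are invented for illustration; the point is only the structure of the comparison. Inflating losses by an exponent greater than one is a crude stand-in for Weitzman-style tail weighting, not his actual model.

```python
# Hypothetical numbers, not from the post: a toy comparison of expected
# losses showing how fat-tail aversion can favor the "certain devil."

p_wmd = 0.05                # assumed small probability that WMDs exist
loss_invade = 100.0         # certain cost of invading
loss_wait_if_wmd = 2000.0   # catastrophic cost of waiting if WMDs are real
loss_wait_if_none = 0.0     # cost of waiting if they are not

def expected_loss(outcomes, aversion=1.0):
    """Expected loss over (loss, probability) pairs. An aversion
    exponent > 1 inflates large losses, mimicking extra weight on
    catastrophic tails."""
    return sum(p * loss ** aversion for loss, p in outcomes)

wait = [(loss_wait_if_wmd, p_wmd), (loss_wait_if_none, 1 - p_wmd)]
invade = [(loss_invade, 1.0)]

# Risk-neutral (aversion = 1): the two options tie at an expected
# loss of 100. With any tail aversion (aversion > 1), the waiting
# option's small-probability catastrophe is inflated the most, so
# invading looks better -- which is why this principle fails the test.
```

With these made-up numbers, a risk-neutral planner is indifferent, but any degree of tail aversion tips the decision toward invasion, exactly the perverse result Perkins' test is designed to expose.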
I think my version of precaution does pass the test. To recap (OK, not “re” for most readers), I propose that metadata—the history of how we have learned in the past—is relevant to evaluating our ignorance in the present. If a company had a record of underperforming its earnings target quarter after quarter, you would take this into account even if you had no current information regarding the likelihood of its meeting its next target. Similarly, what distinguishes the emergence of ecological understanding over the past century or so is that we systematically discover that species, including our own, are more interdependent than we thought, and more sensitive to alterations in their natural environment at lower exposure thresholds. It is rational to expect that the larger part of our current uncertainty regarding environmental impacts will resolve itself in similar ways in the future; hence precaution.
Here is why I think it passes Perkins’ test. On the one hand, there has been a long series of manipulated intelligence reports used to justify policies favored by those in power in Washington; foreign threats usually turn out to be less threatening than initially reported. (Intelligence pertaining to Japan pre-1941 may be an exception.) On the other, invasions of foreign countries have typically turned out worse than expected: more resistance, more repression in response to resistance, more cruelty, more overall economic and human cost. On both counts the metadata should be incorporated into the decision process, and both counsel precaution as I understand it.
My golden rule of precaution: make the decision today that, based on everything you know up to this point, you will be most likely to have wished you had made in the future, when you will have more information. Assess the likely bias of your ignorance.
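The golden rule can be sketched as a two-step procedure: first correct the stated probability of the threat by the historical bias of similar estimates (the metadata), then choose the action with the lower expected loss. Every number and function name below is a hypothetical illustration, not anything from the post.

```python
# A toy rendering of the "golden rule": assess the likely bias of your
# ignorance before deciding. All numbers are invented for illustration.

def debias(stated_p, past_stated, past_actual):
    """Scale a stated probability by the average ratio of actual to
    stated threat levels across past, comparable episodes."""
    bias = sum(a / s for s, a in zip(past_stated, past_actual)) / len(past_stated)
    return min(1.0, stated_p * bias)

# Hypothetical record: past intelligence estimates overstated threats
# by a factor of about four (actual / stated = 0.25 in each episode).
past_stated = [0.8, 0.6, 0.4]
past_actual = [0.2, 0.15, 0.1]

p = debias(0.20, past_stated, past_actual)  # stated 20% chance of WMDs

loss_invade = 100.0        # invasions also tend to cost more than forecast
loss_wait_if_wmd = 500.0   # loss if WMDs turn out to exist and we waited

choice = "invade" if loss_invade < p * loss_wait_if_wmd else "wait"
```

With this invented track record, the debiased probability falls from 20% to 5%, the expected loss of waiting drops well below the certain cost of invading, and the rule counsels restraint: the metadata tells you which direction your ignorance is likely to be biased, and you decide accordingly.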