Wednesday, April 25, 2012

The Precautionary Principle and the Iraqi WMD Test

My emeritus colleague, John Perkins, asks a deep question about proposed justifications for a precautionary principle: would they, in early 2003, have provided a basis for the US invasion of Iraq despite, or even because of, uncertainty about the existence of Iraqi weapons of mass destruction?  The stylized decision situation is this: the US suspects that horrible chemical or biological weapons are being stockpiled in Iraq, but there is no firm evidence.  Indeed, the likelihood that WMDs exist is small, but the negative consequences if they actually do are severe.  Suppose further that decision-makers are honest (this is a purely hypothetical test) and want to act rationally so as to minimize the harm of either launching or not launching military action.  In other words, this is a make-believe scenario, but one that nevertheless captures an important aspect of the meaning of precaution: if being precautionary in such a situation makes you more likely to want to invade Iraq, you have a problem.

So is there a version of the precautionary principle that passes this test?  Intergenerational equity arguments (irreversibilities justify low or zero discount rates) are at best a wash, since the costs of under- or overestimating the likelihood of WMDs have approximately the same time profile.  (The main problem with intergenerational equity is that, while it is a fine concept, it has little relevance to most situations that might require precaution; precaution is about coping with uncertainty, not valuing immediacy versus delay.)  Fat tail aversion à la Weitzman would seem to fail the test, since it would place greater value on insuring against catastrophic WMD risk.  According to this principle, it’s better to accept the certain devil we do know (invasion) than run the risk of the less likely but even worse devil we don’t.  You could argue for a different type of precaution: don’t mess with nature.  This would avoid WMD dilemmas by defining precaution as being about only environmental questions, but at the cost of being either grossly impractical or incoherent.  Example: agriculture, even the most organic kind, is absolutely messing with nature, as are many of the other essential practices of the human race.  Green-is-good is an attitude, not a rational basis for a decision principle.
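The way fat tail aversion fails the test can be made concrete with a toy expected-loss comparison. All the numbers below are invented for illustration (the probability, costs, and convexity parameter are assumptions, not estimates of anything): once catastrophic outcomes are penalized more than proportionally, a decision-maker can be tipped toward invasion even when plain expected loss favors waiting.

```python
# Toy sketch of fat-tail aversion (all numbers are hypothetical).
# Losses are in arbitrary "harm units"; risk_adjusted() applies a
# convex transform so extreme losses weigh disproportionately.

P_WMD = 0.1              # assumed small probability that WMDs exist
COST_INVASION = 100      # certain cost of invading
COST_CATASTROPHE = 800   # loss if WMDs exist and are eventually used
RISK_EXPONENT = 1.5      # >1 encodes aversion to fat-tail outcomes

def risk_adjusted(loss, gamma=RISK_EXPONENT):
    """Convex penalty: big losses count more than proportionally."""
    return loss ** gamma

# Plain expected loss: waiting (0.1 * 800 = 80) beats invading (100).
plain_invade = COST_INVASION
plain_wait = P_WMD * COST_CATASTROPHE

# Fat-tail-adjusted loss: the convex transform flips the ranking,
# and the rule now prefers the certain devil of invasion.
fat_invade = risk_adjusted(COST_INVASION)
fat_wait = P_WMD * risk_adjusted(COST_CATASTROPHE)

print(plain_invade, plain_wait)  # 100 vs ~80: waiting preferred
print(fat_invade, fat_wait)      # ~1000 vs ~2263: invasion preferred
```

The flip is the whole point: with these assumed numbers, ordinary expected-loss reasoning counsels restraint, while the catastrophe-averse transform counsels invasion, which is exactly how this version of precaution fails the Iraqi WMD test.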

I think my version of precaution does pass the test.  To recap (OK, not “re” for most readers), I propose that metadata, the history of how we have learned in the past, is relevant to evaluating our ignorance in the present.  If a company had a record of missing its earnings target quarter after quarter, you would take this into account even if you had no current information regarding the likelihood of its meeting the next one.  Similarly, what distinguishes the emergence of ecological understanding over the past century or so is that we systematically discover that species, including our own, are more interdependent than we thought, and more sensitive to alterations in their natural environment at lower exposure thresholds.  It is rational to expect that the larger part of our current uncertainty regarding environmental impacts will resolve itself in similar ways in the future; hence precaution.
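The earnings-record analogy can be sketched as a simple track-record update. This formalization is mine, not the post's, and the track record and discount rule are invented for illustration: treat each past episode in which a claim of a given type proved overstated as a Bernoulli observation, and use the standard Beta-Bernoulli posterior to discount the next claim of that type.

```python
# Hypothetical track record: in 8 of the last 10 comparable episodes,
# the initial claim turned out to be overstated.
overstated, episodes = 8, 10

# Beta(1, 1) uniform prior on "the next claim of this type is
# overstated", updated on the record (Beta-Bernoulli posterior).
alpha, beta = 1 + overstated, 1 + (episodes - overstated)
p_overstated = alpha / (alpha + beta)  # posterior mean: 9 / 12 = 0.75

# Crude, assumed discount rule: deflate today's face-value probability
# by how often claims like it have proven overstated in the past.
face_value_probability = 0.5
adjusted = face_value_probability * (1 - p_overstated)

print(p_overstated, adjusted)  # 0.75 0.125
```

The particular discount rule is arbitrary; the point is only that the metadata (the track record of past claims) systematically shifts how much weight today's claim deserves, which is the sense in which history bears on present ignorance.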

Here is why I think it passes Perkins’ test.  On the one hand, there has been a long series of manipulated intelligence reports used to justify policies favored by those in power in Washington; foreign threats usually turn out to be less threatening than initially reported.  (Intelligence pertaining to Japan pre-1941 might be an exception.)  On the other, invasions of foreign countries have typically turned out worse than expected: more resistance, more repression in response to resistance, more cruelty, more overall economic and human cost.  On both counts the metadata should be incorporated into the decision process, and both counsel precaution as I understand it.

My golden rule of precaution: make the decision today that, based on everything you know up to this point, you will be most likely to have wished you had made in the future, when you will have more information.  Assess the likely bias of your ignorance.
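One way to read the golden rule, purely as a sketch of mine rather than the post's formal claim: choose the act that minimizes expected future regret, where the expectation uses a probability already corrected for the known bias of your ignorance. Every number below is hypothetical.

```python
# Minimal expected-regret sketch of the golden rule (numbers invented).
# regret[act][state] = how much you will wish you had chosen otherwise
# once the true state is revealed.

regret = {
    "invade":      {True: 0,   False: 100},  # needless war if no WMDs
    "dont_invade": {True: 300, False: 0},    # unchecked threat if WMDs
}

# Metadata correction: past intelligence overstated threats, so shrink
# the reported probability before taking expectations (assumed factor).
reported_p, bias_correction = 0.4, 0.5
p_wmd = reported_p * bias_correction  # 0.2

def expected_regret(act):
    return p_wmd * regret[act][True] + (1 - p_wmd) * regret[act][False]

best = min(regret, key=expected_regret)
print(best)  # with these assumptions: dont_invade
```

Note that at the uncorrected reported probability of 0.4 the same regret table would favor invasion; the bias correction, i.e. assessing the likely bias of your ignorance, is what flips the choice. That is the rule doing its work.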


Unknown said...

Very thought provoking post!

In thinking about your golden rule of precaution and the WMD example, how does this theory interact with known ideas of risk aversion? It seems conceivable that an individual, aware of the bias towards threats and war, might prefer maintaining that bias to the risk of overcompensating in the other direction. In other words, even if the individual expects future information to confirm their bias, they will still prefer to have made the same decision.

Barkley Rosser said...

Well, Peter, despite your account of "horrifying" chemical and biological weapons, it is really a joke to include them in the scary rubric of "Weapons of Mass Destruction." While bio weapons might potentially qualify (although see what a flop the anthrax attack in the wake of 9/11 turned out to be), chemical ones clearly do not. They are nasty weapons, but they are very local in their impact.

Indeed, although Saddam turned out not to have any, most people (myself included) were reasonably certain that he was still hanging onto some, although not accumulating any fresh ones (there was no evidence of that). There were rumbles about some possible bio labs, but no evidence of actual bio weapons. Certainly even if he still had a pile of chemical weapons, these were no threat to anybody but his immediate neighbors and his own citizens, such as Iran and the Kurds, against both of whom he had previously used them.

The real kicker was the threat of nukes. But we now know that the data behind that scare was totally faked, and pushed hard by people like Cheney. There really was very little evidence of anything on that front, not enough even to invoke the precautionary principle. That was a fraud from the start, although nukes are the real WMDs, and it was the scary talk about them that lay behind public support for the invasion, not the presumed piles of chem weapons.