Monday, June 2, 2008

A Challenge on CGE Modeling

I’m currently working with Sightline Institute in Seattle, monitoring the economic analysis phase of the Western Climate Initiative. WCI, which consists of seven US states and three Canadian provinces, is cooking up a common carbon emissions plan, and a consulting firm has been brought on board to help clarify the economic implications of different policy alternatives. There are many issues specific to this project I may return to later, but for now I’d like to put the spotlight on the methodology WCI will be relying on, CGE modeling. I think these models are so dubious theoretically and unreliable in practice that there is no case for using them. In particular, I am issuing a challenge to their defenders: if no one can answer it, the case is closed.



On a theoretical level, it is surprising that CGE modeling has become such a vibrant industry, since its underpinnings in general equilibrium theory have been systematically undermined over the past several decades. (1) CGE models rely on representative agents, treating vast numbers of households and firms as if they were a single decision-making entity, even though we now know that multiple agents cannot be modeled as if they were just one. (2) In particular, the Sonnenschein-Mantel-Debreu result demonstrates that full knowledge of all supply and demand relationships in an economy is not sufficient to predict which equilibrium the economy will arrive at when it starts out of equilibrium. (3) The behavioral assumptions of these models, which typically rest on utility maximization or simple modifications of it, have been empirically falsified. (4) Production and utility functions are routinely chosen for their convexity properties, despite the widespread recognition that nonconvexities (which yield multiple equilibria) are rife. In short, if theory should inform practice, we shouldn’t be doing CGE.
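To see the kind of machinery being criticized in (4), here is a deliberately tiny toy example in Python (all numbers invented; real CGE models run to thousands of equations, but the logic is the same): a two-household, two-good exchange economy with Cobb-Douglas preferences. Convexity makes aggregate excess demand well behaved, so a unique equilibrium price drops out of a simple root-finder; the Sonnenschein-Mantel-Debreu result says nothing guarantees this niceness once we leave such hand-picked functional forms behind.

    # Toy two-good, two-household exchange economy (all numbers invented).
    # Cobb-Douglas preferences are chosen precisely for the convexity
    # properties criticized above: they make aggregate excess demand
    # monotone, so a unique market-clearing price is guaranteed.
    from scipy.optimize import brentq

    households = [
        # (Cobb-Douglas share on good 1, endowment of good 1, endowment of good 2)
        (0.3, 10.0, 2.0),
        (0.7, 4.0, 8.0),
    ]

    def excess_demand_good1(p1, p2=1.0):
        """Aggregate excess demand for good 1; good 2 is the numeraire."""
        z = 0.0
        for a, e1, e2 in households:
            income = p1 * e1 + p2 * e2
            z += a * income / p1 - e1  # Cobb-Douglas demand minus endowment
        return z

    # A bracketing root-finder locates the unique market-clearing price.
    p_star = brentq(excess_demand_good1, 1e-6, 1e6)
    print(f"equilibrium relative price of good 1: {p_star:.4f}")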

Now for the challenge. As far as I know, there has never been a rigorous ex post evaluation of CGE models in practice, one that compares predicted to actual outcomes. Based on performance, is there any evidence that such models add value—that their predictions are any better than those derived from macro or sector-specific models, or even a random walk? Also, are CGE models employed by any private sector players who bet real money on the results, or is it only in academia and the public sector that CGE modeling is taken seriously?
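To be concrete about what such an ex post test would look like, here is a rough sketch in Python. Every number below is a placeholder I made up for illustration; nothing comes from an actual CGE exercise. The procedure is simply to compare a model’s published forecasts with realized outcomes, and with a naive no-change (random walk) benchmark.

    # A back-of-the-envelope version of the ex post test proposed above.
    # All numbers are placeholders; the point is the procedure, not the data.
    import numpy as np

    baseline = np.array([100.0, 100.0, 100.0, 100.0])  # values when the forecasts were made
    forecast = np.array([104.0, 103.0, 107.0, 102.0])  # the model's predictions (hypothetical)
    actual   = np.array([101.0, 106.0,  99.0, 103.0])  # realized outcomes (hypothetical)

    def rmse(pred, obs):
        return float(np.sqrt(np.mean((pred - obs) ** 2)))

    print("model RMSE:      ", rmse(forecast, actual))
    print("random-walk RMSE:", rmse(baseline, actual))  # the "no change" benchmark
    # If, across many such episodes, the model cannot beat the no-change
    # benchmark, it is hard to argue that it adds forecasting value.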

My challenge is for those who think there is anything to CGE to come up with evidence that their forecasts add value: careful retrospective analysis, market applications, or both. If not, with more than three decades of experience to go on, why shouldn’t we conclude that this is a self-perpetuating enterprise promulgated by specialists who sell not an improved ability to make forecasts, but a patina of high-tech respectability for agencies with no stake in whether their policies will actually perform?

8 comments:

  1. CGE models clearly have lots of problems. The question for people dealing with policy becomes: what is the alternative?

    [Note: I do not have an answer to that, just asking.]

    Barkley

  2. But Barkley, if that is to be treated as a real question and not just a rhetorical one, there has to be an appeal to evidence. To take macro models, for example, there have been many closely analyzed retrospective studies, and billions of dollars are positioned each day based on the versions used by banks, funds, etc. We have performance data. What performance data do we have for CGE models?

  3. Peter,

    I do not know the answer to your question. I know that there are a lot of these CGE models around and that they are being used for quite a few things.

    The main time that I saw a lot of publicity about their results (and, if I am remembering accurately, there were competing ones coming out with competing results) was during the NAFTA debate, where CGEs were being used to make cases in various directions.

    Again, I do not have a nice source and this is more of an impressionistic memory, but the most widely publicized one came out with a pretty strongly pro-NAFTA outcome, that it would substantially increase employment in both the US and Mexico. As it is, the effects were much weaker than what that model showed, although the models were closer to reality for the US than the "giant sucking sound" forecast by Ross Perot.

    The big negative surprise that I do not think managed to get into any of the models was the impact on the Mexican corn industry of our high subsidies for corn, kept in place while they eliminated theirs, which had a large negative impact on industrial wages in Mexico as lots of people got pushed off the Mexican corn farms.

    That last point sort of drags things off to a specific case, and again I think there were competing models, but I am not aware of anybody publishing a study checking how they did.

    Barkley

  4. I wonder what CGE model was used to assess the effect of Chapter 11 of the North American Free Trade Agreement on the environment?

    ""Chapter 11".. allows corporations to sue federal governments in the NAFTA region if they feel a regulation or government decision adversely affects their investment. It is argued this provision scares the government from passing environmental regulation because of possible threats from an international business. For example Methanex, a Canadian corporation, filed a $970 million suit against the United States, claiming that a Californian ban on MTBE, a substance that had found its way into many wells in the state, was hurtful to the corporation's sales of methanol. In another case Metalclad, an American corporation, was awarded $16.5 million from Mexico after the latter passed regulations banning the toxic waste dump it intended to construct in El Llano, Aguascalientes. Further, it has been argued that the provision benefits the interests of Canadian and American corporations disproportionately more than Mexican businesses, which often lack the resources to pursue a suit against the much wealthier states.
    http://www.vancol.com/history-of-nafta.cfm

    In January 2002 a survey of satellite images found that Mexico lost almost 3 million acres of forest and jungle each year between 1993 and 2000 - nearly twice what officials had previously estimated.
    The Montes Azules Biosphere and the Lacandon rain forest in the state of Chiapas are in most critical danger.

    I wonder how a CGE model can work in assessing the economic effect of the removal of 20,000 street vendors by 1,200 police in Mexico City in 2007?

    Mexico City Wins Battle in War Against Street Vendors (Update1)
    By Patrick Harrington. 12th October 2007


    In any case Milton Friedman had a solution to the problem of pollution from the activities of energy corporations. He said that "private [energy] enterprises will bear all the cost only if they are required to pay for environmental damage. The way to do that is to impose effluent charges..."

    [Free to Choose by Milton and Rose Friedman. Page 221]

    Oh, wouldn't it have been great if Milton Friedman had actively campaigned as an environmentalist to ensure that these 'effluent charges' were (i) implemented, (ii) set at an appropriate level to allow for the expansion of much cleaner and renewable energies, and (iii) used to transfer an appropriate amount to the victims of pollution.

    But such is the simple world of market fundamentalists.

    PS: With respect to lots of people in Mexico being pushed off corn farms, how much of this had to do with the change to Article 27 of the Mexican Constitution that freed up the land market there for 'free' trade? Agricultural and forested lands became commodified.

    Under the new law, individual ejido farmers will be given title to their land. They also will be able to sell it, rent it to other ejido associations or private corporations, use the land as collateral for loans, and pass the land on to their heirs. Limits on acreage have been relaxed. Formerly, individual ejido farmers were not allowed to "own" more than 247 acres of irrigated land, 496 acres of non-irrigated land, or 1,976 acres of forests. The new law allows the creation of corporations or associations that legally can own 25 times that limit: 6,175 irrigated acres, 12,350 non-irrigated acres, or 49,400 acres of forest...

    Salinas Prepares Mexican Agriculture for Free Trade
    by Wesley R. Smith. Backgrounder #914. October 1, 1992
    http://www.heritage.org/Research/TradeandForeignAid/BG914.cfm

    Is it any wonder there was such a mass movement of people from the countryside to the city and the native forests became so devastated?
    It is quite possible that the subsequent scramble for credit to purchase all of this newly available commodified land led to the Mexican peso crisis a couple of years later.

  5. I agree! The current models are working backwards from observables rather than attacking the problem from scratch.

    I suggest that we need a new fundamental hypothesis about human behavior, from which a more accurate model of dynamics can be built.

    Why not "the fundamental hypothesis of periodicity," i.e., that humans exhibit patterns of consumption and production that repeat in time? From this, one can construct frequency distributions, which integrate into cumulative distribution functions (a/k/a aggregate supply and demand).

    They would also provide a nice, measurable target against which any model of the underlying dynamics could be tested.
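    For instance, a minimal sketch of the mechanics in Python (invented numbers, and just one way the construction could go): treat repeated observations of households' willingness to pay as a frequency distribution, and read an aggregate demand schedule off its cumulative upper tail.

        import numpy as np

        # Hypothetical repeated observations of willingness to pay (invented data).
        rng = np.random.default_rng(1)
        reservation_prices = rng.lognormal(mean=2.0, sigma=0.5, size=10_000)

        # Quantity demanded at price p = number of observations at or above p,
        # i.e., the (unnormalized) upper tail of the cumulative distribution.
        prices = np.linspace(1, 20, 8)
        demand = [int((reservation_prices >= p).sum()) for p in prices]

        for p, q in zip(prices, demand):
            print(f"price {p:5.2f}  quantity demanded {q:6d}")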

  6. Good question! In Hawaii, three principal forecasters compete annually on predicted versus actual results, as I've blogged on SusHI.
    Looking back over the last 5 years, underwhelming results abound!
    Of course, CGE and I/O models are being used out here for lots of climate and sustainability-related policy studies...with no apparent concern for whether they 'add value'.

  7. I believe that the intellectual history of CGE modelling can be traced back to Koopmans's structural estimation theories presented in the 1940s.

    Koopmans's colleague Harold T. Davis was working on another approach to forecasting, which relied upon Fourier analysis of time series. The approach was dropped in favor of structural estimation methods after those took off.

    But some fruitful work might result from revisiting Harold Davis's work, especially with an eye toward measurable probability distributions in the frequency of patterns of consumption and production.
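    For what it's worth, a minimal sketch of what a first pass might look like today, in Python (invented monthly data; not Davis's actual method), using a periodogram to pull out the dominant cycle:

        import numpy as np

        # Hypothetical monthly consumption series: a seasonal cycle plus noise.
        rng = np.random.default_rng(0)
        months = np.arange(120)
        series = 100 + 10 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 3, months.size)

        # Periodogram via the real FFT of the demeaned series.
        detrended = series - series.mean()
        spectrum = np.abs(np.fft.rfft(detrended)) ** 2
        freqs = np.fft.rfftfreq(months.size, d=1.0)  # cycles per month

        dominant = freqs[np.argmax(spectrum[1:]) + 1]  # skip the zero-frequency term
        print(f"dominant period: {1 / dominant:.1f} months")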

  8. See Tim Kehoe:
    An evaluation of the performance of applied general equilibrium models of the impact of NAFTA

    This paper evaluates the performances of three of the most prominent multisectoral static applied general equilibrium models used to predict the impact of the North American Free Trade Agreement. These models drastically underestimated the impact of NAFTA on North American trade. Furthermore, the models failed to capture much of the relative impacts on different sectors. Ex-post performance evaluations of applied GE models are essential if policymakers are to have confidence in the results produced by these models. Such evaluations also help make applied GE analysis a scientific discipline in which there are well-defined puzzles with clear successes and failures for competing theories. Analyzing sectoral trade data indicates the need for a new theoretical mechanism that generates large increases in trade in product categories with little or no previous trade. To capture changes in macroeconomic aggregates, the models need to be able to capture changes in productivity.

