Sorry about the header, but there’s been renewed interest in the pioneering work of Stafford Beer in Allende’s Chile, first as a result of Eden Medina’s Cybernetic Revolutionaries (2011) and now the article by Evgeny Morozov in the current New Yorker. Beer was a management specialist who applied cybernetic principles to business organization. He was brought to Chile to design a cybernetic planning system for the entire economy, Cybersyn, which was stillborn when Allende was overthrown in a coup in 1973. Undoubtedly, this is one of the great might-have-beens of the twentieth century: what could Beer have built if he had been given enough time and resources?
I have more than a passing interest in this topic. I regularly teach a course called Alternatives to Capitalism, and I used the Medina book in the most recent iteration. This fall I am working on a paper (as a co-author) that brings together Beer’s “viable system model” of the firm with economic analysis, with a focus on the determinants of worker autonomy. I’ve been imbibing Beer regularly for some time.
There is a lot to say about the new round of Beer enthusiasm. I don’t want to get into a pissing contest (where are these puns coming from?), but I believe Morozov is quite wrong to identify Beer with the Soviet cyberneticians depicted, for example, in Francis Spufford’s marvelous Red Plenty. The Soviet reformers wanted to use computers to calculate efficient planning prices; there were no prices in Beer’s model. There was no bottom-up reversal of authority in the Soviet vision, not even in theory, whereas that was intended to be the distinctive aspect of Cybersyn, the feature that would make it “really” socialist. While computers played a role in both approaches, it’s a mistake to put too much weight on them, big and heavy as they were back then. Beer, after all, had made his reputation with minimal knowledge or use of computers; his cybernetics was organizational and conceptual.
But I don’t think that the actual potential of Cybersyn matched Beer’s vision for it, and its shortcomings even during the limited period in which it was in partial operation bear this out. Beer’s critics were right, in fact: this really was a project whose result could only be to intensify centralized control over decisions at lower levels—the computer as Big Brother. Production systems were taken as given at the enterprise level, and the only questions were those asked in operations research: how much should we dial up this process or dial down that one? People were simply instruments in this framework; they had no ability to change the questions that were being asked.
From an economic point of view, I’m afraid Beer did not rise to the Hayek challenge. Cybersyn processed information about the throughput of materials and products far more efficiently than the Hayek of 1937 could have imagined, but it left unexamined the problem of how information is ultimately generated. An internet of things can tell you what materials are going where, but it can’t identify promising innovations in production systems or tell you which innovations should be replicated and which discarded. Worse, it has no way to assess the quality of what’s being produced, since it is primarily consumers who need to be able to decide this. Hayek is surely right that what we would now call parallel processing is needed to implement trial-and-error methods in real time, and that there need to be incentives for improved production methods and higher quality. Hayek, non-Walrasian that he was, would probably say, and I would agree with him, that Beer’s model works well within firms but not between them, since coordination is not the primary problem that economies, rather than firms, need to solve. (The deep problem is: coordinate to do what, and in what way?)
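To make the parallel-processing point concrete, here is a toy sketch of trial and error with selection. It is my own illustration, not anything Beer or Hayek specified, and the numbers are invented: many producers tinker with variant methods at once, keep the tweaks that work, and imitate better performers. The improvement comes from the tinkering and copying, not from any monitoring of throughput.

```python
import random

# Toy parallel trial and error (my own sketch, nothing Beer or Hayek
# specified).  Each producer runs a variant method of unknown productivity;
# every period each one tries an unbiased tweak, keeps it only if it helps,
# and laggards imitate a randomly chosen above-median performer.

random.seed(1)
N_PRODUCERS, PERIODS = 20, 30

methods = [random.uniform(0.5, 1.5) for _ in range(N_PRODUCERS)]
print(f"average productivity at start: {sum(methods) / N_PRODUCERS:.2f}")

for _ in range(PERIODS):
    # Local experiment: a small random tweak, kept only if it performs better.
    methods = [max(m, m * random.uniform(0.9, 1.1)) for m in methods]
    # Imitation: producers below the median copy an above-median method
    # (replication of what works, discarding of what doesn't).
    median = sorted(methods)[N_PRODUCERS // 2]
    above = [m for m in methods if m >= median]
    methods = [m if m >= median else random.choice(above) for m in methods]

print(f"average productivity after {PERIODS} periods: {sum(methods) / N_PRODUCERS:.2f}")
```

Cybersyn could report each unit’s output, but nothing in its design performed this tinker-and-copy step.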
I’m compressing a much more detailed argument and should probably stop here. None of this, incidentally, has to do with the paper I’m writing, since that one is about the theory of the firm. I should also add that Beer’s ideas are valuable and can be incorporated into a better model of economic planning, just not the way he went about it. The guy was brilliant but he didn’t know much economics.
UPDATE: Here are two more thoughts about Cybersyn.
A. For Beeristas, it should be disturbing that his Chilean model lacked a System II, in this case meaning there was no provision for horizontal communication between firms. All information flowed up and down, passing through the center. In fact, it was all System III—with no apparent Systems IV or V. System III is the element of command-based hierarchy.
B. Mechanical application of the viable systems model to whole economies is a dubious enterprise. The clearest evidence for this is the large role that markets play at present. Markets do not exemplify any of Beer’s systems beyond System I (direct activity of the units); they operate on a different basis. This doesn’t mean that markets are perfect or that planning is impossible, only that before you start postulating how economies need to be organized you ought to take a close look at how markets do this. Specifically, as I tried to explain above, markets accomplish several functions that are necessary to a modern economy but are not addressed by Cybersyn. Does this imply a division of labor? What division?
To put it in Beerian terms, Cybersyn is not an economic brain. What it approximates is the autonomic nervous system, in the sense that János Kornai and Béla Martos described it in Autonomous Control of the Economic System. It’s fine for a paramecium but rather limited for a human.
5 comments:
", since coordination is not the primary problem that economies, rather than firms, need to solve."
I'm not quite getting this claim. Given an extensive division of labor with a high degree of technical specialization, the economy breaks down into n sectors, each of which produces for the others "before" it produces any final consumption product. That's a reproduction scheme, which must be maintained in a balanced way as a constraint of the production system (which may or may not be market-mediated) if the economy is to maintain itself, let alone expand. When you add in underlying technical change, within and across cycles, the ratios of any reproduction scheme are changing, resulting in investment realization failures, the loss of balanced expansion, and periodic crises. So the "coordination problem" seems to me to be the central problem of economics, regardless of whether it is to be solved by markets, state planning or some mix of the two.
Yes, given a set of primary inputs, an input-output matrix and known final demand, it is entirely a coordination problem. And I agree that coordination is complex and consequential. Nevertheless, there are several instruments that can do a reasonable job of this: it's a problem we know how to solve.
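To be concrete, here is a minimal numerical sketch of that claim; the sectors and coefficients are invented for illustration. With the technical coefficients and final demand taken as given, coordination reduces to solving a linear system, the standard Leontief calculation.

```python
import numpy as np

# Toy 3-sector economy.  A[i, j] = units of good i needed to produce
# one unit of good j (the technical coefficients are taken as given).
A = np.array([
    [0.2, 0.3, 0.1],
    [0.1, 0.1, 0.4],
    [0.0, 0.2, 0.2],
])

# Final demand for each sector's output (also taken as given).
d = np.array([100.0, 50.0, 80.0])

# Gross outputs x must satisfy x = A @ x + d: each sector covers both
# intermediate uses and final demand.  Solving the Leontief system:
x = np.linalg.solve(np.eye(3) - A, d)

print("gross outputs:", np.round(x, 1))
# Check the reproduction constraint: output minus intermediate use = final demand.
print("final demand recovered:", np.round(x - A @ x, 1))
```

Everything interesting is hidden in the step where the matrix and the demand vector are "taken as given," which is what the next point is about.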
The deeper question is: what goods and services, with what qualities, should be produced, and in what manner? Our knowledge of this at any given time is limited and conditional. There is a need for continuous trial and error, trying out new products and methods, testing them against demand. Without this, a well-coordinated economy can become drastically unproductive. The Soviet economy collapsed not so much because the planners got the proportions wrong, but because its goods were shoddy and its methods backward. Think about the Lada, which even had an infusion of foreign technology. (The fundamental problem with the Lada was not that there weren't enough of them.)
When people talk about quality, by the way, it can sound Yuppie-ish: shinier, fancier, etc. But commodities go into producing commodities, and what defines the quality of an input is the process that requires it. Innovating in production is centrally about redefining what counts as input quality and eliciting this from suppliers.
Read the Kornai article I linked to. He describes an economy that succeeds, through continuous shortage and surplus feedback mechanisms, in achieving balanced output as "vegetative". I think he nails it.
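For a picture of what "vegetative" control amounts to, here is a toy version. It is my own simplification, not the Kornai-Martos equations: a producer watches only its own inventory and order book and nudges output to pull the stock back toward a norm, with no prices and no planner involved.

```python
import random

# Toy "vegetative" control in the spirit of Kornai and Martos (my own
# simplification, not their equations): a producer adjusts output using
# only local, non-price signals -- its own recent sales and inventory.

TARGET_STOCK = 100.0   # inventory norm the producer tries to hold
GAIN = 0.3             # how aggressively the stock gap is closed

stock = 40.0           # start well below the norm
last_sales = 50.0
for period in range(1, 21):
    # Output rule: repeat last period's sales, corrected by the stock signal.
    output = max(0.0, last_sales + GAIN * (TARGET_STOCK - stock))
    demand = random.uniform(45.0, 55.0)      # orders arriving this period
    sales = min(demand, stock + output)      # can't sell what isn't there
    stock = stock + output - sales
    last_sales = sales
    if period % 5 == 0:
        print(f"period {period:2d}: stock={stock:6.1f}  output={output:6.1f}")
```

The loop keeps the flow going, which is exactly the point of the label: nothing in it can introduce a new product, change a method, or judge quality.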
A thermostat controlling a furnace that heats a house was traditionally given as an illustration of cybernetic control: control by feedback. The mechanism inside the thermostat turns on the furnace when the temperature falls below a certain set point, and turns off the furnace when the temperature rises above a certain set point. If the temperature outside the house is always colder than the set point but not overwhelmingly cold, and the house remains reasonably well insulated, and the furnace has sufficient fuel and remains operational, the system will keep the temperature inside the house -- as measured by the thermostat -- within some range, near the set point.
There are a number of points about the cybernetic model of control that can be usefully highlighted from this illustration. One of the most fundamental is that there are two distinct feedback loops, with quite different characters. One is between the thermostat's set point and the operation of the furnace. That's the technical one, if you like a label. The other involves the person setting the thermostat, choosing the set point. That's the value loop, to apply a contrasting label.
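In code terms, the two loops look something like this toy sketch (the numbers and the crude heat-loss model are made up): the technical loop is the on/off rule; the value loop is whoever chooses the set point.

```python
# Schematic thermostat, to mark the two loops distinguished above.
# The *technical* loop is the on/off rule inside control_step; the
# *value* loop is whoever chooses SET_POINT and BAND in the first place.

SET_POINT = 20.0   # desired temperature (deg C), chosen by a person: the value loop
BAND = 0.5         # hysteresis band around the set point

def control_step(temperature, furnace_on):
    """Technical loop: turn the furnace on below the band, off above it."""
    if temperature < SET_POINT - BAND:
        return True
    if temperature > SET_POINT + BAND:
        return False
    return furnace_on   # inside the band, leave the furnace as it is

# A crude house: it loses heat to the outside and gains heat when the furnace runs.
temp, furnace_on, outside = 15.0, False, 5.0
for minute in range(120):
    furnace_on = control_step(temp, furnace_on)
    temp += (0.8 if furnace_on else 0.0) - 0.05 * (temp - outside)

print(f"after two hours: {temp:.1f} deg C, furnace {'on' if furnace_on else 'off'}")
```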
It is a hugely important distinction: it is the difference between steering a car (using the steering wheel and associated mechanism), and driving a car (to a purposefully chosen destination).
With sufficient computing power, we can imagine something like Google's "self-driving" car, and that raises important issues of its own, but it doesn't erase the distinction. What "automation" carried to the extreme of a self-driving car highlights is the extent to which the advances of the industrial revolution have depended on the ability to strategically substitute human craft for the shortcomings of whatever "system" of "automatic" control can be devised (within the limits of scientific knowledge). And, of course, the reverse is usually thought of as the dominant pattern of innovative capital investment: substituting cybernetic systems for the shortcomings of craft in the production of goods and services.
Human craft (at base, the huge and remarkably plastic human cerebellum supplying fine motor skills and "muscle" memory: that ability to learn to ride a bicycle, which is unconscious and, proverbially, never forgotten) blends together the technical loop and the value loop.
Increasing the autonomy of systems doesn't do away with the value loop. Taste still matters. Probably even to a paramecium.
Nice comment, Bruce. The point to bear in mind when we apply this distinction to the economy is that the value element is not some overarching sense of direction (driving) or thermostat setting; it is diffused within all the activities that take place in the economic realm. This is why economic planning is much, much harder, even conceptually, than most of the common metaphors allow.
Hi Professor Dorman, did you ever get a chance to work on that paper? I searched for it and couldn’t find anything.