## Friday, January 17, 2014

### "The long-debunked fallacy known as Say’s Law..." -- a perplexing guide to the perplexed

"You want bread? Bang a baker!"

Paul Krugman wonders: why don't da bunk stay debunked? He should ask a falleontologist (rhymes with paleontologist). When I was at Cornell, many decades ago, I came across an article and then a doctoral dissertation by Daniel Ellsberg that demonstrated the difference between risk and ambiguity. More recently I encountered a lovely explanation by Jeff Gill of the invalidity of the "probabilistic modus tollens" ("H₀" signifies the null hypothesis):
The basis of the null hypothesis significance test rests on the logical argument of modus tollens (denying the consequent). The basic strategy is to make an assumption, observe some real-world event, and then check the consistency of the assumption given this observation. The syllogism works like this:
If A then B | If H₀ is true then the data will follow an expected pattern
Not B observed | The data do not follow the expected pattern
Therefore not A | Therefore H₀ is false.
The problem with the application of this logic to hypothesis testing is that the certainty statements above are replaced with probabilistic statements, causing the logic of modus tollens to fail. To see this, reword the logic above in the following way:
If A then B is highly likely | If H₀ is true then the data are highly likely to follow an expected pattern
Not B observed | The data do not follow the expected pattern
Therefore A is highly unlikely | Therefore H₀ is highly unlikely.
Initially, this logic seems plausible. However, it is a fallacy to assert that obtaining data that are atypical under a given assumption implies that the assumption is likely false: almost a contradiction of the null hypothesis does not imply that the null hypothesis is almost false (Falk and Greenbaum 1995). For example (Cohen 1994; Pollard and Richardson 1987):
If A then B is highly likely | If a person is an American then it is highly unlikely she is a member of Congress
Not B observed | The person is a member of Congress
Therefore A is highly unlikely | Therefore it is highly unlikely she is an American.
From this simple little example and the resulting absurdity it is easy to see that if P(Congress|American) is low (the p-value), it does not imply that P(American|Congress) is also low.
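Gill's Congress example can be made concrete with rough numbers. The population and Congress figures below are approximations of my own choosing, used purely for illustration, along with the simplifying assumption that every member of Congress is an American:

```python
# Rough figures, for illustration only (not from Gill's article).
n_americans = 317_000_000   # approximate U.S. population, circa 2014
n_congress = 535            # members of the House and Senate combined

# Simplifying assumption: every member of Congress is an American.

# The p-value analogue: P(Congress | American) -- vanishingly small.
p_congress_given_american = n_congress / n_americans

# The quantity the fallacious inference pretends to deliver:
# P(American | Congress) -- which is 1 under the assumption above.
p_american_given_congress = 1.0

print(f"P(Congress | American) = {p_congress_given_american:.2e}")
print(f"P(American | Congress) = {p_american_given_congress:.2f}")
```

One conditional probability is on the order of a millionth; the other is certainty. Transposing the conditional, as the "probabilistic modus tollens" does, is exactly this mistake.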
The ambiguous subject matter of economic analysis is thus not once but twice removed from the logical syllogism of modus tollens. Is it any wonder that economists keep trying to shine their boots with poop? I repeat: risk is not ambiguity, lime is not coconut, probability is not logic, coconut is not hedgehog. Therefore, the hedgehog is not a lime.

What does this have to do with the lump of labor? Plenty. Dudley Dillard (1988) called Say's Law a corollary of the Wages-fund doctrine. John Wilson (1871) denounced a "trade unionists' version" of the by-then-discredited Wages-fund doctrine that later came to be known as the "Theory of the Lump of Labour" (Alfred Marshall dubbed his version of the lump of labor fallacy the fallacy of the fixed "work-fund," an obvious play on the old wages-fund notion). Raymond Bye, whose introductory economics textbooks were ubiquitous in the 1920s, 30s and 40s, denounced the lump of labor fallacy because it violated Say's Law. Jevons's Paradox... And so, on and on we go, round and round, where it stops nobody knows. Say's Law is/isn't Say's Law is/isn't the Wages-fund doctrine is/isn't the lump of labor fallacy is/isn't Say's Law.

Just remember: ambiguity is not risk, risk is not logic, ambiguity is not logic.