I am late to issue this RIP: John Horton Conway died on April 11, 2020, of COVID-19, having been born in England on Dec. 26, 1937. I was aware of his death when it happened, but have since become aware of things he did that I did not know about, and that has pushed me to post this.
Conway was one of the world's best known mathematicians, most famous for creating the Game of Life a half century ago in 1970, which Martin Gardner publicized in Scientific American. It is mostly what I knew him for: the canonical cellular automaton, a model capable of generating chaotic and unpredictable emergent outcomes within a Turing-complete framework, the sort of thing that goofy complexity theorists like me salivate over. It inspired similar models that have been used in nearly every science, and I am quite sure such models are being used in the current research push to find a vaccine for the disease that did Conway in. It was an enormous achievement and enormously useful. He deserves recognition for this alone.
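For readers who have never seen it, the whole Game fits in a few lines. Here is a minimal sketch in Python (my own illustration, not anything of Conway's): the grid is a set of live cells on an unbounded plane, a live cell survives with two or three live neighbors, and a dead cell comes alive with exactly three.

    # One step of Conway's Game of Life on an unbounded grid.
    # "live" is a set of (x, y) coordinates of live cells.
    from collections import Counter

    def step(live):
        # For every cell adjacent to a live cell, count its live neighbors.
        counts = Counter((x + dx, y + dy)
                         for (x, y) in live
                         for dx in (-1, 0, 1)
                         for dy in (-1, 0, 1)
                         if (dx, dy) != (0, 0))
        # Birth on exactly 3 neighbors; survival on 2 or 3.
        return {cell for cell, n in counts.items()
                if n == 3 or (n == 2 and cell in live)}

    # The "glider" travels one cell diagonally every four generations.
    glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
    for _ in range(4):
        glider = step(glider)

From rules this simple come gliders, oscillators, and ultimately a universal computer, which is the Turing completeness mentioned above.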
I never met him or even saw him speak, but by all accounts he was highly extroverted and lively, to the point of becoming, at least for a while, "the rock star of mathematics." Not unrelated to that, he invented an enormous array of games, none of which I have ever played, but apparently he would invent them on the spot as he interacted with people he met. Of course, in some sense the Game of Life is a kind of game, and Conway himself on more than one occasion claimed that doing mathematics is fundamentally a game.
I had known that he did a lot of work in other areas of math, but had not really checked it out in detail. I have recently become more aware of just how widely across math his work ranged and how important and innovative so much of it was. I shall not list all these areas and theorems and discoveries, as it is a long list that will probably be meaningless to most of you if it is just put out here, but for anybody who wants to see a pretty complete version of it, his Wikipedia entry provides a thorough one.
Anyway, I shall talk a bit more about a couple of the more out-there, high-level things he did that relate to topics my father and I have long been interested in. My late father was a friend of the late Abraham Robinson and someone Robinson consulted with at length when he developed non-standard analysis, presented in a book of that title in 1966. Non-standard analysis allows for the existence of superreal numbers that have infinite values, numbers larger than any finite real number. The reciprocals of these numbers are infinitesimals, numbers not equal to zero but smaller than any positive real number. These are ideas originated by Leibniz when he independently invented calculus; they allow one to view derivatives as ratios of such infinitesimals, an essentially more intuitive way of doing calculus.
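To make "derivatives as ratios" concrete, here is the standard move in Robinson's framework (in LaTeX notation; st(.) is the standard-part map, which rounds a finite hyperreal to the unique real number infinitely close to it, and epsilon is a nonzero infinitesimal):

    f'(x) = \operatorname{st}\left( \frac{f(x+\varepsilon) - f(x)}{\varepsilon} \right),
    \qquad \text{e.g.} \quad
    \operatorname{st}\left( \frac{(x+\varepsilon)^2 - x^2}{\varepsilon} \right)
    = \operatorname{st}(2x + \varepsilon) = 2x.

The infinitesimal is simply discarded at the end, which is exactly the move Leibniz made informally and that delta-epsilon proofs were later invented to avoid.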
This extension of the reals into transfinite and infinitesimal values led to further expansions of what might count as numbers, one being the hyperreal numbers, which can be constructed out of the superreals.
Then in 1976, in his book On Numbers and Games, Conway pushed this even further by conceptualizing surreal numbers, an even broader set that includes within it all of these sets: reals, superreals, and hyperreals. To get an idea of how one constructs these numbers, think of numbers represented by their decimal expansions, which are in effect sums of ever smaller numbers, although in the case of a whole number the numbers added after the first one are all zero. For surreal numbers, the terms that get added in the equivalent of an infinite decimal expansion can include powers of infinite and infinitesimal numbers, which leads to an incredible array of pretty strange numbers, think something like "infinity plus one," hence deserving of the name surreal numbers.
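Conway's actual construction is even simpler than decimal expansions suggest: every surreal number is a pair {L | R} of previously built numbers, with everything on the left less than everything on the right. A few standard examples (my illustration, in LaTeX notation):

    0 = \{\ \mid\ \}, \quad 1 = \{0 \mid\ \}, \quad \tfrac{1}{2} = \{0 \mid 1\},
    \quad \omega = \{0, 1, 2, \ldots \mid\ \}, \quad \omega + 1 = \{\omega \mid\ \},
    \quad \tfrac{1}{\omega} = \{0 \mid 1, \tfrac{1}{2}, \tfrac{1}{4}, \ldots\}.

That omega + 1 is literally the "infinity plus one" mentioned above, and 1/omega is an infinitesimal.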
Another of his preoccupations, one that reportedly occupied much of his attention in recent years, is the Monster group, a name Conway coined (it is also tied to "monstrous moonshine," with the "moonshine" part reflecting how "crazy" all this is, according to Conway). I confess that this object, whose existence has been proven, is something I do not fully understand, although apparently really understanding it was what had Conway so occupied. It is really very complicated.
So, drawing on a lot of ideas in abstract algebra, this object is a finite simple group, the largest of the 26 "sporadic" groups that fall outside all the regular families in the classification of finite simple groups, structures that have important connections with each other. Conway later gave a simplified construction of the Monster as the symmetries of an algebra living in a space whose dimensionality happens to be 196,884. Indeed.
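For the curious, the "moonshine" coincidence, spotted by John McKay, is that the Monster's smallest faithful representation has dimension 196,883 (the 196,884 above is that plus one trivial dimension), and the same number appears, for no obvious reason, as a coefficient of the modular j-function from number theory:

    j(\tau) = q^{-1} + 744 + 196884\, q + 21493760\, q^2 + \cdots,
    \qquad 196884 = 196883 + 1.

Conway and Norton conjectured in 1979 that this was no accident, and Richard Borcherds's proof of their conjecture helped win him a Fields Medal.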
OK, this sounds like something verging on lunacy, if not outright total lunacy, monstrous moonshine indeed, something of no use whatsoever. However, it turns out that maybe this monster is not as useless as it might seem. In 2007, the leading string theorist Edward Witten wrote a paper suggesting that the Monster group may actually be useful in understanding string theory. Sorry, I am not going to try to explain how this might be the case, but there it is: this incredibly complicated, mind-blowing idea may in fact have an application to understanding the deepest structures of the physical world, as string theory is the leading approach to a Grand Unified Field Theory of the universe.
So, while the Game of Life probably has more practical uses and applications, and is certainly a whole lot more accessible and comprehensible, Conway also operated at some of the most esoteric and difficult limits of mathematics, and some of the most difficult-to-understand of that work may well have practical applications as well.
So, RIP, John Horton Conway.
Barkley Rosser
Wow, your dad was involved in developing non-standard analysis, which is what they called it when I took a seminar in calculus through non-standard analysis my freshman year. I remember those crazy huge integers and their inverses that slipped in between a real number and its least upper bound. (Hey, this was back in '71, so some of it has slipped out of my non-standard brain.) I enjoyed the course, but I wasn't sure until a year or two later why it was better than doing delta-epsilon proofs.
BTW, Mathematics has always seemed a lot like avant garde art. A lot of the stuff seems way out weird and utterly useless until it is suddenly precisely the metaphor.
Kaleberg,
ReplyDeleteWhen you took that course, did you use the textbook by Jerome Keisler?
What a terrific essay.
I note that I know Jerry Keisler, who is still alive and was at Wisconsin for a long time, where my late old man played a role in getting him hired. That textbook has a red cover (I have a copy of it), but despite its greater intuitiveness, I gather intro calc is still largely taught with the clunky delta-epsilon proof approach. Jerry proved a major theorem about hyperreal numbers.
I also note that the major core theorems of standard neoclassical general equilibrium theory were redone using non-standard analysis in the 1970s, most notably by Robert Anderson of Berkeley and Donald Brown of Yale, one of the small number of African Americans doing high-powered math econ theory. This basically gives one the same final outcomes, just going at it differently.
Oh, and I might add that Conway was at Cambridge in the UK in his early career, where he got his PhD, but moved to Princeton in 1986, where he remained until his passing.
In the 19th century, when they formalized calculus and the real numbers, mathematicians, as I understand it, reached a consensus that infinitesimals were no-longer-needed non-rigorous nonsense. So non-standard analysis is quite a surprise. Model theory is something I will never understand.
The classification of finite simple groups is another thing I will never understand, even though I tried to write up what I understand the problem to be. Supposedly, Conway was one of a handful that understood the proof. This classification theorem is supposedly one of the triumphs of mathematics in the last half-century. It is also a good example for the philosophy of math. Is a proof a proof when it is distributed over thousands of journal papers? I do not know how far they have gotten in writing up the second generation proof in multiple volumes.