Tuesday, April 14, 2020

World Chess Championship Ends

The two best chess players in the world faced off this month, undeterred by lockdowns, travel bans or any other restrictions.  They never had to see each other either.

It helped that they were both computer programs.  The former champ, Stockfish, is the strongest of the traditional type of program, designed by humans and invested with all the fine points of judgment the best human players can translate into code.  Because of its tremendous calculating abilities, it is rated far higher than the top flesh-and-blood competitors: 3600 to 2800+ for Magnus Carlsen and his closest challengers.  (The numbers are measured on the Elo scale, named for physicist and chess enthusiast Arpad Elo, who developed it over 50 years ago.  Players' scores rise and fall based on how they do against other rated players.  I had the pleasure, long ago, of sipping homemade cordial at Arpad's modest home in Milwaukee.)
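
For the curious, the arithmetic behind those ratings is simple enough to fit in a few lines of Python.  This is just the textbook Elo update; the K-factor of 20 is an illustrative choice, not anyone's official value.

    def elo_update(rating_a, rating_b, score_a, k=20):
        # score_a is 1 for a win by player A, 0.5 for a draw, 0 for a loss.
        # Expected score for A under the Elo model.
        expected_a = 1 / (1 + 10 ** ((rating_b - rating_a) / 400))
        # Ratings shift in proportion to how surprising the result was.
        delta = k * (score_a - expected_a)
        return rating_a + delta, rating_b - delta

    # A 3600-rated engine beating a 2800-rated human gains almost nothing...
    print(elo_update(3600, 2800, 1))
    # ...while a loss would cost it nearly the full K.
    print(elo_update(3600, 2800, 0))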

But the new top performer is lc0, Leela Chess Zero, a pure implementation of machine learning.  No one told it how to calculate or evaluate; it played millions of games with itself and learned through experience how to make the best moves.  As a result, it has odd blind spots (poor appreciation for fortresses, for instance) but also finds strategies no human would ever consider.  It has a rating a little higher than Stockfish's, and the gap will be wider still after the latest match.
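
To give a feel for what "played millions of games with itself" means, here is a toy self-play loop in Python built on the python-chess library.  A random move picker stands in for Leela's neural network, so this only sketches the shape of the data-gathering step; it is nothing like the real training pipeline.

    import random
    import chess  # the python-chess library

    def random_policy(board):
        # Stand-in for the network: pick any legal move at random.
        return random.choice(list(board.legal_moves))

    def self_play_game(pick_move):
        # Play one game against itself; return every position seen,
        # plus the final result that will label those positions.
        board = chess.Board()
        positions = []
        while not board.is_game_over(claim_draw=True):
            positions.append(board.fen())
            board.push(pick_move(board))
        return positions, board.result(claim_draw=True)

    # A tiny "training set": each position tagged with its game's outcome.
    training_data = []
    for _ in range(3):
        positions, result = self_play_game(random_policy)
        training_data.extend((fen, result) for fen in positions)
    print(len(training_data), "labeled positions")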

They played 200 games.  Most were drawn, but Leela came out on top, 106-94.  The format was a series of two-game mini-matches in which the opening moves were preselected, and the programs had a chance to play the resulting position once from each side.  (One opening was botched by the organizers; it incorporated a blunder that, at this level, ensured a win for Black.  Since each program duly won its game on the Black side of that opening, the pair cancelled out: it didn't change the final spread, but it effectively made the match 198 games rather than 200.)
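
For what it's worth, the 106-94 score is just the usual tally of one point per win and half a point per draw.  Here is a quick Python check; the split of wins, losses and draws below is made up for illustration, not the actual game-by-game record.

    def tally(results):
        # Per-game outcomes, each given from Leela's point of view.
        points = {"win": 1.0, "draw": 0.5, "loss": 0.0}
        leela = sum(points[r] for r in results)
        stockfish = len(results) - leela
        return leela, stockfish

    # Hypothetical split: 17 Leela wins, 5 losses and 178 draws
    # also works out to 106-94.
    sample = ["win"] * 17 + ["loss"] * 5 + ["draw"] * 178
    print(tally(sample))  # (106.0, 94.0)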

You can see the whole match here.  Make sure you click on "View Crosstable".  If you see nothing else, check out game 169, which I predict will be regarded as some kind of watershed.  Stockfish, with superhuman calculation chops, rated over 700 points higher than the nearest human, thought it had a pull when it made its 15th move, simultaneously threatening a bishop and a pawn.  Then Leela came back with 16. f5, and immediately Stockfish re-evaluated, its assessment dropping radically.  If computers can have an "ooooh shit" moment, this was it.  And the move can only be called extraordinary.  Leela simply gives up the bishop with no clear follow-up or compensation, yet somehow, several moves later, Stockfish finds there is no defense.

This game is amazing in its own right, but it also expresses the "romanticism" of Leela's self-taught playing style.  She is cavalier about material, much preferring easier play and more scope for her pieces to having more "stuff".  She can't be bothered to keep track of who has the most pawns, and she makes exchange sacrifices with abandon.  If god has come down from silicon heaven to show us how the game should be played, she turns out to be a lot wilder than we expected.

1 comment:

Anonymous said...

I know nothing about chess, but this is fascinating in terms of the description of machine learning.