Chess will never be solved, here's why

MARattigan
The_Nameless_Artist wrote:
MARattigan wrote:
tygxc wrote:

...

#14
Chess even stays a draw if stalemate = win.
The paper shows that the draw rate increases with more time.
Compare figure 2 (a) and (b).

The time spent by AlphaZero also means nothing in terms of theoretically accurate play.

The paper doesn't show that chess stays a draw even if stalemate = win. It shows that Alpha Zero can't see a forced win in either case.

But AlphaZero or any current engine has a very limited look ahead capability.

I can't run AlphaZero but here is LC0, based on the same kind of software, playing a drawn position with only five men on the board. It blunders into a loss on its second move. It's not even a very accurate blunder: the longest forced mate with the black pawn on h3 is 71 moves, but it blunders into a mate in 56 (according to Nalimov) and then collapses in 21 instead.

[embedded game]

If Haworth's law (http://centaur.reading.ac.uk/36276/3/HaworthLaw.pdf) continues to hold up to 32 men, there would be winning positions where the forced mates need at least tens of trillions of moves against accurate defence. The performance of anything that plays chess in much simpler positions such as the one above hardly inspires confidence in its assessments.

what

What what?

If you're querying the tens of trillions I have to admit to some unreliability on the back of my envelope. I've run it through Javascript and it should have read 3 trillion.
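
For anyone who wants to redo the envelope, the shape of the calculation is roughly the sketch below. The constant growth factor per extra man is a simplification of mine rather than the paper's actual fit, and 549 is the longest known 7-man mate, so treat the output as order-of-magnitude only.

// crude extrapolation of longest-mate growth - NOT Haworth's actual formula
const longest7 = 549;                                  // longest known 7-man mate (moves)
function extrapolatedLongestMate(men, factorPerMan) {
  return longest7 * Math.pow(factorPerMan, men - 7);
}
for (const factor of [2.0, 2.25, 2.5]) {
  console.log(factor, extrapolatedLongestMate(32, factor).toExponential(2));
}
// growth factors between 2 and 2.5 per man already put the 32-man figure
// between roughly 2e10 and 5e12 moves - tens of billions to trillions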

Elroch

I'm going to be completely balanced and say that, in principle, it is possible (but by no means obvious) that one side could generate a strategy that was not only optimal but also avoided large regions of the set of possible material balances (such as more extreme over-promotions).

The problem is that that is merely a hypothesis, based on inadequate intuition, and may be false.

To give an idea why it might be false, it's really about the big numbers. Say you have 10^30 positions to solve (some part of the space of positions in your strategy) and you want to avoid black having, say, 4 knights. This might be possible to ensure almost all of the time, but it's no good achieving it 99.9999999999999999% of the time. If the other side has even one chance to be a pain and underpromote multiple times to knights (or something more extreme), that still has to be dealt with. It might be that the defender needs a million times more positions to be able to wangle another underpromotion in one of them, but if so, allowing 5 could be unavoidable with 10^30 positions.
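
To put the numbers above into a one-line calculation (the 1 in 10^18 rate is just the complement of the percentage above, used for illustration):

// even a fantastically rare awkward defensive resource is expected to
// turn up a lot across a huge strategy
const positionsInStrategy = 1e30;   // positions the strategy has to cover
const awkwardRate = 1e-18;          // i.e. avoided 99.9999999999999999% of the time
console.log(positionsInStrategy * awkwardRate);   // 1e12 - a trillion places it still has to be handled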

What would be more likely to be achievable would be to avoid unusual excess promotions on both sides. For example, it seems reasonable that if there is a drawing strategy for a side, there is a drawing strategy that only involves promotions to queens (the reasoning being that underpromotions appear fairly unusual in best play (<1% of games?), so unless the path to a draw is very narrow you can probably find a part of it that doesn't have any for your side). So that would reduce the range of material balances significantly, if nowhere near as much as a certain person here wants.

Elroch
MARattigan wrote:
The_Nameless_Artist wrote:
MARattigan wrote:
tygxc wrote:

...

#14
Chess even stays a draw if stalemate = win.
The paper shows that the draw rate increases with more time.
Compare figure 2 (a) and (b).

[snip]

If Haworth's law (http://centaur.reading.ac.uk/36276/3/HaworthLaw.pdf) continues to hold up to 32 men, there would be winning positions where the forced mates need at least tens of trillions of moves against accurate defence. The performance of anything that plays chess in much simpler positions such as the one above hardly inspires confidence in its assessments.

what

What what?

If you're querying the tens of trillions I have to admit to some unreliability on the back of my envelope. I've run it through Javascript and it should have read 3 trillion.

What about that surprising result that the longest mate discovered with 8 pieces was shorter than the longest one in the (complete) 7-piece tablebase? Is that still true?

See https://www.chess.com/blog/Rocky64/eight-piece-tablebases-a-progress-update-and-some-results

MARattigan
tygxc wrote:

#1747
Definitions are clear:
"weakly solved means that for the initial position a strategy has been determined to achieve the game theoretic value against any opposition"
https://web.archive.org/web/20170912011410/https://pdfs.semanticscholar.org/8296/bc0ab855841088b31190c9f2923951853d7b.pdf 
...

Actually, definitions in the links you have so far provided are not clear.

The first link you provided here was Wikipedia

Weak

Provide an algorithm that secures a win for one player, or a draw for either, against any possible moves by the opponent, from the beginning of the game. That is, produce at least one complete ideal game (all moves start to end) with proof that each move is optimal for the player making it.

This gives two incompatible definitions in its two sentences.

If the game is basic rules chess but with arts. 2.2 and 2.3 replaced to give this as the starting position

[diagram]

then the algorithm for White 

Play 1.Qe7

secures a draw for White against any possible moves by Black, so counts as a weak solution according to the first sentence. But it doesn't lead to a complete ideal game with proof that each move is optimal for the player making it, so doesn't count as a weak solution according to the second sentence.

If instead the rules are adjusted to make this the starting position

[diagram]

and an extra art. is added to the effect that if a king steps onto a corner square the board is replaced by a large block of chocolate with a poisoned corner square and the game continues (with the opponent to move) according to the rules of Chomp, then the game

1.Nc3+ Kc1

2.Ne2#

is a complete game where it's possible to prove that each move is optimal for the player making it, but it doesn't lead to an algorithm that secures a win for one player, or a draw for either, against any possible moves by the opponent, from the beginning of the game, so it's a weak solution according to the second sentence, but not the first.

If the starting position is amended to 

[diagram]

then the moves 

2n+1. Kh8

2n. Kg8

n ∈ ℕ

together with any choice of legal responses for black provide a complete game in which it can be proved that each move is optimal for the player making it, so it is a weak solution according to the second sentence and it also provides an algorithm that secures a draw for White against any possible moves by Black, so counts as a weak solution according to the first sentence too. 

But it's not what anyone else would recognise as a weak solution, and neither are the previous.

The second link you provided here to this paper

defines "weakly solved", as you say, by

for the initial position a strategy has been determined to achieve the game theoretic value against any opposition

If the starting position is taken to be

[diagram]

then the strategy for White

Play 1.Qa7+

achieves the game theoretic value against any opposition. But again it's not what anyone would recognise as a weak solution.

Edit: added black pawn c7 to correct last diagram.

MARattigan
Elroch wrote:
MARattigan wrote:
The_Nameless_Artist wrote:
MARattigan wrote:
tygxc wrote:

...

#14
Chess even stays a draw if stalemate = win.
The paper shows that the draw rate increases with more time.
Compare figure 2 (a) and (b).

[snip]

If Haworth's law (http://centaur.reading.ac.uk/36276/3/HaworthLaw.pdf) continues to hold up to 32 men, there would be winning positions where the forced mates need at least tens of trillions of moves against accurate defence. The performance of anything that plays chess in much simpler positions such as the one above hardly inspires confidence in its assessments.

what

What what?

If you're querying the tens of trillions I have to admit to some unreliability on the back of my envelope. I've run it through Javascript and it should have read 3 trillion.

What about that surprising result that the longest mate discovered with 8 pieces was shorter than the longest one in the (complete) 7-piece tablebase? Is that still true?

See https://www.chess.com/blog/Rocky64/eight-piece-tablebases-a-progress-update-and-some-results

The 8-man tablebases are DTC, so the lengths of the mates are variable and you don't get a maximum mate length. Only a small percentage of the classifications have been produced so far.

They will, when complete and in conjunction with the 7-man Nalimov/Lomonosov tables, give an upper bound on the longest mates.

The endgames with the longest DTC values don't necessarily correspond with the longest mates because the DTC values relate only to the length of the first phase. 
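
To spell out how that bound would work (the 8-man DTC figure below is purely hypothetical; 549 is the longest mate in the complete 7-man DTM tables):

// upper bound on an 8-man mate length from a DTC table plus the 7-man DTM tables
const dtc8 = 400;        // hypothetical DTC for some 8-man winning position
const maxDtm7 = 549;     // longest mate anywhere in the complete 7-man DTM tables
// a DTC-optimal line forces conversion within dtc8 moves, leaving at most 7 men;
// from there the mate takes at most maxDtm7 further moves
console.log(dtc8 + maxDtm7);   // 949 - an upper bound, not the actual mate length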

For the tables Marc has produced, the pawnless 8-man DTC values are less than the longest 7-man DTC values.

This comes as no surprise to me. The longest pawnless mate depths rise faster than exponentially between 3 and 6 men. From 6 to 7 men the increase collapses straight back to the increase from 3 to 4 men. This could be viewed as another successful prediction of Haworth's law, because had the increase not collapsed the longest mate for 7 men would have substantially exceeded Haworth's predicted value.

In fact, so far only one longest mate for n men has turned out to be pawnless. The ratio of pawnless to pawny endgame classifications decreases sharply as the number of men increases, so the 6-man longest, the KRNvKNN ending, could turn out to be the last and only pawnless one.

Marc has produced about 15% of the pawnless classifications (plus a few others). That includes all the pawnless endgames with a standard notional difference in material of 4 pawns or less, except possibly the endgames

KQQvKRRRB
KQQvKRRRN
KQQRvKQQR
KQQBvKQQB
KQQBvKQQN
KQQNvKQQN
KQRBvKQRB
KQRBvKQRN
KQRNvKQRN
KQBBvKQBN
KQBNvKQBN
KQBNvKQNN
KRRBvKRRN
KRBNvKRBN
KRBNvKRNN

which don't appear in his spreadsheet. Whether these have been produced but never made it to the spreadsheet or have yet to be produced I don't know, but some of them look like they could be a bit beefy. 

I'm still expecting the longest 8 man mate to exceed 1200 moves when the DTM tables are eventually produced.

Obviously all that relates to the basic rules game.

playerafar

That's really getting into it there.
The issue of increasing difficulty as more pieces are added.
How much it increases - the 'ratio of difficulty' would have to increase too ...  as would the ratio of new positions to previous ones.  

Regarding the original 'two Kings only' positions - it would be 'easy' to repeat positions.
But the idea would be to set it up so it can accomodate the progressing tablebases properly.   That's much harder than I thought.

Elroch

This forum just disappeared from my my_posted_in_topics list.

An annoying new bug in the chess.com forums. (Posting in it puts it back in the  list).

playerafar
haiaku wrote:
tygxc wrote:

#1747
Definitions are clear:
"weakly solved means that for the initial position a strategy has been determined to achieve the game theoretic value against any opposition"
https://web.archive.org/web/20170912011410/https://pdfs.semanticscholar.org/8296/bc0ab855841088b31190c9f2923951853d7b.pdf 
The game theoretic value of chess is a draw.
Drawing for white is easier in chess than drawing for black due to the extra tempo.
A strategy for chess consists of a game tree.
Weakly solved for chess thus means that from the initial position for all (reasonable) white moves at least 1 black move has been found that draws.
That is also how Checkers and Losing Chess have been weakly solved.

Please, you cannot stretch things as you like. To claim that the game is weakly solved, one has  to prove that the game value is a draw, not infer that from the experience we have gathered so far. And one needs to prove that the optimal strategy provides at least the game-theoretic value against all the opponent's possible moves, not only the "reasonable" ones. Is "reasonable move" defined in game theory?


I'm getting the impression that @tygxc may be stretching or misinterpreting the meaning of the word 'nodes'.
If you say "10^9 nodes computer" and then divide that into the number of positions to be solved ...  that's invalid.
Is the claim to be that computers can solve a billion positions per second - simply by using the term "10^9 nodes computer"  ???
nodes must be distinguished from ops.
At one petaflop - a computer performs 10^15 ops per second.
At that speed it could only devote about a million ops each to 10^9 'nodes'.
In one second.
10^9 is a billion.
Calling a position a 'node' has probably slipped past many readers here.
A million ops might be enough to generate and crudely evaluate a position - but nowhere near enough to work through its future move sequences - let alone further than that 'solve' it.
If the computer ran at 1000 petaflops (the first machines of that speed are only just appearing)
then it would still only have a billion ops per chess position (node).
If by some enormously generous arbitration that was seen as 'enough' to solve ....   then in one year - at 10^9 positions per second - it could solve around 32 million billion positions.
There are over 31 million seconds in a year.
But that's only about 32 x 10^15 positions solved in a year.
If you've got 32 x 10^38 positions to solve then that's going to take you
100 billion trillion years.
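
Spelled out with the same generous assumptions (a billion positions genuinely 'solved' every second, 32 x 10^38 positions to get through):

// back-of-the-envelope for the figures above
const positionsPerSecond = 1e9;          // the generously granted 'solving' rate
const secondsPerYear = 31.6e6;           // a bit over 31 million
const solvedPerYear = positionsPerSecond * secondsPerYear;      // ~3.2e16, i.e. ~32 million billion
const positionsToSolve = 32e38;
console.log(solvedPerYear.toExponential(1));                              // "3.2e+16"
console.log((positionsToSolve / solvedPerYear).toExponential(1) + " years");  // "1.0e+23 years"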

MARattigan
Elroch wrote:

This forum just disappeared from my my_posted_in_topics list.

An annoying new bug in the chess.com forums. (Posting in it puts it back in the  list).

Did that to me too.

playerafar

It was doing that to me a lot too.
So I just started clicking the white Connect button at left -
and then 'Forums' and then Following at right.

tygxc

#1776
FLOPs are Floating Point Operations, those play no role in chess, as chess only requires boolean operations and small integers.

"nodes per second is the number of positions the engine considers in one second"

https://chessify.me/blog/nps-what-are-the-nodes-per-second-in-chess-engine-analysis#:~:text=Nodes%20per%20second%20(NPS)%20is,thousand%20or%20one%20million%20respectively.

Thus with 10^9 nodes per second: number of positions = time in nanoseconds

tygxc

#1772
The official definition is that of game theorist van den Herik.
The Wikipedia article explains in layman's terms what it means.

playerafar
tygxc wrote:

#1776
FLOPs are Floating Point Operations, those play no role in chess, as chess only requires boolean operations and small integers.

"nodes per second is the number of positions the engine considers in one second"

https://chessify.me/blog/nps-what-are-the-nodes-per-second-in-chess-engine-analysis#:~:text=Nodes%20per%20second%20(NPS)%20is,thousand%20or%20one%20million%20respectively.

Thus with 10^9 nodes per second: number of positions = time in nanoseconds

FLOPS play an obvious role in chess solving as they measure the speed of the computers involved.
How could you say they have no 'role'?

and you've just confirmed again what I suggested -
you're claiming that computers could 'consider' 10^9 positions per second ...
and that's what it is.  A 'claim'.
Why didn't you pick 10^20 positions per second?

If you're claiming that computers can solve 10^9 positions per second as an argument to support 'chess solved in five years' ...  it's not much of an argument.
As to why previous posters didn't rip that to shreds already ...
could be because the term 'nodes' slipped by.
'Nodes' looks like a measure of speed - like FLOPS ...
but it isn't.  It's an unsubstantiated assertion of capability about 'positions'.
Very very different.  

It would be like saying:  "Well my racing car engine develops 10,000 rpm and we can develop 600 mph on the straightaways easy!"
Or:  "We've got a million tons per hour steam shovel."

It's a contrivance.  I wonder if the website involved is really claiming that.
Perhaps they're claiming that in special circumstances they've been able to 'solve' some positions at that rate.  For a little while.
R against P positions might solve at that rate ...
Positions one move backwards from already-generated checkmates would 'solve' pretty fast!

playerafar

Let's say that some computers really could 'consider' a billion positions per second.
That could still be Intensely Misleading.
That could mean - that in trying to solve a single position - they could 'consider' a billion positions that are variations on that position - at various depths.
That is Completely Different from 'solving' a billion positions per second!
I even 'gave' you that a few posts ago - and it Still would take many Trillions of Years!

I had a wooden whistle.  Didn't work.
Switched to a metal one -   and it Steel Wooden Whistle !

tygxc

#1781
"1 million kN/s and even higher if needed"
1 million kN/s = 10^9 nodes / second
https://chessify.me/blog/nps-what-are-the-nodes-per-second-in-chess-engine-analysis 

FLOPS are floating point operations per second. That is like dividing the square root of 2 by pi. Such floating point operations play no role in chess. Chess only requires boolean operations and small integers. Operations on booleans and on small integers are much faster than operations on floating point numbers like pi. Thus nodes / second is the appropriate measure for the speed of chess engines. 
"nodes per second is the number of positions the engine considers in one second"
To consider is not only to generate the daughter FENs from the mother FEN, but also to apply some evaluation, however rudimentary.
From the TCEC superfinals we know that Stockfish with more nodes / second and a simpler evaluation function could defeat LC0 with fewer nodes / second and a more sophisticated evaluation function, because it hit the 7-man endgame tablebase more often and earlier. That is with limited calculation time. With ample calculation time like 60 h / move this becomes more pronounced, as the tablebase is hit more and that returns the exact evaluation draw / win / loss.

tygxc

#1761
"to solve checkers, only the square root of the search space has been checked?"
https://www.researchgate.net/publication/231216842_Checkers_Is_Solved 
https://www.cs.mcgill.ca/~dprecup/courses/AI/Materials/checkers_is_solved.pdf 
State space: 500,995,484,682,338,672,639 positions
there are 10^7 positions in the stored proof tree
each representing a search of 10^7 positions

While we are at it:
Losing Chess:
http://magma.maths.usyd.edu.au/~watkins/LOSING_CHESS/ICGA2016.pdf
"Our final proof size is approximately 900 million positions"
Chess has 10^36 legal and sensible positions.
900 million is about the 4th root of this
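
Checking just the arithmetic of that 4th-root comparison:

// is 900 million about the 4th root of 1e36?
const proofSize = 9e8;                                  // ~900 million positions in the Losing Chess proof
console.log(Math.pow(proofSize, 4).toExponential(1));   // "6.6e+35" - within a factor of two of 1e36
console.log(Math.pow(1e36, 0.25).toExponential(1));     // "1.0e+9" - the exact 4th root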

"Sveshikov used 'close' chess, but I guess this is a poor translation of 'solve'."
I do not have the original Russian version of the quote, nor is my Russian any good. The fact that 'close' is in quotation marks may indicate that the translator did not know the right word, which is 'solve' in this context.

"Once 1 e4 and 1 d4 are proven to draw, then there is no doubt that upon request a similar proof with the same method and effort can prove the draw against 1 a4 and 1 a3."

Even a pure mathematician should be able to understand that pieces in the center are more powerful than pieces on the edge. It is thus logical that 1 a4 and 1 a3 are no better than 1 e4 or 1 d4. Thus if black has a proven draw against 1 d4 and 1 e4, then it is trivial to prove a draw against 1 a4 and 1 a3 as well by the same method.
"From the outset two moves, 1.e4 or 1.d4, open up lines for the Queen and a Bishop. Therefore, theoretically one of these two moves must be the best, as no other first move accomplishes so much. " - Capablanca
This has even been corroborated in this peer-reviewed scientific paper without any human input but the rules of chess: see figure 31
https://arxiv.org/pdf/2111.09259.pdf 


"Once 1 e4 e5 Nf3 is proven a draw, then there is no doubt that upon request a similar proof with the same method and effort can prove at least a draw against 2 Ba6. Even more it is not even necessary to go through that effort: we know for sure 1 e4 e5 2 Ba6 loses for white even without a full calculation to checkmate."
Are you seriously doubting that 1 e4 e5 2 Ba6? loses for white?

haiaku
tygxc wrote:

#1761
"to solve checkers, only the square root of the search space has been checked?"
https://www.researchgate.net/publication/231216842_Checkers_Is_Solved 
https://www.cs.mcgill.ca/~dprecup/courses/AI/Materials/checkers_is_solved.pdf 
State space: 500,995,484,682,338,672,639 positions
there are 10^7 positions in the stored proof tree
each representing a search of 10^7 positions

Do not pretend not to understand

haiaku wrote:

Now can I ask you where did you find that, to solve checkers, only the square root of the search space has been checked?

 

tygxc

#1785
I had this from this publication:
10^9 positions out of 10^18
http://library.msri.org/books/Book29/files/schaeffer.pdf 
10^7 * 10^7 = 10^14, which would be an exponent of 0.67
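
Presumably the 0.67 is taken against the full checkers state space quoted earlier (about 5.01 x 10^20) - checking:

// exponent such that (state space)^exponent = positions actually searched
const searched = 1e7 * 1e7;                      // 1e14
const checkersStateSpace = 5.00995e20;           // 500,995,484,682,338,672,639 positions
console.log(Math.log10(searched) / Math.log10(checkersStateSpace));  // ~0.676
// the older 1e9-out-of-1e18 figure is exactly a square root:
console.log(Math.log10(1e9) / Math.log10(1e18));                     // 0.5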

MARattigan
tygxc wrote:

#1776
FLOPs are Floating Point Operations, those play no role in chess, as chess only requires boolean operations and small integers.

"nodes per second is the number of positions the engine considers in one second"

https://chessify.me/blog/nps-what-are-the-nodes-per-second-in-chess-engine-analysis#:~:text=Nodes%20per%20second%20(NPS)%20is,thousand%20or%20one%20million%20respectively.

Thus with 10^9 nodes per second: number of positions = time in nanoseconds

Read "number of positions ≠ time in nanoseconds".

haiaku
tygxc wrote:

#1785
I had this from this publication:
10^9 positions out of 10^18
http://library.msri.org/books/Book29/files/schaeffer.pdf 
10^7 * 10^7 = 10^14, would be an exponent 0.67

 

haiaku wrote:

Now can I ask you where did you find that, to solve checkers, only the square root of the search space has been checked?

Page and line, please.