Chess will never be solved, here's why

playerafar

There seems to also be a contradiction in place -
between 'generates' and 'considers'.
The term 'generates' would make Much more sense and fit much better with 'nodes per second' than 'considers'. 
But 'generates' isn't making a single little Dent in 'solves' ...
not even a scratch on the paint.  Not a ding.
Not even in 'weak solving'.
even though it's much more valid-looking in this context - than 'considers'.

tygxc

#1855
Tickler should know the difference between floating-point operations, integer operations and boolean operations. Solving chess requires not a single floating-point operation. Not 100, but 0.

Tickler's post is wrong on other counts too, such as its adherence to 10^44: all randomly sampled positions found legal by Tromp contain multiple underpromotions to pieces not previously captured, which cannot occur in a reasonable game with reasonable moves and thus not in an ideal game with optimal moves.

"Regarding taking the square root of the number"
Weakly solving chess requires considering fewer positions than strongly solving chess.
Losing Chess has been weakly solved by considering only 900,000,000 positions.

tygxc

#1856
Please see the reference article; that is what it states.
Make no mistake: those cloud engines of over 10^9 nodes / s are beasts.
They are 1000 x faster than a desktop, which typically reaches 10^6 nodes / s.
That is not excessive: a factor of 1000 between a very fast cloud engine and a desktop is entirely plausible.

MARattigan
tygxc wrote:

#1855
Tickler should know the difference between floating-point operations, integer operations and boolean operations. Solving chess requires not a single floating-point operation. Not 100, but 0.

Are you saying that the operating system, with which all applications interact, or the language(s) or NNUE functions used in SF14, your main vehicle as I understand it, don't use floating-point instructions? Can you convincingly show that?

Just because you plan to code no floating-point operations doesn't necessarily mean they won't happen.

The point about FLOPs is that computer manufacturers usually aim to balance speed between general, fixed point and floating point operations for usual applications of the type run in their prospective market. While computer speed is not a simple subject, FLOPs can usually be taken as a rough guide. No manufacturer quotes chess nodes per second. 

Tickler's post is wrong on other counts too, such as its adherence to 10^44: all randomly sampled positions found legal by Tromp contain multiple underpromotions to pieces not previously captured, which cannot occur in a reasonable game with reasonable moves and thus not in an ideal game with optimal moves.

@btickler is not wrong. You are wrong. Firstly, "reasonable" and occurring in a weak solution are not connected and, secondly, occurring in a weak solution and occurring in your proposed method of finding one are connected only in that the former is a minuscule subset of the latter.

"Regarding taking the square root of the number"
Weakly solving chess requires considering fewer positions than strongly solving chess.
Losing Chess has been weakly solved by considering only 900,000,000 positions.

Ergo, you can assume the square root???

 

playerafar

More and more - the term 'nodes' might be proven to be entirely invalid in the context of the forum topic.
What is the speed of your computer in 'nodes' per second as they pertain to the single opening position of chess?
The speed is Zero.  It cannot even do one node in a second or even in a year.  On that position.

Suggestion:  don't tell us about desktops.
You want boolean or integer.
Why don't you post at what speed the computer attacks any position in those operations per second?
I don't think you're going to.

tygxc

#1860
"The square root jump looks like a kind of 'cheating' to me."
Losing Chess has been solved by considering only 900,000,000 positions; that is not even the 4th root of 10^36.

"speeds of the computers in those units instead of jumping to 'nodes'?"
Because time to solve chess = number of positions to consider x time to consider 1 position
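As an arithmetic sketch of that formula (both inputs are round illustrative assumptions of the kind debated in this thread, not established values; whether 10^17 is anywhere near the right position count is exactly what is in dispute):

```python
# tygxc's formula: time to solve = positions to consider x time per position.
# Both figures below are hypothetical round numbers, not established facts.
positions_to_consider = 10**17      # assumed weak-solution search space
nodes_per_second = 10**9            # the claimed cloud-engine speed
seconds = positions_to_consider / nodes_per_second
years = seconds / (365 * 24 * 3600)
print(f"{seconds:.0e} s = about {years:.1f} years")
```

Change the assumed search space by even a couple of orders of magnitude and the answer swings from years to millennia, which is why the exponent matters so much.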

playerafar
tygxc wrote:

#1860
"The square root jump looks like a kind of 'cheating' to me."
Losing Chess has been solved by considering only 900,000,000 positions; that is not even the 4th root of 10^36.

"speeds of the computers in those units instead of jumping to 'nodes'?"
Because time to solve chess = number of positions to consider x time to consider 1 position

You aren't telling us the reason you refuse to post the speed of the computer in boolean or integer units.
That's becoming more obvious now.  
And using a fourth root is even more blatant 'cheating' than using a square root.

tygxc

#1861
My desktop considers around 10^6 nodes / s.
You can see the number on your desktop when you run an engine on it.
That a cloud engine is 1000 times faster should be no surprise.

Elroch

Losing chess is an even poorer guide to difficulty than checkers because it has a winning strategy. I explained why this can be expected to provide a lower exponent.

Checkers is more directional than chess (it's like it only has pawns!) which is also likely to provide a lower exponent than chess.

So don't be confident of needing to evaluate fewer than (10^44)^(2/3) positions, say.
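A quick check of what Elroch's more cautious exponent does to the count, in pure log arithmetic (nothing here beyond the exponents already named in the thread):

```python
# Compare the disputed square-root assumption with a 2/3 exponent,
# working in log10 to keep the numbers readable.
total_log10 = 44                        # Tromp's ~10^44 legal positions
sqrt_log10 = total_log10 / 2            # square-root assumption: 10^22
two_thirds_log10 = total_log10 * 2 / 3  # Elroch's exponent: ~10^29.3
gap_log10 = two_thirds_log10 - sqrt_log10
print(f"10^{two_thirds_log10:.1f} vs 10^{sqrt_log10:.0f}: "
      f"about 10^{gap_log10:.1f} times more positions")
```

A seemingly modest change of exponent, from 1/2 to 2/3, multiplies the work by a factor of roughly ten million.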

playerafar
MARattigan wrote:
tygxc wrote:

#1855
Tickler should know the difference between floating-point operations, integer operations and boolean operations. Solving chess requires not a single floating-point operation. Not 100, but 0.

Tickler's post is wrong on other counts too, such as its adherence to 10^44: all randomly sampled positions found legal by Tromp contain multiple underpromotions to pieces not previously captured, which cannot occur in a reasonable game with reasonable moves and thus not in an ideal game with optimal moves.

@btickler is not wrong. You are wrong. Firstly, "reasonable" and occurring in a weak solution are not connected and, secondly, occurring in a weak solution and occurring in your proposed method of finding one are connected only in that the former is a minuscule subset of the latter.

"Regarding taking the square root of the number"
Weakly solving chess requires considering fewer positions than strongly solving chess.
Losing Chess has been weakly solved by considering only 900,000,000 positions.

Ergo you can assume the square root???

 


I don't think it's necessary to even know that @MARattigan and @btickler are professional programmers with credentials and experience here -
because their posts are obviously more objective, informed and accurate, and they don't seem to want to hide information.

Halving the number of digits in the count of positions cuts that count down far more than dividing it by 2 ...
and it looks like cheating to me.
Square root does exactly that. With a number like 9 you're only cutting it down by two thirds when you take the square root.
But with a number like a trillion ... when you take the square root -
you're cutting it down to a millionth of what it was.
It may not look that bad - because the number of zeroes behind the '1' is still 6 instead of 12. But it's 'bad' - real 'bad'.

Analogy:  You're buying a new car.
Salesman:  "The price of the car is $40,000."
Customer:  "I want a discount.  And I was born in the same town you were."
Salesman:  "Okay - we'll charge you the square root of the price"
$200.
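The arithmetic in both illustrations checks out; a minimal sketch using exact integer square roots:

```python
import math

assert math.isqrt(9) == 3             # 9 -> 3: cut by two thirds
assert math.isqrt(10**12) == 10**6    # a trillion -> a millionth of itself
assert math.isqrt(40_000) == 200      # the $40,000 car "discounted" to $200

# The larger the number, the more savage the square-root cut:
reduction_factor = 10**12 // math.isqrt(10**12)
print(reduction_factor)
```

The same effect is why taking the square root of 10^44 is not a modest trim but a cut by a factor of 10^22.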

But whoever is pushing 'five years to weakly solve chess' while at the same time refusing to post how fast/slow the computers involved are in boolean operations per second, integer operations per second or FLOPs ...
maybe he doesn't know?
Why push if there's critical missing information?
Doesn't care? Then why push?

Another piece of information missing - and this next one can't be blamed on anybody - how much computer time needed to solve just one position?
Mathematician:   "Insufficient Data.  What position are you talking about?  There's a lot of variation.  Putting it mildly."

DiogenesDue
Optimissed wrote:

You seem to be gradually arriving at what I was pointing out 1000 posts ago. Well done! Getting there fast!

Actually, this was an update to an argument I made in 2017.  I just updated the FLOPS to 422 to reflect the fastest supercomputer of 2021.  Glad you finally caught up, though.

DiogenesDue
Elroch wrote:
Optimissed wrote:

I still think that the best way is games and not positions, 

[snip]

You've just contradicted yourself, since @btickler surely disagrees with this.

Yep.

MARattigan

@tygxc

#1858 "Tickler's post is wrong on other counts as adhering to 10^44...found legal by Tromp..."

I have to concede @btickler didn't get it quite right in context.

The fact is, while you seem to be a little schizophrenic on the point, I think you are proposing to solve the competition rules game.

Tromp's bounds apply to the basic rules game.

In the competition rules game the relevant description of a position (and the one SF14 needs to function as designed) would be effectively a PGN starting with a FEN with a zero ply count (you can throw away venue, player names etc.).

To illustrate the difference, here I am playing myself in Arena, attempting to mate with king and rook against a lone king.

But I'm not doing too well, because after five moves I find myself right back where I started.

But help is at hand because I have a copy of Wilhelm and the 3-5 man Nalimov tablebases, so I can ask them.

Nalimov says I should play Rh3, so I go back into Arena and do just that.

Dang! I could have sworn there was a win there somewhere.

Of course I'm looking at the wrong tablebase, because Nalimov is a tablebase for the basic rules game and Arena, as arbiter, has decided I'm going to play (a flawed version of) the competition rules game. But if I make my moves on the syzygy-tables.info site instead of in Wilhelm, in this particular endgame the Syzygy tablebase, which is designed to handle the competition rules game, will always give the same moves as Nalimov anyway.

The upshot is, for the competition rules game, the number of legal positions in this endgame is not the 49,389 assigned by Tromp in the basic rules endgame, but 49,389 x 100 x 3⁴⁹³⁸⁹ⁿ  where the factor 100 accounts for the possible different ply counts and the last factor accounts for the number of times each diagram with the same material and side to play has already occurred (which can be 0, 1 or 2), n being a rather small fraction since many combinations will be impossible.

Positions with a repetition count of 2 will not occur in any weak solution and tablebase generation will avoid these. SF14 however, as you plan to use it, will happily walk into them and they therefore have to be (the major) part of your search space.

At any rate there may be more legal positions in this endgame alone in the competition rules game than there are in the whole of the basic rules game. I leave it to you to work out how many there are in the whole competition rules game.  
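A rough sketch of the arithmetic behind that estimate. The threshold for n computed below is my own illustration (how small a fraction n already pushes this one endgame past 10^44), not a figure from the post:

```python
import math

basic_rules_positions = 49_389   # Tromp's count for this endgame, basic rules
ply_count_factor = 100           # possible 50-move-rule ply counts
base = basic_rules_positions * ply_count_factor  # 4,938,900 before repetitions

# The dominant factor is 3**(49389 * n): a repetition count of 0, 1 or 2
# per diagram with the same material and side to play. How large must the
# fraction n be before this endgame alone exceeds ~10^44, the count for
# the whole basic-rules game?
n_threshold = (44 - math.log10(base)) / (basic_rules_positions * math.log10(3))
print(f"n > {n_threshold:.4f} suffices")
```

Even if only a fraction of a percent of the repetition combinations are possible, the competition-rules count for this single endgame dwarfs the basic-rules count for all of chess.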

 

DiogenesDue
tygxc wrote:

#1855
Tickler should know the difference between floating-point operations, integer operations and boolean operations. Solving chess requires not a single floating-point operation. Not 100, but 0.

Tickler's post is wrong on other counts too, such as its adherence to 10^44: all randomly sampled positions found legal by Tromp contain multiple underpromotions to pieces not previously captured, which cannot occur in a reasonable game with reasonable moves and thus not in an ideal game with optimal moves.

"Regarding taking the square root of the number"
Weakly solving chess requires considering fewer positions than strongly solving chess.
Losing Chess has been weakly solved by considering only 900,000,000 positions.

FLOPS is the standard measure of speed, regardless.  Boolean and integer operations are not used much as a measure of processing speed, much in the same way that nobody cares what speed a car can achieve in 1st and 2nd gear.

Go ahead and dig up some conversion rate for boolean and/or integer operations if you like...I don't have any need of doing more homework.  I'm betting the difference is an order of magnitude, a couple at best. 

"Nodes" is eminently fudge-able.  FLOPS is not.  

P.S. You gave up the ghost when you later said positions are "considered, not solved", really.  

DiogenesDue
MARattigan wrote:

@tygxc

#1858 "Tickler's post is wrong on other counts as adhering to 10^44...found legal by Tromp..."

I have to concede @btickler didn't get it quite right in context.

Tromp's number is currently the best one.  Any number that discards "excess" (as determined by human players' notions of chess) promotions, or arbitrates "reasonable positions" based on human-derived valuations of chess is garbage.  If anyone can prove illegality of more positions, go for it.  Until then, 10^44.5 is the number.  Before Tromp's study, it was 10^46.7.

snoozyman

https://www.youtube.com/watch?v=C7p4lM7bl3U

 

MARattigan
btickler wrote:
MARattigan wrote:

@tygxc

#1858 "Tickler's post is wrong on other counts as adhering to 10^44...found legal by Tromp..."

I have to concede @btickler didn't get it quite right in context.

Tromp's number is currently the best one.  Any number that discards "excess" (as determined by human players' notions of chess) promotions, or arbitrates "reasonable positions" based on human-derived valuations of chess is garbage.  If anyone can prove illegality of more positions, go for it.  Until then, 10^44.5 is the number.  Before Tromp's study, it was 10^46.7.

Sorry - disagree. Tromp's numbers are for the basic rules game. I think @tygxc wants the competition rules game.

See #1869.

But I agree that discarding positions on the grounds of "reasonableness" in either case is just feeble minded.

DiogenesDue

The checkmate in 545 moves is a good illustration of how far away humans and engines are from proving anything.  After 30 minutes on this already simplified 7-man position, Stockfish 14 has only reached a depth of 38 ply and evaluates this forced mate for black as a mere +0.27 advantage for white.

And you want to use Stockfish to bridge from 32 pieces to the 7 man tablebase?

MARattigan
btickler wrote:

The checkmate in 545 moves is a good illustration of how far away humans and engines are from proving anything.  After 30 minutes on this already simplified 7-man position, Stockfish 14 has only reached a depth of 38 ply and evaluates this forced mate for black as a mere +0.27 advantage for white.

And you want to use Stockfish to bridge from 32 pieces to the 7 man tablebase?

If SF14 plays it against itself, then by the time it reaches 38 ply I would guess the probability of it being in a theoretically White winning position, a theoretically Black winning position or a draw would be around 1/4, 1/4 and 1/2 respectively, but the actual outcome in any case would be a draw with high probability.

Elroch

Why would you believe that? Seems highly likely to be a draw to me. More so than another engine with slightly differing biases.