Chess will never be solved, here's why

Elroch

I provided a convincing intuitive argument why the square root of the number of positions may be insufficient in the likely event that chess is a draw. Chess has key differences from checkers that tend to increase the exponent above 0.5.
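A quick illustration of why that exponent matters, assuming Tromp's 10^44.5 estimate cited later in this thread (a sketch of my own, not from any post):

```python
# How the cost N**e changes with the exponent e, taking N = 10**44.5
# (Tromp's estimate of the number of legal chess positions).
for e in (0.5, 0.55, 0.6):
    print(f"e = {e}: about 10^{44.5 * e:.2f} positions to consider")
```

Even a small increase in the exponent above 0.5 adds several orders of magnitude to the count.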

MARattigan
tygxc wrote:

#1819
It is rather otherwise:
Chess solving financier: how many chess positions can your cloud engine consider per second?
Expert: our computer can do floating point operations quite fast
Chess solving financier: solving chess does not need any floating point operations so I do not care how many floating point operations it can do.

Or otherwise still:

Chess solving financier: What's the difference between a diagram with no excess promotions and a position?

Expert: Haven't got the foggiest.


The Expert takes a Walk. A one-way walk.

llama51
Elroch wrote:

I provided a convincing intuitive argument why the square root of the number of positions may be insufficient in the likely event that chess is a draw. Chess has key differences from checkers that tend to increase the exponent above 0.5.

Nice.

DiogenesDue

FLOPS (that is, Floating Point Operations Per Second) is the better measure, but any "financier" worth his salt would demand both numbers with supporting documentation. Number of chess positions evaluated per second is highly variable, and any snake oil salesman could and would present the most "simple" set of nodes as representative of the whole, while also pushing a limited and ultimately threadbare "evaluation" function... between these 2 fudges, you could knock off an order of magnitude. Which is minuscule in terms of solving chess, but enormous in terms of cost/benefit analysis.
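As a quick illustration of that variability - a minimal sketch using the python-chess library (my choice of tool, the thread names none), counting raw move-generation nodes from two very different positions. Even this crude "nodes" figure depends heavily on which position you pick:

```python
import time
import chess  # pip install python-chess

def perft(board: chess.Board, depth: int) -> int:
    """Count move-generation nodes to a fixed depth."""
    if depth == 0:
        return 1
    total = 0
    for move in board.legal_moves:
        board.push(move)
        total += perft(board, depth - 1)
        board.pop()
    return total

# The opening position vs. a bare K+R vs K ending: the node counts differ wildly.
for fen in (chess.STARTING_FEN, "8/8/8/8/8/4k3/8/4K2R w K - 0 1"):
    board = chess.Board(fen)
    t0 = time.perf_counter()
    nodes = perft(board, 4)
    dt = time.perf_counter() - t0
    print(f"{fen[:20]}...: {nodes} nodes in {dt:.2f}s = {nodes / dt:,.0f} nodes/s")
```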

Today, the fastest supercomputer is 442 PetaFLOPS. It costs an estimated $1 billion. Traversing the tree of chess positions and picking the next one, evaluating the position fully, and classifying/storing the result is *at best* in the 10s of operations, but much more likely in the low hundreds of operations. If you assume that fully evaluating a chess position takes 100 operations, then:

31,536,000 seconds in a year

442 PetaFLOPS = 442000000000000000 operations/second

442000000000000000 / 31,536,000 = 14015728056.8 operations/year

14015728056.8 operations/year / 100 operations per position evaluation = 140157280 positions/year

10^44.5 positions to be evaluated (Tromp's number, best estimate currently available)

10^44.5 positions / 140157280 positions/year = 2.256235^36 years for the fastest supercomputer to weakly solve chess (almost guaranteed)

80 trillion = estimated worldwide wealth

$80 trillion / $1 billion = 80,000 supercomputers

2.8202937^31 years for 80,000 supercomputers to weakly solve chess (almost guaranteed)

Square root of 10^44.5 = 1.7782794^22

1.7782794^22 / 140157280 = 1.2687742^14 years for the fastest supercomputer to weakly solve chess (if the square root premise is proven to be sufficient, which is not currently the case)

1,585,967,750 years for 80,000 supercomputers to weakly solve chess (if the square root premise is proven to be sufficient, which is not currently the case, and if the entire human race is left to perish and die from lack of resources due to the whole world's wealth being spent on the effort)

The *only* way Tygxc's premise works is to chop another 9 orders of magnitude off, *then* take the square root, *then* equate apples to oranges by implying that Stockfish/Sesse positions/second is equivalent to fully evaluating the position at a "tablebase solid" level of precision.
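For anyone wanting to check the later steps, here is a short Python sketch that takes the post's figure of ~1.4x10^8 positions evaluated per year per machine as given and reproduces the headline results from there (variable names are mine):

```python
# Takes the post's figure of ~1.4e8 positions evaluated per year per machine
# as given, and reproduces the headline numbers from there.
POSITIONS_PER_YEAR = 140_157_280                # per supercomputer (from the post)
TROMP = 10 ** 44.5                              # ~3.16e44 legal positions (Tromp)
MACHINES = 80_000                               # $80 trillion / $1 billion each

full = TROMP / POSITIONS_PER_YEAR               # all positions: ~2.256e36 years
sqrt_case = TROMP ** 0.5 / POSITIONS_PER_YEAR   # square-root case: ~1.269e14 years

print(f"full search: {full:.3e} years alone, {full / MACHINES:.3e} with 80,000 machines")
print(f"sqrt search: {sqrt_case:.3e} years alone, {sqrt_case / MACHINES:.3e} with 80,000 machines")
```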

Elroch
Optimissed wrote:

You seem to be gradually arriving at what I was pointing out 1000 posts ago. Well done! Getting there fast!

I still think that the best way is games and not positions, 

[snip]

You've just contradicted yourself, since @btickler surely disagrees with this.

playerafar


Regarding @btickler's good post just now ...
I don't mean to nitpick - but just a technical point ...

"1.7782794^22 / 140157280 = 1.2687742^14"

didn't our learned participant mean 10^22 and 10^14 years ...
were the 10's left out?
If so - it's just a typo ... but best to have them in ...
Yes?

playerafar


"Its Chess !"



"It ain't Checkers !"
(and its a picture - not a video link.  I didn't want the link - too many explicit terms)

tygxc

#1834
"I provided a convincing intuitive argument why the square root of the number of positions may be insufficient in the likely event that chess is a draw. Chess has key differences from checkers that tend to increase the exponent above 0.5."
++ That is all very well. Losing Chess has been weakly solved considering 900,000,000 positions.
How many positions do you think need to be considered to weakly solve chess?

tygxc

#1837
"10^44.5 positions to be evaluated (Tromp's number, best estimate currently available)"
I have refuted this many times.

All positions randomly sampled by Tromp and found legal contain multiple underpromotions to pieces not previously captured. These positions can never occur in a reasonable game with reasonable moves and thus not in an ideal game with optimal moves. Hence the Tromp count 10^44 is much too high for the purpose of assessing the feasibility of solving chess. Gourion's count 10^37 is better.
You can up this to 10^38 to account for some reasonable promotions to 3 or 4 queens.
For every position with black to move there is a diagram with white to move by up/down symmetry.
For every diagram with lost castling rights there is a left/right symmetrical diagram.
Then it has to be reduced by a factor of about 10^6, as 10^4 randomly sampled positions without promotions to pieces not previously captured cannot occur in reasonable games with reasonable moves either.

Weakly solving chess requires considering fewer positions than strongly solving chess.

Solving chess does not require one single floating point operation. FLOPS is the wrong metric and nodes/s is the right metric for the purpose of weakly solving chess.
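Multiplying out the factors listed above, exactly as stated (whether each factor is justified is what the rest of the thread disputes), gives roughly this - a minimal Python sketch:

```python
# Multiplies out the reduction factors as listed above; whether each factor
# is justified is exactly what the rest of the thread disputes.
gourion = 10 ** 37                    # Gourion: no promotions to uncaptured pieces
with_promotions = gourion * 10        # "up this to 10^38" for 3 or 4 queens
after_updown = with_promotions / 2    # black-to-move mirrored to white-to-move
after_leftright = after_updown / 2    # left/right symmetry with castling rights lost
after_sampling = after_leftright / 10 ** 6   # "cannot occur in reasonable games"
print(f"~{after_sampling:.1e} positions")    # ~2.5e+31
```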


playerafar


Not much point talking about checkers. It's not chess. See post #1841.
Not much point talking about 'nodes' either. That's contrived.
You may as well say - "we can repeatedly 'solve chess' every five years."
And 'nodes is our metric because we say so'.
If the intent is to talk about how many positions a supercomputer could thoroughly solve - that's going to be tough.
Because the opening position is just one position - and it's not 'solved'.
Want to reduce the pieces down to seven?
That's not even 25% of the possible pieces - and they struggle with that.
8 pieces - more possible positions there than seconds in anybody's lifetime - and how long would that take?

Want to imagine chess being solved ?
Just say "our supercomputers can do a trillion trillion 'nodes' per second and solve chess and we have a 2-part proof of that ...
1)  "they can do a trillion trillion nodes per second because we don't care about petaflops or Hertz nor how fast nor slow the computer is."
2)  "they can solve that fast - because we say so and we say so because somebody on a website said so."

playerafar


Now maybe this next was said already -
and if I missed it - I won't accept blame, because there are a lot of posts in this forum.
If the total number of possible legal chess positions with 7 pieces has been counted up by the computers (not estimated - counted up) -
and these have all been accurately pronounced as 'solved' (barring the castling issue, which looks very fishy indeed next to that "10^9 nodes per second" - but let's say we let that slide) -
and we then take that total number of positions and divide it by the number of seconds (which add up to years) that it took to get from 6 pieces to 7 pieces, and we get '10^9 positions per second' (note they only have 7 pieces) - then the whole thing would have some credibility??

It wouldn't take long for anybody both watching and thinking to think of ...
"Why exactly 10^9 positions per second? Why would it be exactly that, and bang on the exact exponent too? Why wouldn't it vary? Why wouldn't it slow down as more pieces are added?"
Does one know Snake Oil when it's smelled?

Can you solve the opening position in chess at 'one position (node) a second'?
If you can't solve one position a second -
how are you going to 'solve' 10^9 positions a second?

sakkmarton

this is spammmmmm...

tygxc

#1848
Please read the link I provided above. It gives a lot of information on nodes/s.
"Nodes per second (NPS) is the main measure of chess engines’ speed. It indicates the number of positions the engine considers in one second."
It says 'considers', not 'solves'.
It says cloud engines reach more than 10^9 nodes / second, not exactly 10^9.

playerafar
sakkmarton wrote:

this is spammmmmm...

Hi !
This "10^9 nodes per second" thing has been repeated a zillion times over and over in the forum ...
and however much whoever might want people to read through his website and material ...
it needs to be thoroughly qualified and put in its place - or more likely ...
Refuted.  And conceded to be Invalid.

For now - I've gotten a Concession. Again.
The concession is that it's 'considering' positions.

If 'nodes' means positions - then why say 'nodes' ?
And how on earth would 'consider' have value if you're not solving?

Suggestion - don't keep repeating over and over that because of '10^9 nodes (or positions) per second' chess might be 'weakly solved' in five years.

The number of 'considers' per second wouldn't seem to mean anything substantial at all.
That being true - the number of 'positions' per second and the number of 'nodes' per second would mean even less.

I think there were some admonishments about this in the very early going in the forum.
A poster tried to tell another poster to stop pushing misleading information on the internet.

tygxc

#1848
"If 'nodes' means positions - then why say 'nodes' ?"
"A node, in turn, is a chess position with its evaluation and history, i.e. castling rights, repetition of moves, move turn, etc."
That is not my website.
It is good information.

Elroch

That is a position. It includes everything relevant to the future of the game. A game theorist would refer to it as a state.

It's called a node (eg by Stockfish) because in analysis you have a tree of nodes, each of which is a position. The tree stores the relationships between nodes (the moves between them).
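A minimal sketch of that node/tree idea in Python (all names are hypothetical, not any engine's actual internals):

```python
from dataclasses import dataclass, field

# A sketch of what "node" means here: a position (game state) plus everything
# relevant to the future of the game, linked into a tree by moves.
@dataclass
class Node:
    position: str        # e.g. a FEN string: piece placement, side to move,
                         # castling rights, en passant square, move counters
    evaluation: float | None = None              # engine score, once computed
    children: dict[str, "Node"] = field(default_factory=dict)  # move -> child
```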

playerafar
MARattigan wrote:
tygxc wrote:

#1787
No: read number of positions = time in nanoseconds
When the engine considers 10^9 positions in one second, then it takes 10^-9 seconds = 1 nanosecond per position it considers

The engine generates 10^9 positions  in one second without considering them. Considering needs a little longer.

 

tygxc

#1853
No:
"Nodes per second (NPS) is the main measure of chess engines’ speed. It indicates the number of positions the engine considers in one second."
Please see the informative reference link.
It even explains the tradeoff between a simple evaluation and deep search (superior) and a sophisticated evaluation and shallow search (inferior).

playerafar
btickler wrote:

FLOPS (that is, Floating Point Operations Per Second) is the better measure, but any "financier" worth his salt would demand both numbers with supporting documentation. Number of chess positions evaluated per second is highly variable, and any snake oil salesman could and would present the most "simple" set of nodes as representative of the whole, while also pushing a limited and ultimately threadbare "evaluation" function... between these 2 fudges, you could knock off an order of magnitude. Which is minuscule in terms of solving chess, but enormous in terms of cost/benefit analysis.

Today, the fastest supercomputer is 442 PetaFLOPS. It costs an estimated $1 billion. Traversing the tree of chess positions and picking the next one, evaluating the position fully, and classifying/storing the result is *at best* in the 10s of operations, but much more likely in the low hundreds of operations. If you assume that fully evaluating a chess position takes 100 operations, then:

[snip]

Now there's the 'real' post about this ...
1)  "but any "financier" worth his salt would demand both numbers with supporting documentation. "
2)  "Number of chess positions evaluated per second is highly variable,"
"highly variable"
See it? And this time it's 'evaluated'.
Forum topic: 'solved' 
earlier several times  '10^9 nodes per second computer'
then grudgingly 'considered' 
now we've got 'evaluated'
qualified by 'ultimately threadbare "evaluation" function'   
that's got truth/accuracy/correctness/realism written all over it 
it's putting a very nasty interference into 'nodes per second'

A killer though - one that puts a torpedo into 'nodes per second' - is 'variable'.
Another is - consider the first opening position - how many 'nodes' per second is that position being 'considered'? What is 'considered' anyway?

Regarding taking the square root of the number - that's been discussed many times here. It's quite a cutdown.
The bigger the number you're looking at - the more it cuts down the number.
Which begins to discredit such a policy.
It implies that the more pieces you add to the position - the higher the ratio of irrelevant positions? But look at how much higher!
Try considering that for a second ... no pun intended.

If there's a million positions being considered - then the square root would mean 1000 positions. 999,000 positions knocked out ... by formula.
Only one thousandth of the positions left as 'the answer'.
But if we had a trillion positions and take the square root - a million ...
then we've only got a millionth of the positions left?
Mathematically - yes.
But is that valid for 'solving' purposes? Even 'weak solving'?
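A quick Python check of that shrinking fraction (illustrative only; the 10^44.5 case uses Tromp's estimate from earlier in the thread):

```python
# The fraction kept by a square-root cut is N**0.5 / N = N**-0.5,
# so it shrinks as N grows - the point made above.
for n in (10 ** 6, 10 ** 12, 10 ** 44.5):
    print(f"N = {n:.0e}: keep {n ** 0.5:.1e}, i.e. 1 in every {n ** 0.5:.1e}")
```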

Was @btickler 'generous' in saying a position could be 'fully evaluated' with 100 Ops? (there might now be a lot of yelling from somebody else about 'floating point' Ops ...)
If a computer didn't have basic units of speed - if it didn't have discrete 'ops' it's doing ... how would anybody ever build it? How would it even be designed? How could you sell the computer to anybody?
If it's not doing ops then what is it doing?
Would you say "We don't know how fast or slow it is and we don't care. Try it yourself and find out."

MARattigan
tygxc wrote:

#1853
No:
"Nodes per second (NPS) is the main measure of chess engines’ speed. It indicates the number of positions the engine considers in one second."...

Then the figure is never 10^9 for any engine / hardware combination.