How to detect cheaters in OTB tournaments

plutonia
Irontiger wrote:
plutonia wrote:

And by the way, you can buy a handheld metal detector off amazon for 20 pounds.

But then you have to screen your 1200 players before round 1. And obviously, with trained staff. How do you do that fast?

Fair enough for the rest - but group cheating remains an issue if outsiders can access the playing room.


No, this isn't an airport where you need to screen everybody before they get on the plane.

The only way to use an electronic device is in the restroom, so you just need 1 arbiter to screen players as they go to the toilet.


There is also no need to screen everybody. They could screen just the players who arouse suspicion, or a random sample.


I'll say it again: the simple solution is to forbid players from having an electronic device with them. Cheating is clearly tolerated if they allow players to go to the restroom with a smartphone. Heck, how can they even expect people *not* to cheat if they make it so easy?

zBorris

I just read that Borislav Ivanov was banned from his chess federation based on an analysis of the moves he made in the 9 games he played at the 19th A Open, where he was strip-searched on suspicion of cheating. I know that some games or positions can involve a lot of forcing moves that will produce higher agreement, or a lot of positional moves that will produce lower agreement but without any serious errors. So I was curious how close his agreement with Houdini was relative to his opponents', and how many errors he made in the process.

I set it up so that, no matter what type of opening was played, his scores would be read against his opponents' agreement and errors, not judged simply on raw computer agreement.

I ran Houdini 1.5, which is a free download, for at least 30 seconds per move but not more than 60 seconds, at a depth of 18 ply. I used a shareware trial program called ChessAnalyse 2.6 that can be downloaded for free through Chess.com's list of downloads. I set the output to include whether or not the move the player chose was among Houdini's top 3 candidates. If an error was made, I asked for a report that let me see how large the error was. For example, in some positional games you might choose a move that isn't among Houdini's top 3 candidates, but only because the position offers more than 3 candidates, each of which is a near equivalent of the engine's choice. So it's good to determine whether a player made a "style choice" or a mistake.
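
For anyone who wants to try the same kind of check without ChessAnalyse, here is a minimal sketch in Python using the python-chess library with any UCI engine. The ./stockfish path, the games.pgn filename, and the agreement() function are my own illustrative assumptions, not part of ChessAnalyse; the limits roughly mirror the settings I used (depth 18, capped at 60 seconds per move).

# Rough sketch: top-1 / top-3 engine agreement for one player's moves.
# Assumes python-chess (pip install chess) and a UCI engine binary.
import chess
import chess.engine
import chess.pgn

ENGINE_PATH = "./stockfish"                    # assumed engine binary path
LIMIT = chess.engine.Limit(depth=18, time=60)  # stop at depth 18 or 60 s

def agreement(pgn_path: str, color: chess.Color):
    """Fraction of `color`'s moves that match the engine's top 1 / top 3."""
    top1 = top3 = total = 0
    with open(pgn_path) as f, \
            chess.engine.SimpleEngine.popen_uci(ENGINE_PATH) as engine:
        game = chess.pgn.read_game(f)
        board = game.board()
        for move in game.mainline_moves():
            if board.turn == color:
                # MultiPV 3 asks the engine for its three best lines here.
                infos = engine.analyse(board, LIMIT, multipv=3)
                candidates = [info["pv"][0] for info in infos]
                top1 += move == candidates[0]
                top3 += move in candidates
                total += 1
            board.push(move)
    return top1 / total, top3 / total

if __name__ == "__main__":
    t1, t3 = agreement("games.pgn", chess.WHITE)  # assumed input file
    print(f"Top-1 agreement: {t1:.1%}   Top-3 agreement: {t3:.1%}")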

What I found is that not only did Borislav Ivanov have an extremely high engine agreement - significantly higher than his opponents' - but he also made virtually no errors even in moves that did not match the engine's preference.

  • The average Elo of his opponents was 2578.4.
  • Their average top-choice agreement was 48.5%.
  • Their average top-3 agreement was 83.4%.
  • They made a total of 29 errors across the 9 games.


  • Ivanov's Elo was 2227.
  • His average top-choice agreement was 66.72%.
  • His average top-3 agreement was 93.84%.
  • He made 8 errors across the 9 games, with 4 "perfect" (error-free) games.

"Errors" means mistakes: he or his opponents were allowed to pick a candidate outside the engine's top 3 choices as long as it didn't significantly change the numerical assessment of the position - the kind of thing that happens in closed, positional games.
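
Expressed as code, that rule comes out to something like the sketch below; the 50-centipawn cutoff for "significant" is an assumed value for illustration, since I'm not quoting an exact threshold.

# Sketch of the error rule: a move outside the engine's top 3 is only a
# mistake if it meaningfully worsens the evaluation.
SIGNIFICANT_CP = 50  # assumed cutoff in centipawns, for illustration only

def classify_move(in_top3: bool, best_cp: int, played_cp: int) -> str:
    """Label one move given engine evals from the mover's point of view."""
    if in_top3:
        return "agreement"       # matched one of the engine's top 3
    if best_cp - played_cp <= SIGNIFICANT_CP:
        return "style choice"    # near-equivalent move outside the top 3
    return "error"               # genuinely worsens the position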


Opponents' games:

Game   Opp. Elo   Top-1 (%)   Top-3 (%)   Errors
1      2426       41.2        73.5        4
2      2583       60.6        94.9        7
3      2565       41.2        76.5        3
4      2561       58.8        91.2        0
5      2569       45.2        80.6        7
6      2638       44.1        79.4        2
7      2638       50.0        83.3        1
8      2600       50.0        91.2        2
9      2626       45.0        80.0        3
Avg    2578.4     48.5        83.4        29 (total)

Ivanov's games:

Game   Elo        Top-1 (%)   Top-3 (%)   Errors
1      2227       68.6        100.0       3
2      2227       67.7        99.0        1
3      2227       85.7        97.1        0
4      2227       63.3        87.9        0
5      2227       68.8        96.9        0
6      2227       67.6        94.1        0
7      2227       65.9        92.7        1
8      2227       52.9        79.4        2
9      2227       60.0        97.5        1
Avg    2227       66.72       93.84       8 (total)
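
As a sanity check, the summary rows can be recomputed from the per-game figures with a few lines of Python, assuming plain unweighted means over the 9 games and a straight sum for the errors:

# Recompute the averages above from the per-game data (games 1-9 in order).
opp = [  # (Elo, top-1 %, top-3 %, errors)
    (2426, 41.2, 73.5, 4), (2583, 60.6, 94.9, 7), (2565, 41.2, 76.5, 3),
    (2561, 58.8, 91.2, 0), (2569, 45.2, 80.6, 7), (2638, 44.1, 79.4, 2),
    (2638, 50.0, 83.3, 1), (2600, 50.0, 91.2, 2), (2626, 45.0, 80.0, 3),
]
ivanov = [  # (top-1 %, top-3 %, errors); Elo is 2227 in every game
    (68.6, 100.0, 3), (67.7, 99.0, 1), (85.7, 97.1, 0),
    (63.3, 87.9, 0), (68.8, 96.9, 0), (67.6, 94.1, 0),
    (65.9, 92.7, 1), (52.9, 79.4, 2), (60.0, 97.5, 1),
]

n = len(opp)
print(f"Opponents: Elo {sum(r[0] for r in opp)/n:.1f}, "
      f"top-1 {sum(r[1] for r in opp)/n:.1f}%, "
      f"top-3 {sum(r[2] for r in opp)/n:.1f}%, "
      f"errors {sum(r[3] for r in opp)}")        # 2578.4, 48.5%, 83.4%, 29
print(f"Ivanov:    top-1 {sum(r[0] for r in ivanov)/n:.2f}%, "
      f"top-3 {sum(r[1] for r in ivanov)/n:.2f}%, "
      f"errors {sum(r[2] for r in ivanov)}")     # 66.72%, 93.84%, 8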