Incidentally, 2. Ba6 losing doesn't consist of quantitative knowledge. That's where you're going wrong. It's purely qualitative, since it loses. It doesn't "probably lose". There are situations which we cannot easily understand, but this isn't one of them, and your mistake seems to be a wish to apply the same formula to all situations, including those where it's inappropriate, "just to be safe". Really that isn't epistemological uncertainty but probably emotional uncertainty. If something causes you to invest your beliefs very heavily in that kind of doctrinaire assessment, it probably isn't something you can easily overcome!
"it loses. It doesn't probably lose"
There are two senses in which a position can lose: empirically and analytically.
Empirically, Ba6 loses (or... probably loses? it's the same thing). You could run billions of high-level engine games and you probably wouldn't even get one draw. The only reason anyone wins or draws with Ba6 is that their opponent makes serious blunders, mostly at very low Elo.
Analytically, we don't know. Nobody has convincingly solved chess for Ba6. No amount of empirical evidence will show that Ba6 is losing unless it constitutes an exhaustive search. In all likelihood it is losing, but we do not have certainty either way.
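The gap between empirical and analytical knowledge can be made concrete with a small Bayesian sketch. The prior and the per-game draw probability below are illustrative assumptions, not measured values; the point is only that no finite number of lost engine games pushes the posterior for "Ba6 is an analytical loss" all the way to 1.

```python
def posterior_loss(prior: float, eps: float, n_games: int) -> float:
    """Posterior probability that the position is an analytical loss,
    after observing n_games engine games that were all lost.

    prior: prior belief that the position is an analytical loss.
    eps:   if the position is NOT an analytical loss, assume each game
           still ends lost with probability (1 - eps), i.e. a draw or
           win shows up with tiny probability eps per game.
    Both numbers are illustrative assumptions for the sketch.
    """
    # Likelihood of "every game lost" under each hypothesis:
    #   loss hypothesis:     probability 1
    #   not-loss hypothesis: (1 - eps) ** n_games
    like_loss = 1.0
    like_not = (1.0 - eps) ** n_games
    return prior * like_loss / (prior * like_loss + (1 - prior) * like_not)

p = posterior_loss(prior=0.5, eps=1e-8, n_games=2 * 10**9)
print(p)  # close to 1, but strictly less than 1
```

However many games you pile on, the posterior only approaches 1 asymptotically; only an exhaustive search collapses the question from belief to fact.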
Given that the title of the thread is about solving chess, the context here is that we are talking about whether a position is an analytical loss, not an empirical one.
<<It is also true of the quantification of belief - this just happens to be less familiar to many people.>>
More specifically, it ought to be clear to you that this thing, "the quantification of belief", is an artificial device which has been invented in order to try to make it look as if computers can resemble the human mind.
Nothing to do with computers.
Bayesian probability - the quantification of belief - predates computers, and as I explained, computers don't change it any more than calculators changed arithmetic.
You write:
<<<<<The error that the proverbial "man in the street" would surely make is one that is pragmatically fine for all normal purposes. This is to treat all small probabilities as zero. It's perfectly reasonable to (literally) bet your life on something with very low probability not happening. But some of us understand that it is quantitatively enormously wrong (in a way that those familiar with how to quantify how wrong a belief is can see).
To illustrate that last point, suppose someone takes the view that an event that happens 1 in a trillion times is literally impossible. This would imply that they would be willing to stake an unlimited amount against any return on this being so. And they would be willing to do this an unlimited number of times. That is what certainty means quantitatively.>>>>>
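The quoted betting argument can be made numerical. The sketch below (all figures illustrative) shows two things: someone who assigns probability literally 0 to an event sees any bet against it as favorable, however absurd the stakes; and under a logarithmic scoring rule, the penalty for assigning probability 0 to an event that then occurs is infinite, whereas assigning 1e-12 costs a large but finite amount.

```python
import math

TRUE_P = 1e-12  # assumed true probability of the "impossible" event

def expected_profit(stake: float, payout: float, believed_p: float) -> float:
    """Expected profit of staking `stake` that the event does NOT happen,
    winning `payout` if it doesn't, losing `stake` if it does,
    evaluated under the bettor's own believed probability."""
    return (1 - believed_p) * payout - believed_p * stake

# A believer in probability 0 sees ANY such bet as favorable,
# no matter how large the stake or how tiny the payout:
print(expected_profit(stake=1e15, payout=0.01, believed_p=0.0))

# Evaluated under TRUE_P instead, the same bet is hugely negative:
print(expected_profit(stake=1e15, payout=0.01, believed_p=TRUE_P))

def log_loss(assigned_p: float) -> float:
    """Log-score penalty if the event occurs after being assigned
    probability assigned_p; probability 0 yields an infinite penalty."""
    return math.inf if assigned_p == 0 else -math.log(assigned_p)

print(log_loss(1e-12))  # large but finite
print(log_loss(0.0))    # inf: the quantitative cost of false certainty
```

That infinite penalty is the precise sense in which "it is impossible" differs from "it has probability 1e-12", even though the two are indistinguishable for all normal purposes.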
Your error is to treat something that is either 0 or 1 as 0.99999999999999.
All uncertain boolean quantities are either 0 or 1. What is not 0 or 1 is the appropriate degree of belief about the quantity. This is not a difficult point.
Yes, one person can be inappropriately certain while another is not. That is because the certain person's reasoning is incorrect (they likely don't even think about the precise sequence of steps that has led to their belief, even when their attention is drawn to those steps).