
Does True Randomness Actually Exist?
statistically, if you throw a die 12 million times it will fall 2 million times on each number, right?
No, actually not. It is likely that the number of times it comes up 6 (or any other number) is NEAR 2,000,000. It is quite unlikely to be more than a few thousand away from 2,000,000. You can work out exactly how unlikely. For illustration, I have done your experiment 1,000 times (with perfect digital dice) and here is a graph of the number of 6s.

You will see that it was usually within a couple of thousand of 2,000,000 sixes and very rarely more than 2,004,000 or less than 1,996,000. With a lot more than 1,000 runs you would get occasional results further than this from 2,000,000, with increasingly low probability. (The histogram counts results in ranges of about 1,000.)
The fact that the spread is quite small compared to the mean is related to the most important truth about randomness, the law of large numbers.
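For anyone who wants to reproduce the experiment, here is a minimal sketch (assuming Python with NumPy and Matplotlib; not necessarily the tool used for the graph above). Rather than simulating 12 billion individual rolls, it draws the number of sixes in each run of 12,000,000 rolls directly as a binomial sample, which is statistically equivalent.

```python
# Sketch: 1,000 runs, each counting sixes in 12,000,000 rolls of a fair die.
import numpy as np
import matplotlib.pyplot as plt

rolls_per_run = 12_000_000
runs = 1_000
p_six = 1 / 6

rng = np.random.default_rng()
# The number of sixes in one run is a Binomial(12_000_000, 1/6) draw,
# so we can sample the count directly instead of simulating every roll.
sixes = rng.binomial(rolls_per_run, p_six, size=runs)

mean = rolls_per_run * p_six                       # expected count: 2,000,000
sd = (rolls_per_run * p_six * (1 - p_six)) ** 0.5  # roughly 1,290 sixes

print(f"expected {mean:,.0f}, standard deviation about {sd:,.0f}")
print(f"observed min {sixes.min():,}, max {sixes.max():,}")

# Histogram with bins roughly 1,000 wide, as in the description above.
plt.hist(sixes, bins=np.arange(sixes.min(), sixes.max() + 1_000, 1_000))
plt.xlabel("number of sixes in 12,000,000 rolls")
plt.ylabel("runs")
plt.show()
```

The standard deviation works out to about 1,300 sixes, so almost every run lands within three or four thousand of 2,000,000, which is exactly what the graph and the law of large numbers are saying.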
so how exactly is this random? wouldn't you expect a random spread?
See above graph! Probability theory tells you how random it is.
and if random is just an illusion, does it mean that every game of chess is already determined before it even starts? consulting with google was surely not a random decision, lol. here are my findings:
1. Math and the art of describing randomness
it's cool that you did that. however, your chart shows an error margin of what? less than 1% or so? either way it's very close to 2M each.
what puzzles me is why you don't get a really random and chaotic spread? like 500K in one experiment and 5M in another? how come the margin is so small?
explaining this with the phenomena of the large numbers doesn't really help, because it's not a real explanation and amounts to asking me to accept the phenomenon without understanding the 'why' or the 'how' (reminds me of the old QM saying... 'shut up and calculate')
Yes, true randomness exists.
But the only place where true randomness is found is in quantum mechanics.
Basically, the state of a tiny, tiny particle changes randomly.
everyone keeps saying that. and it's ok with me. however, i ask you... what was it that convinced you that this is really the truth?
quote:
"Although quantum mechanics has held up to rigorous and extremely precise tests in an extraordinarily broad range of experiments (not one prediction from quantum mechanics is found to be contradicted by experiments), there exist a number of contending schools of thought over their interpretation. These views on interpretation differ on such fundamental questions as whether quantum mechanics is deterministic or random, which elements of quantum mechanics can be considered "real", and what is the nature of measurement, among other matters."
https://en.wikipedia.org/wiki/Interpretations_of_quantum_mechanics
quote: "what puzzles me is why you don't get a really random and chaotic spread? ... how come the margin is so small?"
Well, it comes down to counting the possibilities.
It is clearer with flipping a coin (because it has only 2 sides, not 6).
Flip a coin once and half the time you get 1 head, half the time 1 tail. Average fraction of heads is 1/2 but there is a lot of scatter. Every result is quite a way from the expected result of half a head!
Flip it twice and it is a bit different. 1/4 of the time you get 2 heads, 1/4 of the time 2 tails and 1/2 the time you get a head and a tail (in two different orders). The average proportion of heads remains 1/2, but now half the results are spot on (50/50) unlike with just one flip.
Now flip a coin 4 times. This gives 16 possible results (as sequences of heads and tails)
1/16: 4 heads; 1/16: 4 tails. These extreme results are becoming rare.
4/16: 3 heads and 1 tail; 4/16: 3 tails and 1 head. Here the proportion is 3/4 or 1/4 instead of a half, which is only half the maximum variation, so not the expected proportion, but less far from it.
6/16: 2 heads and 2 tails. The proportion is exactly 1/2 in this case.
Here the proportion of heads is getting more concentrated near the expected proportion.
The big idea here is that the larger the number of coins you toss, the more ways there are to get a result that is near to the expected proportion. Proportions a long way from 1/2 have a relatively small number of orders of heads and tails that can make them. It's a mathematical fact you can verify by counting.
So it all comes down to counting.
[After writing that quick account I looked for a better one. Here is a pretty good example from Sheffield University].
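If you want to check the counting argument rather than take it on trust, here is a small sketch of my own (assuming Python; it is not the Sheffield example). It enumerates every sequence of heads and tails for a few values of n, counts how many sequences give each number of heads, and shows how the proportion of heads concentrates near 1/2.

```python
# Sketch: count, for n flips, how many of the 2**n equally likely sequences
# produce each possible number of heads.
from itertools import product
from math import comb

for n in (1, 2, 4, 10):
    total = 2 ** n
    counts = [0] * (n + 1)
    for seq in product("HT", repeat=n):          # brute-force enumeration
        counts[seq.count("H")] += 1
    # The brute-force counts match the binomial coefficients C(n, k).
    assert counts == [comb(n, k) for k in range(n + 1)]
    near_half = sum(c for k, c in enumerate(counts)
                    if abs(k / n - 0.5) <= 0.1)  # proportion in [0.4, 0.6]
    print(f"n={n:2d}: counts={counts}, "
          f"sequences with proportion in [0.4, 0.6]: {near_half}/{total}")
```

Already at n = 10 about two thirds of all sequences give a proportion of heads between 0.4 and 0.6, and the concentration only gets sharper as n grows; that is the counting behind the narrow spread in the dice graph.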
the quintessential example illustrated with your dice roll is pseudorandom as it can depend on a good few other factors such as viscosity of the air, friction of the arbitrary surface, the gravitational force, etc.
so are you saying that it's more Bayesian-based? ... yet dice gambling overcomes this because casinos aren't going to enter into a proposition without getting the best side of the equation, right? it's very hard to bet against two dice rolling under 7 if it's already done it 5 straight times. though it's not very hard to argue for an over-7 coming out if you're a frequentist (which I personally feel is fraught with error... which creates the true blue separator).
oh well, what do I know? ... other than that I don't trust math theory.
and for all you coin spinners... try this one!
Spinning, rather than flipping, an old penny will land on tails something like 80 percent of the time. Lincoln's head is heavier than the Lincoln Memorial on the reverse, which leaves tails facing up more often than not.
Randomness isn't an object, it is an idea. It is also an idea with no bearing or authority on anything else. It is a noun in the eye of the beholder; it is absolutely dependent on perspective, and because of that it is, at best, a consideration.
Believing in randomness is very low elo.
....so true
evolutionists shout from their stump all things unintelligent design. yay!... you've just admitted to a designer!
does happiness exist? not in and of itself. it takes great restraint to be happy, to lose 20 chess games in a row and still be beaming from ear to ear.
I discern what you did there. By utilising the word ‘beam,’ you are alluding to the Afshar experiment, the quantum eraser experiment and a nimiety of other things which I shall not enumerate. Now we must relate these quantum mechanical delicacies to the intrinsic value of happiness, and long live the gracious grand unified theory of everything. General relativity would prove a great glory, as it does not transcend our daily perceptions all too much, yet it would be dwindling in popularity. Meanwhile, Aristotle would be yammering on about how the universe was not like clockwork and introducing the notion of pseudorandom interpretations from the people of his time. Philosophy would be almost a cliche in this hypothetical universe, leading a mass of people to instantaneously forget what they were digressing about, which is the case with me. Except philosophy is completely beyond me, and to illustrate that, I have devised a song.
philosophy is completely beyond me.
Quantum mechanical universe brings us such glee
For pessimists, it will Outline their lack of sagacity
But I am the one optimist who has attained complete stupidity
quote: "explaining this with the phenomena of the large numbers doesn't really help... (reminds me of the old QM saying... 'shut up and calculate')"
The ‘shut up and calculate’ phrase is so aurally pleasing to one’s ears; in fact, it damaged my cochlea so severely that I could merely hearken to Beethoven’s 7th Symphony, 2nd movement allegretto. I can just imagine Richard Feynman banging the bongo drums with the same beat as the musical masterpiece, in Copenhagen. His bongo drums would be inscribed all over with an untidy scrawl of drawings of roses, pristine sinusoid curves and mathematical equations. Perhaps there would even be a fine piece of evidence that he had been valiantly victorious in the Putnam math competition. But of course, this is merely to be taken into consideration if he were egotistical enough.
The phrase ‘phenomena of the large numbers’ reminds me of Dirac’s large numbers hypothesis. I sincerely apologise if someone has already mentioned this. It happens that Paul Dirac was obsessed with large dimensionless physical constants, and as such he had an influence on fine-tuning arguments about the universe and a multitude of theological and physical questions. After toying around with these constants, Dirac hypothesised that the gravitational constant varies over time. Toying around in the pure sciences is profoundly important, as many mathematical Olympiad contestants will know. By pure sciences, we establish mathematics as being at the top of this hierarchy, because of Gauss’ universally known quote that mathematics is the queen of the sciences. What is the king of the sciences? How about the bishop, the knight or the rook? Well, I discuss that partially in my profile.
Anyway, this note additionally reminds me of Benford’s law, something that data scientists are wrapping their prefrontal cortex around. It can be utilised as a means of teaching middle school or high school students about base-10 logarithms and their idiosyncratic but intriguing applications. Because my speech has already digressed, I will leave this link for the inquisitive reader:
https://brilliant.org/wiki/benfords-law/
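For the inquisitive reader, here is a minimal sketch of Benford's law (my own illustration, assuming Python; the choice of powers of 2 as the test sequence is just a convenient example). It compares the leading digits of the first 1,000 powers of 2 with the predicted frequencies log10(1 + 1/d).

```python
# Sketch: leading-digit frequencies of the first 1,000 powers of 2
# versus the Benford prediction P(d) = log10(1 + 1/d).
from collections import Counter
from math import log10

N = 1_000
leading = Counter(int(str(2 ** k)[0]) for k in range(1, N + 1))

print("digit  observed  Benford")
for d in range(1, 10):
    print(f"{d}      {leading[d] / N:.3f}     {log10(1 + 1 / d):.3f}")
```

Roughly 30 percent of the powers start with a 1 and only about 5 percent with a 9, closely matching the prediction.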
"lol.. beautiful mouth birdie, not bad on short notice."
...not a selfie
"I can just imagine Richard Feynman banging the bongo drums with the same beat as the musical masterpiece, in Copenhagen."
quote: "The big idea here is that the larger the number of coins you toss, the more ways there are to get a result that is near to the expected proportion. ... So it all comes down to counting."
actually this is an excellent answer with insight into the mechanics of randomness.
looking at randomness from a different angle now, it's basically just a 'choice' between several possibilities, and not to be speculative, the 'choice' may be either deterministic or chaotic (for lack of a better word).
that also explains my dilemma about the proportions, because if i go back to the coin flips, the 'choice' is governed by the laws of physics, and given both obverse and reverse are identical, it just makes sense for the proportions to be as well.
going back to a single particle... i can't think of any 'mechanism' that will cheat some laws of physics and make a chaotic choice.
look, determinism sux. no one wants it. well, maybe a convict does, but that's a different conversation. so maybe there's another way out of it without relying on true randomness. maybe human behavior, like was suggested here.. lol
quote: "Spinning, rather than flipping, an old penny will land on tails something like 80 percent of the time."
if you ever find a 1943 red penny, save it, because it's worth several hundred thousand dollars. (in '43 they used all the copper for the war and made pennies from a white metal instead)
"The phrase ‘phenomena of the large numbers’ reminds me of Dirac’s large numbers hypothesis."
any idea of the reasoning? It is mentioned here...
https://www.youtube.com/watch?v=j1dKvoa2ITw
true randomness doesn't exist in numbers. so you can throw theoretical math out the window when trying to explain nature. it'll just be a false positive.
they say that's why roulette is such a dangerous game for casinos. so they heavily limit bets placed directly on a number. it's all Norman Leigh's fault (France, 1966). lol!


statistically, if you throw a die 12 million times it will fall 2 million times on each number, right? so how exactly is this random? wouldn't you expect a random spread?
and if random is just an illusion, does it mean that every game of chess is already determined before it even starts? consulting with google was surely not a random decision, lol. here are my findings:
1. Math and the art of describing randomness
https://www.youtube.com/watch?v=j1dKvoa2ITw
2. When I’m bored I text a random number “I hid the body… now what”
3. I was talking to my friends and they said a random topic about cats and I’m like “Water you talking about”
4. Randomness is a reflection of our ignorance about the thing being observed, rather than something inherent to it.
I'm confused!
The vast majority of events that are prevalent in our lives can be envisioned as pseudorandom. For instance, the quintessential example illustrated with your dice roll is pseudorandom, as it can depend on a good few other factors such as the viscosity of the air, friction of the arbitrary surface, the gravitational force, etc. Obviously, this is true. If you persist in thinking about it, a dice roll is only “coerced” into some turmoil if it extends over an infinite period of time, which of course is impractical.