Does True Randomness Actually Exist? ( ^&*#^%$&#% )

IJELLYBEANS
twosugarspleasebono wrote:

does happiness exist? not in and of itself. it takes great restraint to be happy: lose 20 chess games in a row and still be beaming from ear to ear.

 

I discern what you did there. By utilising the word ‘beam,’ you are alluding to the Afshar experiment, the delayed-choice quantum eraser experiment and a nimiety of other things which I shall not enumerate. Now we must relate these quantum mechanical delicacies to the intrinsic value of happiness, and long live the gracious grand unified theory of everything. General relativity would prove to be a great glory, as it does not transcend our daily perceptions all too much, yet it would be dwindling in popularity. Meanwhile, Aristotle would be yammering on about how the universe was not like clockwork, and introducing the notion of pseudorandom interpretations from the people of his time. Philosophy would be almost a cliché in this hypothetical universe, leading a mass of people to instantaneously forget what they were digressing about, which is the case with me. Except philosophy is completely beyond me, and to illustrate that, I have devised a song.
philosophy is completely beyond me.
Quantum mechanical universe brings us such glee

For pessimists, it will outline their lack of sagacity

But I am the one optimist who has attained complete stupidity

 

IJELLYBEANS
Uke8 wrote:
Elroch wrote:
Uke8 wrote:

statistically, if you throw a die 12 million times it will fall 2 million times on each number, right?

No, actually not. It is likely that the number of times it comes up 6 (or any other number) is NEAR 2,000,000. It is quite unlikely to be more than a few thousand away from 2,000,000. You can work out exactly how unlikely. For illustration, I have done your experiment 1,000 times (with perfect digital dice) and here is a graph of the number of 6s

 

You will see that it was usually within a couple of thousand of 2,000,000 sixes and very rarely more than 2,004,000 or less than 1,996,000. With a lot more than 1,000 runs you would get occasional runs getting results further than this from 2,000,000, with increasingly low probability. (The histogram counts results in ranges of about 1,000)

The fact that the spread is quite small compared to the mean is related to the most important truth about randomness, the law of large numbers.
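Elroch's experiment can be sketched in a few lines of standard-library Python. This is scaled down (6,000 rolls per run instead of 12 million, 100 runs instead of 1,000) so it runs in a moment; the relative spread is wider at this smaller scale, but the same concentration around the expected count shows up.

```python
import random

random.seed(1)
N_ROLLS = 6_000   # scaled down from 12 million so it runs quickly
N_RUNS = 100

# For each run, count how many times a fair die shows 6.
counts = []
for _ in range(N_RUNS):
    sixes = sum(1 for _ in range(N_ROLLS) if random.randint(1, 6) == 6)
    counts.append(sixes)

mean = sum(counts) / N_RUNS
expected = N_ROLLS / 6                        # 1000
sd = (N_ROLLS * (1 / 6) * (5 / 6)) ** 0.5     # theoretical spread, about 28.9
print(f"mean {mean:.0f}, expected {expected:.0f}, "
      f"min {min(counts)}, max {max(counts)}")
```

The theoretical standard deviation grows only like the square root of the number of rolls, which is why the counts hug the expected value so tightly at large scales.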

so how exactly is this random? wouldn't you expect a random spread?

See above graph! Probability theory tells you how random it is.

and if random is just an illusion, does it mean that every game of chess is already determined before it even starts? consulting with google was surely not a random decision, lol. here are my findings:

1. Math and the art of describing randomness


it's cool that you did that. however, your chart shows an error margin of what? less than 1% or so? either way it's very close to 2M each.
what puzzles me is why you don't get a really random and chaotic spread, like 500K in one experiment and 5M in another. how come the margin is so small?
explaining this with the phenomena of the large numbers doesn't really help, because it's not a real explanation; it amounts to asking me to accept the phenomenon without understanding the 'Why' or the 'How' (reminds me of the old QM saying... 'shut up and calculate')


The ‘shut up and calculate’ phrase is so aurally pleasing; in fact, it damaged my cochlea so severely that I could merely hearken to Beethoven’s 7th Symphony, 2nd movement allegretto. I can just imagine Richard Feynman banging the bongo drums with the same beat as the musical masterpiece, in Copenhagen. His bongo drums would be inscribed all over with an untidy scrawl of drawings of roses, pristine sinusoid curves and mathematical equations. Perhaps there would even be a fine piece of evidence that he had been valiantly victorious in the Putnam math competition. But of course, this is merely to be taken into consideration if he were egotistical enough.

 

IJELLYBEANS
DifferentialGalois wrote:
Uke8 wrote:
Elroch wrote:
Uke8 wrote:

statistically, if you throw a die 12 million times it will fall 2 million times on each number, right?

No, actually not. It is likely that the number of times it comes up 6 (or any other number) is NEAR 2,000,000. It is quite unlikely to be more than a few thousand away from 2,000,000. You can work out exactly how unlikely. For illustration, I have done your experiment 1,000 times (with perfect digital dice) and here is a graph of the number of 6s

 

You will see that it was usually within a couple of thousand of 2,000,000 sixes and very rarely more than 2,004,000 or less than 1,996,000. With a lot more than 1,000 runs you would get occasional runs getting results further than this from 2,000,000, with increasingly low probability. (The histogram counts results in ranges of about 1,000)

The fact that the spread is quite small compared to the mean is related to the most important truth about randomness, the law of large numbers.

so how exactly is this random? wouldn't you expect a random spread?

See above graph! Probability theory tells you how random it is.

and if random is just an illusion, does it mean that every game of chess is already determined before it even starts? consulting with google was surely not a random decision, lol. here are my findings:

1. Math and the art of describing randomness


it's cool that you did that. however, your chart shows an error margin of what? less than 1% or so? either way it's very close to 2M each.
what puzzles me is why you don't get a really random and chaotic spread, like 500K in one experiment and 5M in another. how come the margin is so small?
explaining this with the phenomena of the large numbers doesn't really help, because it's not a real explanation; it amounts to asking me to accept the phenomenon without understanding the 'Why' or the 'How' (reminds me of the old QM saying... 'shut up and calculate')

 

The phrase ‘phenomena of the large numbers’ reminds me of Dirac’s large numbers hypothesis. I sincerely apologise if someone has already mentioned this. Paul Dirac was fascinated by large dimensionless physical constants, and his hypothesis went on to influence discussions of fine-tuning in the universe and a multitude of theological and physical questions. After toying around with these constants, Dirac conjectured that the gravitational constant varies over time. Toying around in the pure sciences is profoundly important, as many mathematical Olympiad contestants will know. By pure sciences, we establish mathematics as being at the top of the hierarchy, because of Gauss’ universally known quote that mathematics is the queen of the sciences. What is the king of all sciences? How about the bishop, the knight or the rook? Well, I discuss that partially in my profile.
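One of Dirac's large numbers can be checked directly: the ratio of the electrostatic to the gravitational attraction between a proton and an electron, a dimensionless number of order 10^39. A quick sketch using standard SI values of the constants:

```python
# Electrostatic-to-gravitational force ratio for a proton and an electron,
# one of Dirac's "large numbers" (~10**39).  SI values of the constants.
k = 8.9875e9       # Coulomb constant, N m^2 / C^2
e = 1.6022e-19     # elementary charge, C
G = 6.674e-11      # gravitational constant, N m^2 / kg^2
m_p = 1.6726e-27   # proton mass, kg
m_e = 9.1094e-31   # electron mass, kg

# Both forces fall off as 1/r^2, so the distance cancels in the ratio.
ratio = (k * e**2) / (G * m_p * m_e)
print(f"{ratio:.2e}")   # ~2.3e+39
```

Dirac's observation was that this number is curiously close to the age of the universe expressed in atomic time units, which led him to the varying-G conjecture mentioned above.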

Anyway, this note additionally reminds me of Benford’s law, something that data scientists are wrapping their prefrontal cortices around. It can be utilised as a means of teaching middle school or high school students about base-10 logarithms and their idiosyncratic but intriguing applications. Because my speech has already digressed, I will leave this link for the inquisitive reader:

https://brilliant.org/wiki/benfords-law/
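For a concrete taste of Benford's law: the leading digits of the powers of 2 follow it almost exactly, because the fractional parts of n·log10(2) are equidistributed. A short sketch:

```python
import math
from collections import Counter

N = 3000
# Leading (first) decimal digit of 2**n for n = 1..N.
digits = Counter(int(str(2 ** n)[0]) for n in range(1, N + 1))

for d in range(1, 10):
    observed = digits[d] / N
    benford = math.log10(1 + 1 / d)   # Benford's predicted frequency
    print(d, round(observed, 3), round(benford, 3))
```

Roughly 30% of the powers start with 1 and under 5% start with 9, matching log10(1 + 1/d) closely.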

 

 

2bz

"lol.. beautiful mouth birdie, not bad on short notice."
...not a selfie

"I can just imagine Richard Feynman banging the bongo drums with the same beat as the musical masterpiece, in Copenhagen."

Uke8
Elroch wrote:
Uke8 wrote:

it's cool that you did that. however, your chart shows an error margin of what? less than 1% or so? either way it's very close to 2M each.
what puzzles me is why you don't get a really random and chaotic spread, like 500K in one experiment and 5M in another. how come the margin is so small?
explaining this with the phenomena of the large numbers doesn't really help, because it's not a real explanation; it amounts to asking me to accept the phenomenon without understanding the 'Why' or the 'How' (reminds me of the old QM saying... 'shut up and calculate')

Well, it comes down to counting the possibilities.

It is clearer with flipping a coin (because it has only 2 sides, not 6).

Flip a coin once and half the time you get 1 head, half the time 1 tail. Average fraction of heads is 1/2 but there is a lot of scatter. Every result is quite a way from the expected result of half a head!

Flip it twice and it is a bit different. 1/4 of the time you get 2 heads, 1/4 of the time 2 tails and 1/2  the time you get a head and a tail (in two different orders). The average proportion of heads remains 1/2, but now half the results are spot on (50/50) unlike with just one flip.

Now  flip a coin 4 times. This gives 16 possible results (as sequences of heads and tails)

1/16  4 heads,  1/16 4 tails    These extreme results are becoming rare.

4/16 3 heads and 1 tail,   4/16 3 tails and 1 head     Here the proportions are 3/4 or 1/4 instead of a half, which is only half the maximum variation, so not the expected proportion, but less far from it.

6/16 2 heads and 2 tails     The proportion is exactly 1/2 in this case.

Here the proportion of heads is getting more concentrated near the expected proportion.

 

The big idea here is that the larger the number of coins you toss, the more ways there are to get a result that is near to the expected proportion. Proportions a long way from 1/2 have a relatively small number of orders of heads and tails that can make them. It's a mathematical fact you can verify by counting.

So it all comes down to counting.

[After writing that quick account I looked for a better one. Here is a pretty good example from Sheffield University].
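Elroch's counting argument can be verified directly with binomial coefficients: as the number of flips grows, an ever-larger fraction of the 2^n equally likely head/tail sequences has a heads-proportion close to 1/2.

```python
from math import comb

# For n flips there are 2**n equally likely head/tail sequences, and
# comb(n, k) of them contain exactly k heads.  Count the fraction whose
# proportion of heads lands within 0.1 of 1/2.
fractions = {}
for n in (4, 20, 100, 1000):
    near = sum(comb(n, k) for k in range(n + 1) if abs(k / n - 0.5) <= 0.1)
    fractions[n] = near / 2 ** n
    print(n, round(fractions[n], 4))
```

For 4 flips only 6 of the 16 sequences qualify (0.375), but by 1000 flips essentially all sequences do; nothing random is being assumed here, it really is "just counting".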

actually this is an excellent answer, with insight into the mechanics of randomness.
looking at random from a different angle now, it's basically just a 'choice' between several possibilities, and not to be speculative, the 'choice' may be either deterministic or chaotic (for lack of a better word)
that also explains my dilemma about the proportions, because if i go back to the coin flips, the 'choice' is governed by the laws of physics, and given both obverse and reverse are identical, it just makes sense for the proportions to be as well.

going back to a single particle... i can't think of any 'mechanism' that will cheat some laws of physics and make a chaotic choice.

look, determinism sux. no one wants it. well, maybe a convict does, but that's a different conversation. so maybe there's another way out of it without relying on true randomness. maybe human behavior, like was suggested here.. lol

Uke8
Thee_Ghostess_Lola wrote:

and for all u coin spinners ?....try this one !

Spinning, rather than flipping, an old penny will land on tails something like 80 percent of the time. Lincoln's head is heavier than the Lincoln Memorial on the reverse, which leaves tails facing up more often than not.

if you ever find a 1943 copper ('red') penny, save it, because it's worth several hundred thousand dollars. (in '43 they used all the copper for the war and made pennies from zinc-coated steel instead)

Uke8

'The phrase ‘phenomena of the large numbers’ reminds me of Dirac’s large numbers hypothesis.'

 

any idea for the reasoning? It is mentioned here... 

https://www.youtube.com/watch?v=j1dKvoa2ITw

 

 

Thee_Ghostess_Lola

true randomness doesn't exist in numbers. so u can throw theoretical math out the window when trying to explain nature. it'll just be a false positive.

they say that's why roulette is such a dangerous game for casinos. so they hellalimit making bets directly on the #. it's all Norman Leigh's fault (France, 1966). lol !

KingAxelson

@ Uke.. Are you surprised that the subject of astronomy has not come up yet? Talk about randomness, that realm is filled with it. Or so it seems.

Anyhow, here is another strange observation of mine. And perhaps what makes it so strange, is that I’ve only noticed it in the past couple of years.

By my calculations, eight times out of ten, I’m the first in line at a stop light. Meaning before it turns green, and the person next to me almost always waits for me to go first. 

It doesn't matter if I’m driving my personal car, or a company semi. Same result, being local or on the road. Always doing the speed limit. Almost wish I could run some kind of experiment on this type of thing. Much like Elroch’s digital dice.

 

 

Elroch

Record a good sample of empirical data. It is very easy for an impression to be wrong.
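A minimal sketch of how such a record could be checked, treating "first at the light" as a fair-coin question (the sample size of 10 here is just for illustration): count how unlikely the observed tally would be if chance alone were at work.

```python
from math import comb

# Probability of seeing k or more "successes" in n trials if the true
# per-trial probability is p (here, 50/50 chance of being first in line).
def p_at_least(k, n, p=0.5):
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# "Eight times out of ten" under a fair 50/50 assumption:
print(round(p_at_least(8, 10), 4))   # ≈ 0.0547
```

About a 5% chance under pure luck: suggestive, but not conclusive from only ten observations, which is exactly why a good-sized sample matters.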

IJELLYBEANS
Uke8 wrote:

'The phrase ‘phenomena of the large numbers’ reminds me of Dirac’s large numbers hypothesis.'

 

any idea for the reasoning? It is mentioned here... 

https://www.youtube.com/watch?v=j1dKvoa2ITw

 

 


If you seek the insight behind why G might vary with time, then I recommend the answer displayed on Physics Stack Exchange:

https://physics.stackexchange.com/questions/500308/what-is-the-justification-for-diracs-large-numbers-hypothesis

There aren’t exactly a nimiety of websites dedicated to the hypothesis, in contrast to the aforementioned probability theory (which, I suppose, forms an integral part of QM). Yet I trust that you will find a decent selection.

Sillver1

that also explains my dilemma about the proportions, because if i go back to the coin flips, the 'choice' is governed by the laws of physics, and given both obverse and reverse are identical, it just makes sense for the proportions to be as well.

but there's nothing truly random about coins obeying the laws of physics. would you expect identical proportions if that scenario was absolutely random?

going back to a single particle... i can't think of any 'mechanism' that will cheat some laws of physics and make a chaotic choice.

Maybe your single particle is just confused

'look, determinism sux. no one wants it. well, maybe a convict does, but that's a different conversation'

why would a convict want determinism to be true?

Sillver1

'...and the person next to me almost always waits for me to go first.'

sounds like a reason for concern. lol.

I heard that Oregonians drive super politely. is that true?

Elroch

The relationship between probabilities and the real world is an (irresolvable) philosophical problem. In terms of Frequentist probability, there are no long runs in the real Universe, just finite ones (at least at any point in time) so no long run statistics ever actually apply.

When you try to see how closely a probabilistic prediction will approximate the real data, you find the meaning comes down to things like there is a certain probability of the real result being within a certain range of the probabilistic result. Then you need to interpret the meaning of that new probability. That leads you to a third one which needs interpreting and so on!

In terms of Bayesian probability, things are a bit simpler. You quantify beliefs and the way in which these beliefs change is uniquely determined by a small set of assumptions which are pretty inarguable. But this too provides no guarantees about what will happen, just that, in a sense, your quantitative beliefs are consistent and rational in the way they take account of information.

One very old idea is that when you don't know what is true, you make as neutral assumptions as possible. So if you know that a die can give you 6 results and you know nothing else, the best assumption is that all 6 possibilities are equally likely. However, when you move on to taking into account empirical data to revise your model of reality, this fails to be adequate.

Say for a coin (to simplify), if all flips of the coin are assumed to be independent, then there are a range of possibilities, each given by the probability that the coin will come up heads. Because there are only 2 mutually exclusive possibilities, the proportion of heads tells you absolutely everything. The problem is that you need to start with some belief of how likely all these possibilities are (to be revised later by empirical data).

For example, you might say there is 99% chance that a coin is fair, and 1% chance that it is biased, and then you need to pick some way to distribute that 1% across the other possibilities. Your choice. This is the hard truth of picking a prior in Bayesian statistics: there is no right answer. If you start by being really suspicious, you might start with the belief that there was 50% chance of the coin being fair, and the other 50% might be mostly concentrated at high probabilities of heads or high probabilities of tails. 

Given one of these starting points, empirical data revises your beliefs by being more consistent with some of the possibilities than others. Bayes' rule is how you do it. The nice thing is that whatever (reasonable) prior you start with, when you have a large enough amount of data, your final beliefs are pretty close to the same. This is the way empirical data overwhelms the prior in the end. Good news for anyone who wants to make sense of the world.
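That last point can be sketched with a coarse grid of nine candidate heads-probabilities (the grid and the "sceptical" prior below are made up for illustration): two quite different priors, updated by Bayes' rule on the same 200 flips, end up almost identical.

```python
# Discrete Bayesian update for a coin's unknown heads-probability.
# Candidate values for p:
ps = [i / 10 for i in range(1, 10)]   # 0.1, 0.2, ..., 0.9

def update(prior, heads, tails):
    # Posterior ∝ prior × likelihood, then normalise.
    post = [pr * p**heads * (1 - p)**tails for pr, p in zip(prior, ps)]
    z = sum(post)
    return [x / z for x in post]

uniform = [1 / 9] * 9                # neutral prior
sceptical = [0.3, 0.05, 0.05, 0.05,  # hypothetical prior weighted toward
             0.1,                    # heavily biased coins
             0.05, 0.05, 0.05, 0.3]

# Observed data: 140 heads in 200 flips (true p near 0.7).
a = update(uniform, 140, 60)
b = update(sceptical, 140, 60)
print(max(abs(x - y) for x, y in zip(a, b)))   # tiny: the data dominates
```

Whatever reasonable prior you start with, the likelihood term p**140 × (1-p)**60 dwarfs the prior weights after enough flips, which is the "data overwhelms the prior" point above.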

Uke8

@king, astronomy or astrology? ;) here's that eagle carrying a fish that appears like a monkey's head i told you about.. it's not edited, just still images from a video. (the talons combined with the fish give that random illusion of a head)



Uke8
DifferentialGalois wrote:
Uke8 wrote:

'The phrase ‘phenomena of the large numbers’ reminds me of Dirac’s large numbers hypothesis.'

 

any idea for the reasoning? It is mentioned here... 

https://www.youtube.com/watch?v=j1dKvoa2ITw

 

 


If you seek the insight behind why G might vary with time, then I recommend the answer displayed on Physics Stack Exchange:

https://physics.stackexchange.com/questions/500308/what-is-the-justification-for-diracs-large-numbers-hypothesis

There aren’t exactly a nimiety of websites dedicated to the hypothesis, in contrast to the aforementioned probability theory (which, I suppose, forms an integral part of QM). Yet I trust that you will find a decent selection.

sorry, i totally messed this one up. i was thinking of Benford's law and meant to ask if you know the reason why this pattern happens?

Uke8
Sillver1 wrote:

that also explains my dilemma about the proportions, because if i go back to the coin flips, the 'choice' is governed by the laws of physics, and given both obverse and reverse are identical, it just makes sense for the proportions to be as well.

but there's nothing truly random about coins obeying the laws of physics. would you expect identical proportions if that scenario was absolutely random?

i can't wrap my head around this one either, and i tried. on one hand i want to say that even proportions make sense and that's the final answer. on the other hand i'm asking myself why it makes sense, and myself answers... 'because everything empirical indicates an even spread of the proportions'. however, i reply... if true randomness doesn't exist, how would i ever know what real random proportions would be? is it possible that this 'making sense' or 'intuition' is just false and misleading? myself didn't reply to that yet. sounds loopy or what?

going back to a single particle... i can't think of any 'mechanism' that will cheat some laws of physics and make a chaotic choice.

Maybe your single particle is just confused

or maybe you're just projecting on that single lone particle : )

'look, determinism sux. no one wants it. well, maybe a convict does, but that's a different conversation'

why would a convict want determinism to be true?

because no one could be blamed for wrongdoings. it would make all our actions inevitable?

 

Uke8
Elroch wrote:

One very old idea is that when you don't know what is true, you make as neutral assumptions as possible. So if you know that if a die can give you 6 results and you know nothing else, the best assumption is that all 6 possibilities are equally likely. However, when you move on to taking into account empirical data to revise your model of reality, this fails to be adequate.

why do you assume that all possibilities are equally likely? i'm not referring to a physical bias of the die, which obeys the laws of physics (or a bias in empirical data resulting from pseudorandomness). instead, assume absolute randomness... why do we take an equal distribution for granted? (large numbers included)

KingAxelson
Elroch wrote:

Record a good sample of empirical data. It is very easy for an impression to be wrong.

Yes of course, rationality usually wins the day in the end. And I prefer that course as well. Still, challenging perceptions is a dominant pastime of mine.

Take for instance our friend Silver. He thinks it’s polite for the guy waiting at the green light with me to chill until I drive away first. Oddly enough, I never thought of it that way. My perception is that they just don’t want to take a leadership role. I’m offended by that conclusion. So until my perception can change on the matter, I’m somewhat stuck. 

And so I’ve relegated the issue to a random value. When people mill around in crowded places, it’s much the same thing to me. (Meaning walking around at the grocery store etc..) There are those that will crowd you, and those that will give you space. That reflection makes sense to me, the transference is damn near the same.

Thee_Ghostess_Lola

why would a convict want determinism to be true?

because no one could be blamed for wrongdoings. it would make all our actions inevitable?

 

...enter free will and how it all ties in w/ emotion.