Does True Randomness Actually Exist?

@elroch, maybe you want to explain to them that Bell disproved only the local hidden variables proposed by Einstein, and by no means disproved determinism? I'm not too hot about a thread of misinformation under my skirt.
You need to break causality in order to try to revive determinism. If you do that, there is no order of time, so nothing makes any sense. Far better to accept that determinism is dead in our Universe.

my return on 1B flips from the code above...
Counter({'Tails': 219860894, 'Heads': 219860894})
Counter({'Tails': 500027761, 'Heads': 499972239})
3789 seconds
so the counts were last level at about 220M flips apiece (~440M total) and tails took it home from there... yay!

You can see how useful numpy is, Ghostess? To do 200,000,000 trials per second (on my old i5) rather than in 12 minutes! The key is to do everything with arrays rather than individual samples.
i went with an object list & dict.
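(Her exact code isn't shown in-thread, but a minimal pure-Python sketch of that approach, using random.choice and collections.Counter, would produce output like the Counter lines above; the tie-tracking variable is added here purely for illustration. The per-flip Python-level loop is exactly why it takes an hour rather than seconds.)
import random
from collections import Counter
from time import time

N = 10**9                       # one billion flips
start = time()
counts = Counter()
last_tie = 0
for _ in range(N):
    counts[random.choice(('Heads', 'Tails'))] += 1   # one Python-level call per flip
    if counts['Heads'] == counts['Tails']:
        last_tie = counts['Heads']                   # most recent moment the counts were level
print(counts)
print("last tie at", last_tie, "per side")
print(time() - start, "seconds")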

Using numpy is good. A billion flips and analysis took 25 seconds. (With this number of flips the array is about 4 GB in RAM.)
Here is another view of the cumulative difference between the numbers of heads and tails for a billion samples.
Here the y-axis is scaled by sqrt(number of samples), because the standard deviation of the difference is proportional to that.
The x-axis is a log scale of the number of samples, because features like the number of zero crossings accumulate evenly on this scale.
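For anyone who wants to reproduce that view, here is a sketch of the idea (elroch's exact plotting code isn't shown in-thread; matplotlib is assumed, and a smaller sample is used so the plot stays responsive):
import numpy as np
import matplotlib.pyplot as plt

n_flips = 10**6                                     # smaller than a billion so plotting stays quick
steps = 2 * np.random.randint(2, size=n_flips) - 1  # +1 for heads, -1 for tails
walk = np.cumsum(steps)                             # running (heads - tails)
n = np.arange(1, n_flips + 1)
plt.plot(n, walk / np.sqrt(n))                      # y-axis scaled by sqrt(number of samples)
plt.xscale('log')                                   # log-scale x-axis, as described above
plt.xlabel('number of flips')
plt.ylabel('(heads - tails) / sqrt(n)')
plt.show()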
@elroch, maybe you want to explain to them that Bell disproved only the local hidden variables proposed by Einstein, and by no means disproved determinism? I'm not too hot about a thread of misinformation under my skirt.
You need to break causality in order to try to revive determinism. If you do that, there is no order of time, so nothing makes any sense. Far better to accept that determinism is dead in our Universe.
Really? Still obfuscating? I'm not mad, I'm just disappointed. I think it's best we go our separate ways.
To be fair, maybe you or optimissed can start your own determinism thread and have everyone follow you there.
Thank you everyone for participating. This thread is officially closed.
Be well and keep it real

Quantum mechanics is not about a "mechanism". It is solely about the statistical relationships of observations. Determinism implies that future observations can be predicted with no uncertainty given some information that exists in the past. Bell's experiment shows that is false, presuming only that information cannot travel faster than the speed of light (without which nothing makes sense, since there would be no meaning to an event being in the future of another event).

Oh, if time is the problem, there may be tricks to speed it up. Everything in the computer is 1s and 0s, so it works with them very quickly. Just pretend 1s are heads and 0s are tails (or the other way around).
Then you can use simple operations like AND and bit shifting, which computers do very quickly.
For example, XOR two bits: if it returns 0 you know they're the same, and if it returns 1 you know they're different. Then you shift to the next bit and do it again. (AND nearly works too, but 0 AND 0 outputs zero even though those bits match... XOR, same idea, gets it right.)
When the answer is zero, just pretend they "cancel out", and when the answer is 1, increment a counter. What you'll be left with is a counter showing the "extra" heads or tails. Add half of your starting number to the "extra" for heads, and subtract the "extra" from half the starting number for tails (or vice versa).
I've never tried this with Python, but I assume it's possible. You could also experiment with C++, for example; once you can do one language it's pretty easy to try the same things in another. Python tends to be slower, which is why I mention it.
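(I've not benchmarked this, but the simplest Python version of that pretend-heads-are-1s idea skips the pairwise XOR and just packs all the flips into one big integer and counts the 1-bits. int.bit_count needs Python 3.10+; bin(bits).count('1') works on older versions.)
import random
from time import time

N = 10**9                          # a billion flips
start = time()
bits = random.getrandbits(N)       # N random bits in one integer: 1 = heads, 0 = tails
heads = bits.bit_count()           # count the 1-bits (Python 3.10+)
tails = N - heads
print("Heads:", heads, "Tails:", tails, "surplus heads:", heads - tails)
print(time() - start, "seconds")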
Python is slow for such applications... it's an interpreted language, not compiled. Some Python developers will try to claim it is compiled *and* interpreted, because Python has a runtime "compiler" that takes the interpreted bytecode and runs it in the Python virtual machine, but since it actually interprets the code each time it is run, it is an interpreted language, and the step of interpreting the code is just wasted cycles vs. a compiled executable.
Compiled executables are themselves somewhat inefficient, but very few people write Assembly Language code anymore, so a good compiler is about as fast as you are going to get.
The app itself is simple. You could write a coin flip app in a batch file, but that would also be incredibly slow.
Oh, so python is simply bad for such things, ok.
Not really. Python is (fairly) slow for sequential operations. However, vectorising your code using numpy means it is quite fast.
For example, tossing a million coins a hundred times and printing out the difference between the number of heads and the number of tails in each of the 100 trials.
This code takes just over half a second.
import numpy as np
from time import time
start = time()
x = np.random.randint(2, size=(1000000, 100))  # 100 columns of a million flips each (1 = heads)
diffs = 2*np.sum(x, axis=0) - 1000000          # heads minus tails within each column
print(diffs)
print("That took", time()-start, "seconds")
Having noticed Ghostess's post above, here is the vectorised way to do that once, also taking about half a second.
import numpy as np
from time import time
start = time()
x = np.random.randint(2, size=100000000)  # a hundred million flips: 1 = heads, 0 = tails
diff = 2*np.sum(x) - len(x)               # heads minus tails
print("Surplus heads =", diff, "taking", time()-start, "seconds")
That's kind of the point, though. You wrote that code in a way that compiles and/or interprets faster, and that code snippet works largely unaltered in many languages. But your average Python programmer isn't going to write it that way.
I dispute that. Using numpy is very Pythonic, and is not in any sense an unusual or advanced technique. Virtually all my code starts with an import of numpy. numpy is 27 years old and is described accurately as "the fundamental package for scientific computing with Python".
"Vectorise" is generally a very early piece of advice given to Python programmers.

True. At this point, NumPy has become so ingrained in Python that it's basically considered vanilla Python.

@elroch, maybe you want to explain them that bell disproved Only local hidden variables proposed by Einstein, and by no mean disprove determinism? I'm not too hot about a thread of misinformation under my skirt.
Please do refer to some peer-reviewed work that backs up your objection.

It goes up and down a lot, in a major way. Is it me not understanding the axes, or are the flips a bit on the pseudo-non-random side? I would have expected very high odds against what appears to be a non-random distribution.
Well, I have transformed the data in two ways that change how it looks. The first is to rescale the differences as a ratio of the standard deviation of the difference (which grows with the square root of the number of flips). The second is to log-transform the x-axis, because this makes the behaviour look similar on any piece of the x-axis. (I think I have that right.) The net result is a graph which is centred on zero (net difference zero) and tends to stay fairly near it, with the frequency of the numbers on the y-axis behaving just like a standard normal distribution.
An interesting fact is that when the difference crosses zero it is likely to cross zero several more times in the near future, and when it gets far from zero it is likely to take quite a while to get back. At any time the coins don't know what the running total is, so, untransformed, it is a random walk starting from wherever it happens to be.
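(That clustering is easy to see numerically. A quick sketch, counting the walk's returns to zero and the gaps between them:)
import numpy as np

steps = 2 * np.random.randint(2, size=10**7) - 1   # +1 heads, -1 tails
walk = np.cumsum(steps)
zeros = np.flatnonzero(walk == 0)                  # times the difference is exactly zero
gaps = np.diff(zeros)                              # waiting times between returns
print(len(zeros), "returns to zero")
print("median gap:", np.median(gaps), " longest gap:", gaps.max())
Typically the median gap is tiny while the longest gap is a sizeable fraction of the whole run, which is exactly the clustering described above.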

Oh, calling it a random walk, that's a good observation... and of course an infinite random walk will visit all states... including all heads and all tails

You could try to say that the randomness of elementary particles doesn't scale up in a meaningful way, and therefore some version of determinism effectively exists.
I don't know the standard arguments for such things; this is just what I tend to think... that at least with free will, we probably have a depressingly small amount.

Oh, calling it a random walk, that's a good observation... and of course an infinite random walk will visit all states... including all heads and all tails
Well, you will get all possible finite sequences with probability 1 (i.e. probability zero that even one of the infinite number of them is missing).
You will also visit every possible difference in the numbers of heads and tails with probability 1.
But there is also probability 0 of getting an infinite sequence of all heads or all tails, so that has probability 0 in a countable sample of infinite sequences too. (Thus you'd certainly need an uncountable number of tries.)
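(The finite-sequence part is easy to check empirically for short patterns; for example, every one of the 2^10 possible length-10 patterns reliably shows up within a million flips:)
import random

k = 10
flips = ''.join(random.choice('HT') for _ in range(10**6))  # a million flips as a string
seen = {flips[i:i + k] for i in range(len(flips) - k + 1)}  # every length-k window observed
print(len(seen), "of", 2 ** k, "possible patterns of length", k)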
NOTICE TO ALL PARTICIPANTS: UKE8 HAS ANNOUNCED THE "CLOSING" OF THIS TOPIC AND BLOCKED SOME PARTICIPANTS. PLEASE CAN EVERYONE INTERESTED MOVE THE DISCUSSION TO A NEW FORUM OPEN TO ALL? Please post there to acknowledge.
but since it actually interprets the code each time it is run,
lol!... can u even return hello world in python?