Does True Randomness Actually Exist?

Elroch

I do not know what you mean.

The sort of thing I mean is this. Someone is observing a spaceship moving past at tremendous speed and sees two events occurring on the spaceship that are (of necessity) separated by more than a Planck time in their frame. But in the time-dilated frame of the spaceship these two events occur closer together. Maybe less than a Planck time apart. If so, their times cannot be distinguished.

I feel there is some understanding to be gleaned by resolving this.

TheBestBeer_Root
Optimissed wrote:
TheBestBeer_Root wrote:
Elroch wrote:

Randomness is basically that which is not known. I know that sounds too simple but it is justifiable.

……etc

…is consciousness known? What if I said all IS Consciousness, yet as there’s rated and unrated games so do we have the majority of consciousness in the un sense.. Why, you ask? That answer entails delving into what’s very sadly not allowed, therefore we continue on with such random pages of lmao asking whether randomness exists. 😂


I wouldn't think rocks are conscious so maybe you mean it in another sense of the word?

I’m certainly meaning from a not of this worldly  ‘sense of perspective’ . Only from the true spiritual reality. Something I’m absolutely certain of, given ONE is All in All, One that is not even allowed to be discussed. …..sadly to say =/ 

…..The ONE That IS Life.

TheBestBeer_Root

…. lol I could give a flying flip fiddle to any worldly perspectives 😂

Elroch
Optimissed wrote:
Elroch wrote:

I do not know what you mean.

The sort of thing I mean is this. Someone is observing a spaceship moving past at tremendous speed and sees two events occurring on the spaceship that are (of necessity) separated by more than a Planck time in their frame. But in the time-dilated frame of the spaceship these two events occur closer together. Maybe less than a Planck time apart. If so, their times cannot be distinguished.

I feel there is some understanding to be gleaned by resolving this.


Yes, that's the sort of thing, frames being delineated by their velocities relative to one another. At a guess, it might mean that Planck Time is not a fixed constant but a variable governed by relativity.

The thing is that the Planck time is a fixed time, about 10^-43 seconds.
In some cases - the Planck mass and Planck energy - the quantity is not a minimum in any sense, since the Planck mass is much larger than the masses of known particles and atoms (and the Planck energy enormously greater than that of most photons etc.). The Planck mass does have a meaning as the smallest possible black hole mass, I believe, for technical reasons.

Thee_Ghostess_Lola

sleep tite opti...

llama36
Elroch wrote:
Wits-end wrote:

Maybe a dumb question, (yes, yes I remember what Dad said about dumb questions) naive might be a better choice of words, but here goes… if true randomness exists, is there an overarching structure to it that we are unable to comprehend? 

Probability theory is the quantitative study of randomness. We understand it rather well, even though it seems so difficult to pin down.

As a simple example, we all have some sort of intuitive concept of an event that has two outcomes where the outcomes are equally likely. This can be viewed as the unique symmetrical probability distribution for two outcomes, or as the highest entropy state of knowledge (the greatest uncertainty, or "randomness").

But it seems impossible to pin down exactly what it means for an event to have two equally probable outcomes when, in the real world, the only possibilities are that one or the other happens.

To see why it is so difficult to pin down, we should try to do so. We could say what it means is that, if the event were repeated many times, we would get very similar numbers of each outcome. But how similar? Any number of each is possible, but unbalanced numbers are increasingly unlikely. But what is unlikely? We could say that if you toss a coin 1,000,000 times there is only some small probability that the difference in the number of heads and tails is more than 10,000 (1%), and the probability of this is much smaller if we do it 1,000,000,000 times.

But now we have defined what one probability (those for one coin) means by using the probability of some more complex event.  But that leaves the question of how we define the probability of this more complex event, which is no simpler than the question we are trying to answer.

Of course to do this properly you would imagine doing it for an infinite number of tosses and say the limit of the proportion is the probability.  Well, at least it is so with probability 1. It is not impossible to toss a fair coin an infinite number of times and have the proportion of heads always more than 1% higher - it just has probability zero.

So the ground keeps vanishing. Yet we know it makes sense.

Can you elaborate on the highlighted portion? By definition this is impossible yes?

An infinite number of flips of a fair coin will contain all possible sequences of finite length. So while some groups of sequences will have the proportion of heads be 1% higher, there will be an equal number of groups with tails 1% higher due to symmetry.

If it does not contain all finite sequences, then it wasn't an infinite number of flips (or wasn't a fair coin).

Yes, the "ground vanishes" in the sense that we need axioms, but I'm not sure about the highlighted statement.
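The 1% claim quoted above is easy to probe numerically. A minimal sketch (fair coin, Python's standard random module; the helper name is my own):

```python
import random

def count_heads(n: int, seed: int) -> int:
    """Count heads in n fair-coin flips, drawing n random bits at once."""
    random.seed(seed)
    return bin(random.getrandbits(n)).count("1")

n = 1_000_000
heads = count_heads(n, seed=42)
tails = n - heads
# The difference heads - tails has standard deviation sqrt(n) = 1000,
# so a gap of 10,000 (1% of n) would be a 10-sigma event: possible in
# principle, but astronomically unlikely.
print(abs(heads - tails))
```

Any run comes out within a few thousand of an even split. Of course, "astronomically unlikely" is itself a probabilistic statement, which is exactly the regress being discussed: the simulation illustrates probability, it does not define it.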

llama36

Although, this is kind of gross... I've never thought about the cardinality of infinite coin flips heh...

An infinite number of flips could contain an infinite subset of infinite flips the same way the counting numbers have an infinite subset of even numbers, an infinite subset of numbers divisible by 5, etc.

So, for example, if you flip a coin an infinite number of times, even if it's a fair coin, it could come up heads an infinite number of times? This is unintuitive and weird.

terryza
Nothing really happens randomly there must have been a cause of action some where somehow for whatever that is about to happen
llama36
terryza wrote:
Nothing really happens randomly there must have been a cause of action some where somehow for whatever that is about to happen

Nah, quantum stuff breaks causality all the time, e.g. Bell's theorem.

Elroch
llama36 wrote:
Elroch wrote:
[snip]
It is not impossible to toss a fair coin an infinite number of times and have the proportion of heads always more than 1% higher - it just has probability zero.

So the ground keeps vanishing. Yet we know it makes sense.

Can you elaborate on the highlighted portion? By definition this is impossible yes?

An infinite number of flips of a fair coin will contain all possible sequences of finite length.

No, not necessarily. This is true with probability 1, but it is easy to describe sequences that don't. E.g.:

H T H T H T H T ...

So while some groups of sequences will have the proportion of heads be 1% higher, there will be an equal number of groups with tails 1% higher due to symmetry.

If it does not contain all finite sequences, then it wasn't an infinite number of flips (or wasn't a fair coin).

Yes the "ground fanishes" in the sense that we need axioms, but I'm not sure about the highlighted statement.

I hope it is clear now.

Note that it is also easy to describe a sequence with any proportion of heads you choose (any real number in the interval [0,1]). So such sequences are not impossible, just unlikely.

Here is a general way to describe a specific sequence with the limit of the proportion of heads being p (where p is any real value in [0, 1]).

The first coin is a head. To determine what each coin is after that, if the proportion of heads so far is greater than p, it is a tail. If it isn't, it is a head.

It is easy to see the limit of the proportion of heads in this sequence is p.

[As an aside, note that there are many sequences with no limit for the proportion of heads. But the probability of such a sequence is zero. This is a consequence of the stronger result that for any small positive delta, it is true with probability 1 that there exists N such that  the proportion of heads in the first M elements of the sequence is always between 1/2 - delta and 1/2 + delta if M > N].

Sorry to non-mathematicians for any headaches.
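The greedy construction described above can be sketched in a few lines (Python; the function name is my own):

```python
def biased_limit_sequence(p: float, n: int) -> list:
    """Elroch's construction: first flip is "H"; thereafter, a flip is "T"
    if the running proportion of heads exceeds p, else "H"."""
    seq = ["H"]
    heads = 1
    for i in range(1, n):          # i = number of flips made so far
        if heads / i > p:
            seq.append("T")
        else:
            seq.append("H")
            heads += 1
    return seq

seq = biased_limit_sequence(0.3, 100_000)
print(seq.count("H") / len(seq))   # close to 0.3
```

Because the rule corrects any drift away from p at every step, the running proportion converges to p for any p in [0, 1], showing such sequences exist even though each one has probability zero under fair flipping.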

 

llama36
Elroch wrote:
llama36 wrote:
Elroch wrote:
[snip]
It is not impossible to toss a fair coin an infinite number of times and have the proportion of heads always more than 1% higher - it just has probability zero.

So the ground keeps vanishing. Yet we know it makes sense.

Can you elaborate on the highlighted portion? By definition this is impossible yes?

An infinite number of flips of a fair coin will contain all possible sequences of finite length.

No, not necessarily. This is true with probability 1, but it is easy to describe sequences that don't. E.g.:

H T H T H T H T ...

So while some groups of sequences will have the proportion of heads be 1% higher, there will be an equal number of groups with tails 1% higher due to symmetry.

If it does not contain all finite sequences, then it wasn't an infinite number of flips (or wasn't a fair coin).

Yes, the "ground vanishes" in the sense that we need axioms, but I'm not sure about the highlighted statement.

I hope it is clear now.

Note that it is also easy to describe a sequence with any proportion of heads you choose (even an arbitrary real value in the interval [0,1]). So such sequences are not impossible, just unlikely.

 

I guess I don't understand what you mean by "this is true with probability 1"

I'm trying to think of a better way to say what I was thinking.

Let's say we try to categorize infinitely long sequences of coin flips. Some will contain all H, some H T H T repeating, etc., while others (due to having no pattern) will contain every possible finite sequence. I'm guessing there must be some theorem about there being essentially infinitely many more sequences that have maximum entropy.

So could we say that the probability of producing a low entropy sequence after infinite flips is actually zero?

Elroch

Well, as an example, it is certainly possible for an infinite sequence of coin flips to all be heads. (It is just as possible as any other specific sequence of results).

It also has probability 0

To prove this, observe that, for any N, the probability P of this is no more than the probability of getting N heads in a row, which is 2^-N.

Thus P <= 2^-N for all N,

and the only non-negative P that satisfies this is P = 0.

The flipside of this is that NOT getting all heads has probability 1.
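The 2^-N bound also shows up empirically as the longest run of heads you ever see: in n flips it is typically around log2(n), since each extra head halves the probability. A quick sketch (Python; the helper name is my own):

```python
import random

def longest_head_run(n: int, seed: int) -> int:
    """Length of the longest run of consecutive heads in n fair flips."""
    rng = random.Random(seed)
    best = cur = 0
    for _ in range(n):
        if rng.random() < 0.5:   # heads
            cur += 1
            best = max(best, cur)
        else:                    # tails
            cur = 0
    return best

# log2(1000) ~ 10 and log2(1_000_000) ~ 20: the longest run grows only
# logarithmically, the practical face of P(N heads in a row) = 2^-N.
for n in (1_000, 1_000_000):
    print(n, longest_head_run(n, seed=7))
```

Multiplying the flips by a thousand only adds roughly ten to the longest run, which is why an infinite run of heads, while possible, has probability zero.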

llama36
Elroch wrote:

Well, as an example, it is certainly possible for an infinite sequence of coin flips to all be heads. (It is just as possible as any other specific sequence of results).

It also has probability 0

To prove this, observe that, for any N, the probability P of this is no more than the probability of getting N heads in a row, which is 2^-N.

Thus P <= 2^-N for all N,

and the only non-negative P that satisfies this is P = 0.

Sure, just like each individual point under a continuous distribution has probability zero, yet integrating over all of them gives 1.

llama36

I guess it goes back to what you're saying about a more complex event.

Sure, each flip is independent, and every sequence is equally likely.

But I suppose the set of infinite sequences that have maximum entropy is infinitely larger than sets with less entropy... maybe not true for coin flips, but I imagine in most cases this is true. Maybe this is why in reality we never flip an infinite sequence of H... or maybe that's just a convenient way of thinking about it.

llama36
Optimissed wrote:
llama36 wrote:

Although, this is kind of gross... I've never thought about the cardinality of infinite coin flips heh...

An infinite number of flips could contain an infinite subset of infinite flips the same way the counting numbers have an infinite subset of even numbers, an infinite subset of numbers divisible by 5, etc.

So, for example, if you flip a coin an infinite number of times, even if it's a fair coin, it could come up heads an infinite number of times? This is unintuitive and weird.


Yes but an infinite row of either heads or tails is impossible in that it cannot possibly occur within an infinite number of coin flips. Think about it.

The events are independent, so it's possible to flip heads indefinitely.

But sure, the limit of 1/2^n as n -> infinity is zero.

And as a practical matter you can't repeat an event infinitely many times.

Elroch

First, it's worth headlining that I have been glibly writing as if there is a measure on the set of all sequences of flips of a fair coin. There is a powerful general result about the existence of a measure on the Cartesian product of an infinite number of measure spaces, but this does not mean that every set of sequences is measurable. It's worth trying to do this and seeing where it leads.

Given a set of sequences S, for each N consider the set of truncated sequences formed from the first N flips of the sequences in S. This defines a number

P(S, N) = K / 2^N

where K is the number of distinct initial sequences of length N in S.

P(S, N) is (easily checked) non-increasing in N, so it has a limit in the interval [0, 1]. I would like this limit to be the probability of S, but it would need to satisfy the axioms.
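As a toy illustration of P(S, N) (Python; finite strings stand in for infinite sequences here, and the names are my own):

```python
from itertools import product

def truncation_probability(S, N: int) -> float:
    """P(S, N): distinct length-N prefixes of members of S, divided by 2^N."""
    return len({s[:N] for s in S}) / 2 ** N

# Toy set: all length-4 flip strings that begin "HH".
S = {"".join(t) for t in product("HT", repeat=4) if t[:2] == ("H", "H")}
vals = [truncation_probability(S, N) for N in range(1, 5)]
print(vals)  # [0.5, 0.25, 0.25, 0.25] -- non-increasing, as claimed
```

Each extra flip can at most double the number of distinct prefixes while 2^N always doubles, which is why the sequence of values can never increase.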

To be a probability measure for all sets of sequences (seems too much to hope for) the complement of a set S (denoted ¬S) would have to have probability that is 1 - P(S).  To prove this would require proving:

P(¬S) >= 1 - P(S)

and

P(¬S) <= 1 - P(S)

The first follows from the fact that for any N,  it is quite easy to see that P(S, N) + P(¬S, N) >=1

This is clear because every sequence of N flips has to appear in either S or ¬S (it could be in both). So K + K' >= 2^N

(K' being the number of initial sequences in ¬S)

Is it possible to prove the inequality the other way? I think not.

It's easy to define a set S so that both S and ¬S contain every possible finite sequence at the start. All you need to do is include all sequences, all but finitely many of whose values are "H", in S, and include all sequences, all but finitely many of whose values are "T", in ¬S.

[Note there are a lot of other sequences whose location is not specified]

Both S and ¬S contain sequences with every possible finite start. So that demolishes my naive attempt at defining the probability of an arbitrary set of sequences.  It's a safe bet this can't be fixed.

However, the attempted definition of probability is in some sense a meaningful bound on the size of a set of sequences. It works fine for probability zero: when the probability of the starts of the sequences goes to zero, it clearly makes sense to say that the probability of the set of sequences (a subset of the set of all sequences with those starts) is zero.

Likewise where I referred to "probability 1" this can make sense if we can bound the complement to have probability 0.

For example, consider the set of sequences that does not contain a single specific subsequence SS, let's say it's of length N. Then it doesn't contain SS at index 1:N, and it doesn't contain SS at index (N+1):2N and so on.  Each of these conditions reduces the bound on the probability by a factor of (1 - 2^-N). Thus the upper bound on the probability of this set of sequences is smaller than any positive number, so it is zero.

So that glib claim was meaningful.

Since a countable union of sets of probability zero has probability zero, we can see that it is also meaningful to say that the probability of a sequence containing all finite subsequences is 1. (The probability of a sequence lacking at least one of them is zero, by that countable union.)
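The (1 - 2^-N) block argument above can be checked by simulation. A sketch (Python; the helper name is my own), estimating how often finite prefixes miss the pattern "HHH":

```python
import random

def missing_fraction(pattern: str, length: int, trials: int, seed: int = 0) -> float:
    """Estimated probability that `length` fair flips never contain `pattern`."""
    rng = random.Random(seed)
    misses = sum(
        pattern not in "".join(rng.choice("HT") for _ in range(length))
        for _ in range(trials)
    )
    return misses / trials

n = 3  # pattern length
for length in (30, 90, 270):
    # Disjoint-blocks bound from the argument above: each of the length // n
    # blocks independently fails to equal the pattern with probability 1 - 2^-n.
    bound = (1 - 2 ** -n) ** (length // n)
    print(length, missing_fraction("HHH", length, trials=2000), round(bound, 4))
```

As the length grows, the bound (and with it the estimate) is squeezed toward zero; the infinite-sequence statement is the limit of this squeeze.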

 

Thee_Ghostess_Lola

im not really sure why ppl keep using small-time coin-flipping as the answer to all things random. plz tell me u u/s that it isnt. and if u sadly do ?...plz tell me gravity is NOT affecting the toss yet gravity is not static anywhere ? so plz stop w/ the arithmetic and go to summa our U-laws ?...like space & temperature & light & stuff like that.

help my frustration - pretty plz ?

Elroch
Optimissed wrote:
Elroch wrote:

Well, as an example, it is certainly possible for an infinite sequence of coin flips to all be heads. (It is just as possible as any other specific sequence of results).

(Elroch is mistaken here, however. I explained why it is not possible.)

It also has probability 0

That means it's impossible.

This is not a logically consistent position.

To see this, observe that EVERY specific infinite sequence of results has probability zero. Do you really want to say that every infinite sequence of results is impossible? This is not useful as it contradicts the existence of such sequences, hence the entire analysis.

The lack of identification between "possible" and "non-zero probability" is characteristic of measures on uncountably infinite sets (probability distributions on countably infinite sets are not much different to those on finite sets).  For example, quantum mechanics considers the probability distribution of the location of a particle. For all but extreme distributions, the probability of a set containing a single location is zero. But it would not be useful to say all locations are impossible.
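The continuous analogue is easy to see concretely. A sketch (Python; note that floats only approximate the continuum, so "probability zero" is only approximate here):

```python
import random

rng = random.Random(0)
x = rng.random()  # some value in [0, 1) is always produced...
# ...yet the chance that any later draw hits this pre-specified value
# is (essentially) zero:
collisions = sum(rng.random() == x for _ in range(1_000_000))
print(x, collisions)
```

The value x happened, so it was certainly possible; but as a pre-specified target its probability is negligible. That is the float-sized version of "probability zero does not mean impossible".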

Elroch
Optimissed wrote:
Elroch wrote:
Optimissed wrote:
Elroch wrote:

Well, as an example, it is certainly possible for an infinite sequence of coin flips to all be heads. (It is just as possible as any other specific sequence of results).

(Elroch is mistaken here, however. I explained why it is not possible.)

It also has probability 0

That means it's impossible.

This is not a logically consistent position.

To see this, observe that EVERY specific infinite sequence of results has probability zero. Do you really want to say that every infinite sequence of results is impossible? This is not useful as it contradicts the existence of such sequences, hence the entire analysis.

The lack of identification between "possible" and "non-zero probability" is characteristic of measures on uncountably infinite sets (probability distributions on countably infinite sets are not much different to those on finite sets).  For example, quantum mechanics considers the probability distribution of the location of a particle. For all but extreme distributions, the probability of a set containing a single location is zero. But it would not be useful to say all locations are impossible.

I'm afraid this is one of those situations where you have failed to follow the logic. I could explain it but why don't you try to see it for yourself? This follows the same sort of logic as the kind of conundrum that has it that the hare can never overtake the tortoise.

It is you who should be able to understand that if you want to study a set of sequences and deduce (incorrectly) that every specific sequence is "impossible" then you are doing it wrong. Read any book dealing with measure spaces on uncountable domains and you will see my viewpoint is correct.

There is no doubt I'm right and I will explain it if you don't see it for yourself.

See above. But go ahead and entertain.

However, I could point out an inconsistency in your statements, which might help you to pinpoint the source of error. You explained, I think to Llama, that an infinite series of heads and tails must contain every conceivable finite sequence of heads and tails.

No, there is definitely no post of mine that says that. It is trivial to exhibit the counterexample of the sequence only containing heads.

Rather, I proved the striking result that the probability of the set of sequences that does not contain every finite sequence of heads and tails is zero.

 

GraysonRosenbaumm
Ow, my brain hurts