Does True Randomness Actually Exist?

Elroch

No. With all due respect, your understanding needs improving. We are talking about what is called a stochastic process (a sequence of random variables); specifically, a Bernoulli process.

This is mathematics, and happens to be a subject I have used a lot. Familiarity with the subject would help you to reason about it correctly.

Elroch
Optimissed wrote:

For an  infinite series of coin tosses, the probability of it occurring tends towards zero as the series tends towards its limit, so that at its limit of infinity, the probability is nominally zero. Yet in another part of the same discussion, you stated that any infinite series of binary choices contains every possible finite series.

No, I did not: it takes me one second (literally) to see the absurdity of such a claim. And you have no excuse for claiming this a second time, as I patiently clarified the point in my previous post #3876.

Also, your language is not precise enough. You (as well as I) need to be precise and formal to be reliable.

I have been attempting this after being insufficiently so early on.

For example, the probabilities we assign to (some) sets of sequences are based on the rule that the set of sequences starting with a specific run of N results has probability 2^-N. From this you can deduce that the probability of a single infinite sequence is zero, as I would intuitively expect (because given any delta > 0, there is a set of probability less than delta which contains it). This is the more precise statement of the first half of your quoted post. (The second half is incorrect.)
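To make the 2^-N prefix rule concrete, here is a small Python sketch (my own illustration, not part of the original posts) that counts, among all equally likely finite flip sequences, the fraction beginning with a given start:

```python
from itertools import product

def prefix_probability(prefix, total_length):
    """Fraction of all equally likely sequences of `total_length` fair
    coin flips that begin with `prefix` (an H/T string)."""
    matches = sum(
        1 for seq in product("HT", repeat=total_length)
        if "".join(seq).startswith(prefix)
    )
    return matches / 2 ** total_length

# The probability of any specific N-flip start is 2^-N, independent
# of how long the full sequence is.
print(prefix_probability("H", 10))     # 0.5
print(prefix_probability("HHH", 10))   # 0.125
# As N grows, pinning down the whole start has probability tending to 0,
# which is why any single infinite sequence has probability zero.
```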

Elroch
Optimissed wrote:
Elroch wrote:
Optimissed wrote:

For an  infinite series of coin tosses, the probability of it occurring tends towards zero as the series tends towards its limit, so that at its limit of infinity, the probability is nominally zero. Yet in another part of the same discussion, you stated that any infinite series of binary choices contains every possible finite series.

No, I did not: it takes me one second (literally) to see the absurdity of such a claim. And you have no excuse for claiming this a second time, as I patiently clarified the point in my previous post #3876

I didn't see the absurdity. Tell me why, all of a sudden, you find it absurd. Or, to be fair, tell me why it's absurd. I thought it was a bit questionable but I didn't think it absurd. You are certainly not more intelligent than I, so how come you found it absurd in one second? We can leave, for the moment, the fact that I think you did say that but, of course, may have altered the relevant post. However, it's possible I mixed you up with someone else, but I don't think so. If you didn't write that today then who did?

So why cannot an infinite series contain every possible finite series?

You are confusing yourself again.

Your earlier post discussed the proposition "ANY infinite series of binary choices contains every possible finite series." (your words)

Now you are talking about an entirely different proposition "does there exist an infinite series of binary choices that contains every possible finite series?" (my wording of your quote above)

To disprove a statement like the first ("ANY ...") you just need to exhibit a counterexample. My earlier linked post exhibited the very simple counterexample: the sequence consisting of nothing but heads. [The simplicity of the counterexample is why it would be absurd not to believe one existed.]

To answer the second question affirmatively, you just need to exhibit a positive example. This is not difficult: have a go!
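For the second question, one explicit positive example (my own construction, not from the thread) is to concatenate every finite H/T string in order of length; every finite pattern then appears as one of the blocks:

```python
from itertools import islice, product

def universal_flips():
    """Generator for an explicit infinite H/T sequence that contains
    every finite H/T string: concatenate all strings of length 1, then
    length 2, then length 3, ... (an H/T analogue of the Champernowne
    construction)."""
    n = 1
    while True:
        for block in product("HT", repeat=n):
            yield from block
        n += 1

# A 200-flip initial segment already contains every pattern of length <= 4,
# since the length-4 blocks finish by flip 2 + 8 + 24 + 64 = 98.
start = "".join(islice(universal_flips(), 200))
for pattern in ["HHHH", "THTH", "TTTT"]:
    print(pattern, pattern in start)
```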

 

llama36
Elroch wrote:

First, it's worth headlining that I have been glibly writing as if there is a measure on the set of all sequences of flips of a fair coin. There is a powerful general result about the existence of a measure on the cartesian product of an infinite number of measure spaces, but this does not mean that every set of sequences is measurable. It's worth trying to construct one and seeing where it leads.

Given a set of sequences S: for each N, consider the set of truncated sequences formed from only the first N flips of the sequences in S. This defines a number

P(S, N) = K / 2^N

where K is the number of distinct initial sequences of length N in S.

P(S, N) is clearly (easily checked) non-increasing in N, so its limit is a number in the interval [0, 1]. I would like this limit to be the probability of S, but it would need to satisfy the axioms.
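A concrete illustration (my own example, not from the post): take S to be the set of sequences that never show two heads in a row, count the valid length-N prefixes, and watch K / 2^N shrink:

```python
from itertools import product

def K(n):
    """Number of length-n H/T strings with no 'HH': the valid length-n
    prefixes of the set S of sequences never showing two heads in a row."""
    return sum(
        1 for seq in product("HT", repeat=n)
        if "HH" not in "".join(seq)
    )

# P(S, N) = K / 2^N is non-increasing, so it converges. Here the counts
# K(n) happen to follow the Fibonacci recurrence K(n) = K(n-1) + K(n-2).
probs = [K(n) / 2 ** n for n in range(1, 15)]
print([round(p, 4) for p in probs])
```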

To be a probability measure for all sets of sequences (seems too much to hope for) the complement of a set S (denoted ¬S) would have to have probability that is 1 - P(S).  To prove this would require proving:

P(¬S) >= 1 - P(S)

and

P(¬S) <= 1 - P(S)

The first follows from the fact that for any N,  it is quite easy to see that P(S, N) + P(¬S, N) >=1

This is clear because every sequence of N flips has to appear at the start of a sequence in either S or ¬S (it could be in both). So K + K' >= 2^N

(K' being the number of initial sequences in ¬S)

Is it possible to prove the inequality the other way? I think not.

It's easy to define a set S so that both S and ¬S contain every possible finite sequence at the start. All you need to do is include in S all sequences all but finitely many of whose values are "H", and include in ¬S all sequences all but finitely many of whose values are "T".

[Note there are a lot of other sequences whose location is not specified]

Both S and ¬S contain sequences with every possible finite start. So that demolishes my naive attempt at defining the probability of an arbitrary set of sequences.  It's a safe bet this can't be fixed.

However, the attempted definition of probability is, in some sense, a meaningful bound on the size of a set of sequences. It works fine for probability zero: when the probability of the set of starts of the sequences goes to zero, it clearly makes sense to say that the probability of the set of sequences (a subset of the set of all sequences with those starts) is zero.

Likewise where I referred to "probability 1" this can make sense if we can bound the complement to have probability 0.

For example, consider the set of sequences that do not contain a specific subsequence SS; say SS has length N. Then such a sequence doesn't contain SS at indices 1:N, doesn't contain SS at indices (N+1):2N, and so on. Each of these conditions multiplies the upper bound on the probability by a factor of (1 - 2^-N). Thus the upper bound on the probability of this set of sequences is smaller than any positive number, so it is zero.

So that glib claim was meaningful.
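The shrinking bound is easy to check numerically (an illustrative sketch of mine, using the disjoint-window bound above):

```python
def avoid_bound(pattern_length, blocks):
    """Upper bound on the probability that a sequence of fair flips avoids
    a fixed pattern of length N at each of `blocks` disjoint N-flip
    windows: (1 - 2^-N) ** blocks."""
    return (1 - 2 ** -pattern_length) ** blocks

# With N = 4, each disjoint window misses the pattern with probability
# 15/16; enough windows drive the bound below any positive epsilon.
for k in [10, 100, 1000, 10000]:
    print(k, avoid_bound(4, k))
```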

Since a countable union of sets of probability zero has probability zero, we can see that it is also meaningful to say that the probability of a sequence containing all finite subsequences is 1. (The probability that a sequence lacks at least one of them is zero, from that countable union.)

I'm not sure I followed everything that came before it, but the highlighted part made me smile, that's a clever way of thinking about it. I was just assuming it should be true, but your approach was more formal.

llama36
Optimissed wrote:

Let's say that hypothetically, you make a series of coin flips. Say you do 10^20 coin flips and each is heads. Hardly likely but nominally possible. All you have to do is keep flipping until you get a tails. That means that no longer is an infinite series of heads possible.

But the probability of flipping tails doesn't increase or decrease during this exercise.

Because the probability of flipping tails is never 100%, it's possible that you will never flip tails.

Elroch
llama36 wrote:

Since a countable union of sets of probability zero has probability zero, we can see that it is also meaningful to say that the probability of a sequence containing all finite subsequences is 1. (The probability that a sequence lacks at least one of them is zero, from that countable union.)

I'm not sure I followed everything that came before it, but the highlighted part made me smile, that's a clever way of thinking about it. I was just assuming it should be true, but your approach was more formal.

Thanks. It was nice that some results that were interesting were easy enough to prove.

Given that I failed to be entirely clear earlier it's worth formalising the definition of the measure space a bit.

It's the implementation of the general construction of a measure space on a countable product of measure spaces. The individual measure spaces are of course the individual coin flips, each a space with two events of probability 1/2. This measure extends trivially to subsets of the set of sequences where all but a finite number of the coin flips are unspecified (this is part of the general construction), i.e. the set which specifies N of the coin flips has measure (= probability) 2^-N.

The general construction then extends to the sigma-algebra of sets of sequences generated by these simple subsets. This is closed under three operations: countable union, countable intersection and set complement. A set defined by any of these operations has a uniquely defined measure. [This paragraph may be a bit obscure to non-mathematicians, especially as I am omitting the details. Here they are in "case 1" of this paper: http://alpha.math.uga.edu/~pete/saeki.pdf].

You can consistently enlarge the space of measurable sets further by extending the sigma-algebra with all subsets of sets of measure zero (each allocated measure zero, of course).

I had this in mind when I referred to the probability of certain sets.

Note the proper construction does better than my naive attempt earlier on, which only used part of the construction. For example, the set of sequences which end in an infinite run of heads is a countable union of (single sequence) sets, each of which (like all single sequence sets) has measure zero, so it has measure (i.e. probability) zero.

 

llama36
Optimissed wrote:
llama36 wrote:
Optimissed wrote:

Let's say that hypothetically, you make a series of coin flips. Say you do 10^20 coin flips and each is heads. Hardly likely but nominally possible. All you have to do is keep flipping until you get a tails. That means that no longer is an infinite series of heads possible.

But the probability of flipping tails doesn't increase or decrease during this exercise.

Because the probability of flipping tails is never 100%, it's possible that you will never flip tails.


No, it isn't possible. Literally impossible. It's a formalised way of thinking, which you've learned from Elroch, unfortunately. It means you think ideals are real but also it means that neither of you understand the meaning of infinity. In particular, you don't understand the meaning of "never".

Formalized ways of thinking create things like medicine that cures disease and rockets that have sent people to the moon.  Intuition had us worshiping stone and bronze statues for 1000s of years.

Sure intuition makes more sense, but humans are dumb, so that's not a good standard.

Elroch
Optimissed wrote:

Elroch is a mathematician and mathematics is used to represent aspects of reality in isolation from other aspects. It can be no other way, because all represented aspects have to be stylised or represented mathematically, which, inevitably, unless the concept is already simple, means simplification and the omission of extraneous and almost random influences which, in the context of normal operations, are considered not to influence results sufficiently that they must be included. That's how science works. However, dealing with matters like infinity is a different matter. Infinity is a mathematical and conceptual ideal and yet you are trying to use it to predict real events.

This seems to be from your imagination. What real events?

All you are doing is depicting one side of the equation, as it were ... the rather ridiculous notion that an infinite series of binary, chance events will not exhibit binary behaviour.

???

That would actually mean that they were not binary events because they didn't act as such, but that's a trivial explanation, really.

It's not an explanation, it is a nonsense.

The true explanation is that Elroch is using a mathematical trick in considering each event in isolation.

Certainly not ONLY in isolation. Also in finite sequences, infinite sequences, sets of infinite sequences, countable sets of sets of infinite sequences and anything else that was needed. 

It's similar, as I pointed out, to the idea that the hare will never quite catch the tortoise.

An analogy which makes no sense. Do try to justify it!

It doesn't depict reality.

This is true. There are no infinite sequences of coin tosses in the real world. Period.

In the tortoise/hare case, the maths used is incorrect. It will, in fact, be the same here. It only remains to find the mistake.

Yeah, you can start by refuting that peer-reviewed paper and the body of knowledge on the subject of stochastic processes. Just kidding!

In reality, in an infinite series, you can continue to toss a coin until you get the result you want. You and Elroch, however, are proposing that an infinite series has an end.

No, we never did at any point. But we agree that an infinite sequence of heads is a possible infinite sequence of results. It has equal standing with every single other specified sequence of results.

It doesn't and so you're incorrect.

Strawmen are great opponents. Keep on fighting them!

 

Elroch

It is unfortunate that your "super-intelligence" can't remember who said what or bother to check a few pages to find out.

I have found that it was actually @llama36 who stated that every infinite sequence of flips contains every finite sequence of flips (in post #3857). However, he was being imprecise and merely meant that this was true with probability 1 (a result I verified later).

We should all be able to agree (by exhibiting an explicit example) there are infinite sequences of bits that do not contain any specific subsequence we might want to choose!

Likewise your super-intelligence claimed I had somewhere stated the ridiculous falsehood that all infinite sequences terminate. That is why your arguing against this falsehood was a straw man. It's the definition of "straw man": arguing against something that someone never said.

(On this occasion, I believe no-one at all said it - it seems to be entirely imaginary).  

llama36

Yeah, it's useful to note that flipping all heads is just as likely as any other specific result, e.g. flipping an infinite sequence of alternating H T H T.

Sometimes in traffic I'll flip a coin to see how many in a row I can get... the conceit is that I ignore the fact every time I do 100 flips, I had a 2^-100 chance of getting that result i.e. every result is extraordinarily rare, but as a pattern-seeking human, it's fun to pick a target like all heads and pretend everything else is "common."
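The "every result is equally rare" point can be verified exhaustively for short runs (a toy check of mine, with n = 10 rather than 100):

```python
from itertools import product

n = 10
outcomes = ["".join(seq) for seq in product("HT", repeat=n)]

# Each specific length-n outcome occurs exactly once among the 2^n
# equally likely results, so "all heads" is no rarer than any other
# specific string -- patterns only feel special to pattern-seeking humans.
assert outcomes.count("H" * n) == 1
assert outcomes.count("HTHTHTHTHT") == 1
print(f"each specific outcome has probability {2.0 ** -n}")
```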

llama36

Also I haven't read #3892, I'll probably do that tomorrow

Thee_Ghostess_Lola

i'll code a program to flip a coin. howzat ?...how many times do u want it to ?...100BB, 10T, 1Q ?...and u'll see for urself that it wont ever-EVER reach par. i stop the random samples if it ever does (but it wont - trust me). so 4u elrock ?...well u can stop flipping a penny long & late into the nite...i got s/t better lol ! 

gimme a few dayz...im bizzy doing s/t else right now...like getting ready 4my B-day thingy thats gonna go til wednesday (YEE !!) & were partying W/ Cookie cuz shes into dia de muertos (shes from PR). 

Elroch
Optimissed wrote:
Elroch wrote:

It is unfortunate that your "super-intelligence" can't remember who said what or bother to check a few pages to find out.

I have found that it was actually @llama36 who stated that every infinite sequence of flips contains every finite sequence of flips (in post #3857). However, he was being imprecise and merely meant that this was true with probability 1 (a result I verified later).

You incorporated it into an explanation, as I showed, so you did hold it as true.

No, you did not. I see you are not providing a link to the imaginary post you refer to. The point is as obvious to me as 1 + 1 = 2 and I don't make such mistakes.

Probability 1 means that something will definitely happen. Probability 0 means that it won't happen. If you can demonstrate probability 1 to be something else, I'd be very interested in seeing how you manage it. There's a low probability that you can.

You seem unfamiliar with probability theory on uncountable spaces. For example, suppose you have a real-valued random variable with mean zero and standard deviation 1.  The probability that this random variable will take ANY SPECIFIC VALUE is zero. This does not mean that the random variable can take no values.

Measure theory is the mathematical theory that deals with this. The way it works is that it constructs the correct way to allocate probabilities to certain sets (called Borel-measurable sets) and this is enough for all practical purposes. 

So the notions of "probability zero" and "impossible" are definitely not the same in many important cases. One of these is the Bernoulli Process (which is what mathematicians call what we are discussing).
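Elroch's point about continuous random variables can be illustrated numerically (my own sketch, using the standard normal as the example distribution):

```python
from math import erf, sqrt

def normal_cdf(x):
    """CDF of a standard normal random variable (mean 0, std dev 1)."""
    return 0.5 * (1 + erf(x / sqrt(2)))

# The probability of landing in a shrinking interval around any specific
# value (here 0) tends to 0 -- yet the variable always takes SOME value,
# so "probability zero" is not the same as "impossible".
for eps in [1.0, 0.1, 0.01, 0.001]:
    print(eps, normal_cdf(eps) - normal_cdf(-eps))
```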

We should all be able to agree (by exhibiting an explicit example) there are infinite sequences of bits that do not contain any specific subsequence we might want to choose!

Likewise your super-intelligence claimed I had somewhere stated the ridiculous falsehood that all infinite sequences terminate. That is why your arguing against this falsehood was a straw man. It's the definition.

Yes I noticed that you insinuated that you didn't claim that an infinite sequence terminates.

I didn't insinuate it. I stated it as a solid fact, based on perfect knowledge. You have failed to support your ridiculous claim.

However, by insisting that it need not contain a "heads" and that all the results might be "tails", you were unknowingly claiming that it terminates.

This is nonsense. As an analogy, the recurring decimal for 7/9 = 0.77777...  contains only 7s and does not terminate. Likewise an INFINITE sequence of heads does not terminate. You are a bit confused.

Or that you don't understand what infinity is meant to represent.

Projection. I deal with precise, well-defined concepts.

"infinity" is not a thing (this term is not associated with any defined mathematical or physical concept).

Rather "infinite" is an adjective that applies to many things. It means "not finite". For example an infinite sequence of heads is a sequence of heads that is not finite (i.e. terminating). There are many infinite cardinals (sizes of sets ignoring any additional structure), starting with aleph-null, the cardinal of the counting numbers. The cardinal of the set of infinite sequences of binary variables is a larger infinite cardinal.
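The claim that the binary sequences form a strictly larger infinite cardinal is Cantor's diagonal argument; here is a toy sketch of mine of the construction (the names and the example enumeration are illustrative only):

```python
def diagonal(enumeration, n):
    """Given any claimed enumeration of binary sequences (a function
    mapping index i to the i-th sequence, itself a function of position),
    return the first n values of a sequence that differs from the i-th
    listed sequence at position i -- Cantor's diagonal argument."""
    return [1 - enumeration(i)(i) for i in range(n)]

# Example "enumeration": sequence i has a 1 only at position i.
listed = lambda i: (lambda j: 1 if j == i else 0)

# diag differs from every listed sequence, so the listing was incomplete:
# no enumeration can cover all infinite binary sequences.
diag = diagonal(listed, 10)
print(diag)
```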

(On this occasion, I believe no-one at all said it - it seems to be entirely imaginary). 
No, your claims implied it. Not strongly but as an imperative.

I see a striking lack of any link to a post to support that. 

 

llama36

I've only had a handful of undergraduate math classes. Only one of them was all statistics (random variables, Bernoulli distribution, etc). In other words these are basic building blocks. The idea that for every expert who agrees with Elroch there will be another who agrees with you doesn't make sense. Mathematics makes use of infinity all the time even though there is no physical representation of it. It's a very useful concept.

Of course in reality we can't flip a coin an infinite number of times, but I don't know why an infinite sequence of heads (or any other infinite sequence) is hard to imagine as a concept.

And again, an infinite sequence of H is just as unlikely as any other specific infinite sequence, so there's no reason to pick on that one. This is probably self-evident after realizing future flips don't depend on past flips.

Elroch

I would take care to say mathematics makes use of objects that are infinite all the time. There is no mathematical object called infinity. Being infinite is a property. Actually, it is a number of concepts made clear by the context. For example, the natural numbers are infinite (in cardinality) but the sum of an infinite series can be infinite (here it is a measure, not a cardinality).

It is true that a special point, sometimes called infinity, can be appended to the real line (making it topologically into a circle) or two points called +infinity and -infinity can be added (making it topologically into a closed interval), and another point called infinity turns the complex plane into the Riemann sphere,  but it is best to remember they are not the same thing (although the first and third of them can be identified). 

llama36
Optimissed wrote:

Really this isn't maths. It's beyond mathematics.

Maybe pre-history mathematics, because dealing with infinity has been around for millennia

https://en.wikipedia.org/wiki/Cardinality#History

From the 6th century BCE, the writings of Greek philosophers show the first hints of the cardinality of infinite sets.

 

And of course basic level calculus makes use of infinity all the time, from integration to series to limits.

llama36

Honestly I don't even know what the disagreement is... maybe I'm too sleepy (about to go to bed).

All I remember is you taking issue with the concept of an infinite set of coin flips because if someone were to try to replicate it they wouldn't be able to... well yeah, no one can flip a coin infinitely many times. So we agree... lol

Elroch
llama36 wrote:
Optimissed wrote:

Really this isn't maths. It's beyond mathematics.

Maybe pre-history mathematics, because dealing with infinity has been around for millennia

https://en.wikipedia.org/wiki/Cardinality#History

From the 6th century BCE, the writings of Greek philosophers show the first hints of the cardinality of infinite sets.

 

And of course basic level calculus makes use of infinity all the time, from integration to series to limits.

But does it? What is the "infinity"?

Doesn't it rather make use of things that are infinite, such as sequences?

It deals with both countably infinite things (such as those sequences, from which limits can be derived) and sometimes with infinite values (like the integral of 1/x over the interval [0, 1], which might be reached as the limit of a sequence of approximations that keeps getting bigger without bound).
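A quick numeric check of that divergent example (my own sketch): the partial integrals of 1/x over [eps, 1] equal -ln(eps), which grows without bound as eps shrinks:

```python
from math import log

# The integral of 1/x from eps to 1 is ln(1) - ln(eps) = -ln(eps).
# As eps -> 0 these partial integrals grow without bound, so the
# area under 1/x on (0, 1] is infinite.
partials = [-log(eps) for eps in [1e-1, 1e-3, 1e-6, 1e-12]]
print(partials)
```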

llama36

Sure, your language is more precise.

And yes, infinities pop up in different places, such as an infinite continuum of values on a finite interval when doing integration. It may start as a summation of discrete values, but then you keep adding terms to add precision, and in the limit you get the integral.

And it's not philosophical or some kind of trick (or whatever Optimissed is suggesting). Here's a nice picture showing, intuitively, that the geometric series 1/2 + 1/4 + 1/8 + ... = 1
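The same fact can be checked with partial sums (a minimal sketch of mine):

```python
# Partial sums of the geometric series 1/2 + 1/4 + ... + 1/2^n.
partial = 0.0
sums = []
for n in range(1, 31):
    partial += 2.0 ** -n
    sums.append(partial)

# Each partial sum equals 1 - 2^-n: always below 1, but converging to 1.
# The "infinite sum" is this limit, not an endless task to complete.
print(sums[0], sums[1], sums[-1])
```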

llama36

And so you start with something small like this, and work your way up.

I've already forgotten various infinite summation relationships for geometric and power series and so on... but just wanted to show something simple like this.

And then with that as a base, mathematics gets more complex, to the point of being useful enough to solve real world problems... and because it's successful at solving those problems (like manufacturing medicine or sending rockets to the moon) you can be fairly sure that it's not superstition. It's not something that you can disagree with based purely on intuition, because it's really working.

It's something that, if destroyed, could be rediscovered. It doesn't rely on the fact that scientists or journals agree with it.