A couple of observations/questions.
1) You didn't comment on the Bayesian viewpoint that probability reflects a subjective idea about the state of the world. One might argue, for example, that probability isn't measurable, and that therefore, strictly speaking, a statement about the objective probability of an event isn't meaningful; experimental evaluation would have to be done on an entire model instead.
Do you have any objections to that point of view?
2) I don't find the case about Kolmogorov complexity actually convincing, at least not by the standards the rest of the article sets. "3141592..." could pass as either "random digits" or "first digits of pi". And if the point is that it's highly unlikely a true RNG would have generated exactly those digits, then we're back to a frequentist argument.
It's likely I'm missing something; could you elaborate more or give me a pointer?
Isn't that what the Occam's razor argument was for? Sure, both an RNG and the "40 digits of pi" program can produce that output, but the "40 digits of pi" program is shorter and therefore has lower Kolmogorov complexity.
For all I know this is a false dichotomy. True RNGs don't exist, for one, under a deterministic philosophy.
RNGs are by definition non-deterministic, but algorithms are deterministic, at least by the definition I learned. We use floating gates picking up cosmic background radiation, or the decay of radioactive elements, to get close to truly random. XKCD's butterfly joke applies.
Computable numbers have a computable complexity. For a non-deterministic program, the fair comparison to a true RNG would be a program that outputs all digits of pi out to infinity, I suppose. Conversely, I'm not sure a pseudo-random number generator couldn't fit within the equivalent Kolmogorov complexity of a "40 digits of pi" program.
Are you saying it couldn't be? On the one hand, pseudo RNGs simply rely on intractable complexity, which must regularly exceed the 40 digits of pi, the equivalent of a 40-byte key, as opposed to your 4 kB RSA keychains and whatnot.
On the other hand, I think your argument is akin to the gambler's fallacy: I have absolutely no clue, but I will hope it will not have been a primitive RNG, right?
For reference, here's a simple pseudo RNG, taken from the TinyPTC example code, that produces a TV-noise graphic in a loop.
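(The sample code itself didn't carry over, so the following is a sketch of the shift-register generator from memory of the TinyPTC test program; the exact shifts and constants in the distribution may differ. It's stripped down to be self-contained: the real sample writes each byte into a framebuffer every frame instead of printing it.)

    #include <stdio.h>

    /* LFSR-style noise generator along the lines of the TinyPTC
       "TV noise" sample. Each call mixes the seed, feeds the carry
       bit back into the register, and returns one grey-level byte. */
    static unsigned int seed = 0x12345;

    static unsigned int noise_byte(void)
    {
        unsigned int noise = seed;
        unsigned int carry;
        noise >>= 3;
        noise ^= seed;
        carry = noise & 1;
        noise >>= 1;
        seed >>= 1;
        seed |= (carry << 30);  /* feedback into bit 30 of the register */
        return noise & 0xFF;    /* the sample uses this as a grey pixel */
    }

    int main(void)
    {
        int i;
        for (i = 0; i < 16; i++)
            printf("%02x ", noise_byte());
        printf("\n");
        return 0;
    }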
Yeah, but is it? They are both programs with logarithmic Kolmogorov complexity, and Kolmogorov complexity is only defined up to an additive constant unless you commit to a computational model.
If you do commit to one, which program is shorter is just a function of the specifics of the model you chose, which isn't really interesting; it's literally code golf at that point.
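(For reference, this is the invariance theorem: for any two universal machines U and V there is a constant c_{U,V}, depending only on the pair and not on the string, such that

    |K_U(x) - K_V(x)| \le c_{U,V}   for all x.

Committing to a model fixes the constant, but on a 40-digit example that constant can swamp the comparison.)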
In order to discriminate between the output of an RNG and the digits of pi, you are right that they have the same Kolmogorov complexity to within additive constants.
However, you can ask for resource-bounded Kolmogorov complexity. Suppose you ask for the shortest description of a PTIME machine that outputs the N bits of an RNG, versus one that outputs the first N bits of pi. Complexity theorists believe that the first will be longer. Proving that conclusively might well have a bearing on whether P = BPP.
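(One standard way to formalize this, as a sketch; notation varies by author:

    K^t(x) = \min \{ |p| : U(p) \text{ outputs } x \text{ within } t(|x|) \text{ steps} \}

where U is a fixed universal machine and t is a time bound such as a polynomial. The conjecture above is then that K^poly of the first N digits of pi is O(log N), while K^poly of a truly random N-bit string stays close to N.)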
When you go all the way down to finite-state machines, KC becomes something equivalent to finite-state information-lossless compression, like Lempel-Ziv compressibility.
Long story short: by restricting the resources available for the computation, it is possible to discriminate among some examples such as the ones you point out.
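To make the finite-state/Lempel-Ziv end of that spectrum concrete, here's a rough demo using zlib's DEFLATE (which is LZ77-based) as a stand-in for finite-state compressibility; it's an illustration, not anyone's canonical construction. A periodic string compresses to almost nothing, while shift-register noise (and, for that matter, the digits of pi) barely compresses at all, which is exactly why finite-state KC can't tell pi from random:

    /* build with: cc demo.c -lz */
    #include <stdio.h>
    #include <zlib.h>

    /* DEFLATE an n-byte buffer and report the compressed size. */
    static unsigned long deflated_size(const unsigned char *src, unsigned long n)
    {
        static unsigned char dest[1 << 16];
        uLongf destLen = sizeof(dest);
        return (compress(dest, &destLen, src, n) == Z_OK) ? destLen : 0;
    }

    int main(void)
    {
        enum { N = 4096 };
        static unsigned char periodic[N], noise[N];
        unsigned int seed = 0x12345, x, carry;
        int i;
        for (i = 0; i < N; i++) {
            periodic[i] = "abcd"[i % 4];    /* highly compressible */
            x = seed; x >>= 3; x ^= seed;   /* LFSR noise: it is not */
            carry = x & 1; x >>= 1;
            seed >>= 1; seed |= (carry << 30);
            noise[i] = x & 0xFF;
        }
        printf("periodic: %d -> %lu bytes\n", N, deflated_size(periodic, N));
        printf("noise:    %d -> %lu bytes\n", N, deflated_size(noise, N));
        return 0;
    }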
> They are both programs with logarithmic Kolmogorov complexity
In the case of true random numbers, how is that so?
Very few random sequences can be generated by a logarithmic-sized program, since most strings are not significantly compressible [1]. A simple counting argument shows this: there are 2^n strings of n bits, but at most 2^(lg n + 1) - 1 < 2n programs of length up to lg n, a much smaller number!
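(The same count gives the standard incompressibility bound: for any c, there are fewer than 2^{n-c} programs shorter than n - c bits, and each outputs at most one string, so

    Pr[ K(x) < n - c ] < 2^{-c}

for a uniformly random n-bit x. Almost all strings sit within a few bits of maximal complexity.)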
1) I rather like the subjective view! Bayesians used to struggle to justify a choice of prior, but it turns out that 2^-K(x) is universal in the sense that it never falls below a constant factor of any given (semi-)computable (semi-)measure.
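(As a sketch of the standard statement, found e.g. in Li and Vitanyi's textbook: for every lower-semicomputable semimeasure \mu there is a constant c_\mu > 0 such that

    2^{-K(x)} \ge c_\mu \cdot \mu(x)   for all x,

which is the dominance property referred to above, via Levin's coding theorem.)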
2) Sorry, I should have clarified that the programs are deterministic. So if you want to use an RNG, you also have to supply the string of random bits that causes the RNG to output forty digits of pi.