From: Peter de Blanc (firstname.lastname@example.org)
Date: Tue Jan 06 2009 - 19:39:46 MST
Benja Fallenstein wrote:
> Question: Can a Turing
> machine X, based on any criteria at all, form a computable probability
> distribution that assigns probability >1/n to a specific hypothesis Y
> such that K(Y) > K(X) + K(n)? The answer isn't clear to me, although
> my intuition is that it is "no."
That's an interesting question! The answer is yes. For example, take n =
3^^^^3 and let Y be any fairly (but not ludicrously) complex hypothesis.
K(n) is tiny for such an n (a short program computes it), while log(n) is
astronomically large, so a distribution can afford probability >1/n on
hypotheses far more complex than K(X) + K(n).
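To make the gap concrete, here is a small sketch (my illustration, not from the post) using Knuth up-arrow notation. Even for 3^^3, which is tiny next to 3^^^^3, the description is a few symbols while log2 of the value is already tens of bits; for 3^^^^3 the description stays short while log2(n) becomes astronomical.

```python
import math

def up_arrow(a, k, b):
    """Knuth up-arrow: a with k arrows applied to b (k >= 1)."""
    if k == 1:
        return a ** b
    result = a
    for _ in range(b - 1):
        result = up_arrow(a, k - 1, result)
    return result

# 3^^3 = 3**(3**3) = 3**27; 3^^^^3 itself is far too large to compute.
n = up_arrow(3, 2, 3)
print(n)             # the number described by the few symbols "3^^3"
print(math.log2(n))  # ~42.8 bits of probability resolution already
```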
To get an answer of "no," I think you should ask:
Can a Turing machine X form a computable probability distribution that
assigns probability >p to a specific hypothesis Y such that K(Y) > K(X)
- log(p) + C? Then if you choose the right constant C, the answer is "no."
This is because, using entropy coding, we can specify Y in about -log(p)
bits given X: run X to produce the distribution, then decode Y from its
codeword. We need K(X) bits to specify X, and C bits to specify an
entropy decoder (and to absorb rounding). So K(Y) <= K(X) - log(p) + C.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:03 MDT