**From:** Wei Dai (*weidai@weidai.com*)

**Date:** Mon Nov 05 2007 - 17:06:29 MST

**Next message:** Kevin Peterson: "Re: how to do something with really small probability?"
**Previous message:** Rick Smith: "Re: Re: Anti-transhumanist crap on Kuro5hin"
**In reply to:** Rolf Nelson: "Re: how to do something with really small probability?"
**Next in thread:** Rolf Nelson: "Re: how to do something with really small probability?"
**Reply:** Rolf Nelson: "Re: how to do something with really small probability?"

Rolf Nelson wrote:

> How do you know *anything*? You have a Bayesian "prior distribution",
> which may include anthropic reasoning.

I'm assuming Bayesian reasoning as well. I don't think anthropic reasoning is relevant to this issue.

> Obviously if a bounded-rationality agent is aware that it's a
> uniformly random program, then once it has seen that x is its output,
> it should (if it is sophisticated, and it has nothing better to do
> with its time) give x a probability on the order of the amount of
> Chaitin's Omega that it doesn't know. So what? You're begging the
> question of why it had this prior in the first place. The prior
> certainly isn't true of the programs running on my PC; none of my
> programs are drawn from uniformly random distributions (not even
> Microsoft Word).

I was using the standard prior for Solomonoff Induction. I see Nick Hay has written an exposition of the concept at http://www.intelligence.org/blog/2007/06/25/solomonoff-induction/. Quoting from it:

> Solomonoff induction predicts sequences by assuming they are produced by a random program. The program is generated by selecting each character randomly until we reach the end of the program.
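To make that prior concrete, here is a minimal toy sketch: each program's weight is 2^(-length), as if its bits were drawn by fair coin flips, and the prior probability of a string x sums the weights of all programs that output x. The program-to-output table below is an invented, hypothetical prefix-free machine, purely for illustration.

```python
from fractions import Fraction

def program_weight(program_bits: str) -> Fraction:
    # Under the Solomonoff prior, a program generated by random coin
    # flips has weight 2^(-length of the program in bits).
    return Fraction(1, 2 ** len(program_bits))

# Hypothetical prefix-free machine: maps each valid program to its output.
toy_machine = {"0": "a", "10": "b", "110": "a"}

def p_of(x: str) -> Fraction:
    # P(x) = sum of 2^(-|p|) over all programs p that output x.
    return sum((program_weight(p) for p, out in toy_machine.items() if out == x),
               Fraction(0))

print(p_of("a"))  # 1/2 + 1/8 = 5/8
print(p_of("b"))  # 1/4
```

In the real definition the sum ranges over all programs of a universal prefix machine, which is exactly what makes P uncomputable; the toy table just shows the arithmetic shape of the prior.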

> > Now is it possible that SI can take an arbitrary string x and tell us
> > whether P(x) < 1/2^(2^100)?
>
> Underspecified. If by "probability" you only mean "something that
> obeys the Probability Axioms, and is also sometimes useful", then
> sure. If an agent has bounded rationality, it can consistently say
> "there is a 1/2 probability that any number between 1 and 10 is an
> even number. There is a 1/2 probability that 5 is an even number.
> There is a 1/2 probability that 6 is an even number."

This P is supposed to be the same function as before (i.e., the standard prior for Solomonoff Induction).

Does that clear up the point I was trying to make?

