From: Vladimir Nesov (firstname.lastname@example.org)
Date: Wed Nov 26 2008 - 12:12:15 MST
On Wed, Nov 26, 2008 at 8:48 PM, Matt Mahoney <email@example.com> wrote:
> --- On Tue, 11/25/08, Nick Tarleton <firstname.lastname@example.org> wrote:
>> Why do you identify intelligence with algorithmic complexity, again?
> Eliezer pointed this out to me too. With unlimited computing power, the initial
> algorithmic complexity is irrelevant. You could prove anything that could be proven
> by running a simple program that enumerated all proofs. You could solve any
> problem that future humans could solve by simulating the universe with a
> 407-bit program for 2^407 steps, or even better, simulate all possible laws
> of physics with an even simpler program running for 2^814 steps.
> Of course we don't have unlimited computing power, so prior knowledge
> of the environment does help, in the sense of greater expected
> utility. Perhaps Legg's proof of the absence of an elegant theory of
> learning is not the best argument, however. I should just say that
> greater intelligence by most definitions is correlated with faster
> learning, which means more knowledge accumulated over a fixed time
> period.
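[The enumeration argument quoted above can be sketched concretely. The toy string-rewriting system below is an illustrative assumption, not anything from the thread: a "proof" is a sequence of rule indices, a verifier replays it from the axiom, and brute-force enumeration of all proofs in length order eventually finds a proof of anything provable.]

```python
from itertools import count, product

# A toy string-rewriting "formal system" (an assumption for illustration):
# the axiom is "I", and each rule rewrites the leftmost match of its
# left-hand side. A proof of a theorem is a sequence of rule indices.
AXIOM = "I"
RULES = [("I", "IU"), ("IU", "U")]  # rule 0 and rule 1

def check_proof(proof, theorem):
    """Replay a sequence of rule indices from the axiom; a proof is
    valid iff every rule applies and the final string is the theorem."""
    s = AXIOM
    for i in proof:
        lhs, rhs = RULES[i]
        if lhs not in s:
            return False
        s = s.replace(lhs, rhs, 1)
    return s == theorem

def prove(theorem, max_len=10):
    """Enumerate every candidate proof in length order. Given unlimited
    time, this visits all proofs, so any provable theorem is found;
    max_len only keeps the sketch from running forever on non-theorems."""
    for length in count(0):
        if length > max_len:
            return None
        for proof in product(range(len(RULES)), repeat=length):
            if check_proof(proof, theorem):
                return proof
```

[The prover needs no knowledge beyond the verifier itself, which is the point of the quoted argument: with unbounded computation, a constant-size enumerator matches anything a cleverer prover could derive.]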
Heh. But you are only considering the complexity of a counter, t; your
O(log t) comes from the fact that many counter values are not
compressible. It has nothing to do with prior information about the
environment; in fact, there is no communication with the environment
in your model at all. You also don't reflect the work of optimization
in intelligence, since you could run t^t^t^t steps instead of t steps
for a given value of t and get the same result. There is no utility in
complexity if that complexity is noise.
-- Vladimir Nesov email@example.com http://causalityrelay.wordpress.com/
This archive was generated by hypermail 2.1.5 : Tue Jun 18 2013 - 04:01:01 MDT