From: Matt Mahoney (matmahoney@yahoo.com)
Date: Sun Feb 22 2009 - 13:42:46 MST
--- On Sun, 2/22/09, Johnicholas Hines <johnicholas.hines@gmail.com> wrote:
> Holographic AGI means you can't examine the structure of the AGI and
> predict how it will behave. This is risky.
Unfortunately, unpredictability is a necessary property of any system whose algorithmic complexity exceeds your own (beyond a small language-dependent constant, for those who want to nitpick about the math). You can't simulate, and therefore can't predict, what a system will do without knowing everything it knows.
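To make that precise (my formalization in standard Kolmogorov-complexity notation, not something from Johnicholas's post): if a system Q can exactly simulate a system P, then a shortest program for Q plus a fixed-size interpreter yields a program reproducing P's behavior, so

\[ K(P) \le K(Q) + c \]

where K is Kolmogorov complexity and c depends only on the description language. Taking the contrapositive, K(P) > K(Q) + c means Q cannot simulate P, and so cannot predict it in general.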
And yes, I realize that algorithmic complexity has nothing to do with intelligence, at least in the sense of simple yet universally intelligent systems like AIXI. But until we come up with universally intelligent algorithms that run faster than exponential time, in practice it does matter.
Practical intelligence (as measured by rate of utility gain on problems of practical importance) = speed + memory + I/O + knowledge. Kasparov had more chess knowledge than Deep Blue, but Deep Blue was faster. AIXI^tl will compress a file smaller than zip, but zip will finish first. If P knows everything Q knows plus more, but the two are otherwise equal, then in a practical sense we say that P is smarter than Q.
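A toy illustration of "rate of utility gain," using Python's standard zlib and lzma modules (my example, not anything from the thread; the sample data and the bytes-saved-per-second metric are arbitrary choices): the stronger model tends to win on output size, while the faster one can win per unit time.

# Toy sketch: compare a fast, weaker compressor (zlib) with a
# slower, stronger one (lzma). "Utility" here is bytes saved;
# practical intelligence is measured as bytes saved per second.
import time
import zlib
import lzma

# Arbitrary compressible sample data (~1 MB).
data = bytes(range(256)) * 4096

for name, compress in [("zlib", lambda d: zlib.compress(d, 9)),
                       ("lzma", lambda d: lzma.compress(d))]:
    start = time.perf_counter()
    out = compress(data)
    elapsed = time.perf_counter() - start
    saved = len(data) - len(out)
    print(f"{name}: {len(out)} bytes out, {elapsed:.3f}s, "
          f"{saved / elapsed:,.0f} bytes saved/sec")

On a deadline, whichever compressor delivers more bytes saved before time runs out is the "smarter" one in this practical sense, regardless of which has the better model of the data.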
-- Matt Mahoney, matmahoney@yahoo.com