Re: Seed AI (was: How hard a Singularity?)

From: Eugen Leitl (eugen@leitl.org)
Date: Mon Jun 24 2002 - 23:49:55 MDT


On 24 Jun 2002, James Rogers wrote:

> I don't usually consider something that is an example of "just right"
> engineering to be "complex", except perhaps in a pedestrian sense. For
> me, "elegant" and "complex" are at the opposite ends of the descriptive
> spectrum when talking about software.

The world is complex, at least if one wants to survive in it. The presence of
life tends to make it even more complex, especially if that life is
intelligent. A useful AI needs to be performance-competitive with instances
of life. Directed self-improvement requires at least human-level
intelligence. (Stochastically driven improvement has much lower requirements.)
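The parenthetical can be made concrete: stochastically driven improvement needs no model of *why* a change helps, only the ability to score a candidate and keep changes that don't hurt. A minimal sketch of such blind mutate-and-keep search (all names and the toy fitness function are illustrative, not from the original post):

```python
import random

def stochastic_improve(candidate, fitness, steps=2000, seed=0):
    """Blind hill climbing: mutate at random, keep a change only if it
    scores at least as well. No understanding of the problem is needed,
    which is why the intelligence requirements are so low."""
    rng = random.Random(seed)
    best = list(candidate)
    best_score = fitness(best)
    for _ in range(steps):
        trial = list(best)
        i = rng.randrange(len(trial))
        trial[i] ^= 1                     # flip one random bit
        score = fitness(trial)
        if score >= best_score:           # keep neutral-or-better changes
            best, best_score = trial, score
    return best, best_score

# Toy fitness: count of 1-bits. The climber knows nothing about this goal,
# yet converges on the all-ones string by trial and error alone.
ones = lambda bits: sum(bits)
result, score = stochastic_improve([0] * 32, ones)
```

Directed self-improvement, by contrast, would have to reason about which change to make and why, and that is where the human-level intelligence floor comes in.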

Somewhere you need to encode that complexity. The distinction between
hardware and software is rather meaningless (we can safely assume a quite
simple transformation on a pattern will do), but the pattern must contain
a lot of assumptions about how the world is structured in order to be able
to start extracting knowledge from it.

Biology's starting point, the fertilized egg, is deceptive: the human baby
is the result of a trapdoor function applied to the zygote. The inverse
function is never computed, since evolution is Darwinian, not Lamarckian.
Because we can't emulate life's morphogenesis, we can't use the genome as a
compact encoding. If we look at a baby, we see a lot of dirty but
irreducible complexity. A naturally intelligent AI seed must contain a lot
of that complexity in order to be able to abstract knowledge from the
world.

I'm a bit at a loss as to why most people here think you can just sit down,
think briefly, introspect a little, and write down that very large vector.
That amounts to taking a core dump of the world. Without a lot of trial and
error, what is the source of that knowledge?



This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:39 MDT