Re: How big is an FAI solution? (was Re: [sl4] to-do list for strong, nice AI)

From: Tim Freeman (tim@fungible.com)
Date: Thu Oct 22 2009 - 05:35:39 MDT


From: Matt Mahoney <matmahoney@yahoo.com>
>That's a guess. I think the training data is 10^17 bits.
>
>Also, your design has the same shortcoming as CEV and my cheating
>ideal-market definition. It doesn't define "human".

The purpose of the training data is to define "human", and to define
which voluntary actions those humans are taking and which perceptions
they are experiencing.

Humans seem able to identify people and their voluntary actions
without any deep thought, so an ideal extrapolator shouldn't need an
absurdly large amount of training data to learn to do it.

One might argue that humans have a large amount of state information
in their heads, but seriously, how much of that do you think pertains
to being able to make statements like "Joe went to the far side of the
room, saw the chair, and moved it with his left hand" in reaction to
seeing that happen? Nearly everybody can do that sort of thing, and
they don't feel like they're doing any deep thought at the time.

-- 
Tim Freeman               http://www.fungible.com           tim@fungible.com


This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:05 MDT