From: Eliezer Yudkowsky (email@example.com)
Date: Fri Jun 04 2004 - 03:21:21 MDT
Marc Geddes wrote:
> My point was that there is a fuzziness in the initial
> definition of 'sentience' which could balloon into a
> fatal infinite regress if the FAI itself becomes
I don't need to use an initial definition of "sentience". I can tell the
FAI to locate a particular cluster in thingspace that corresponds to
humans, using plenty of properties to delineate the cluster and checking
the FAI's definition as it adds more properties. Not Aristotelian
definitions, please note, but many fuzzy definitions that add up to locate
a unique identifiable point in phase space. For example, one says to the
AI that humans wear clothes and use language and stand upright on two legs
and have no feathers, and then the AI asks if a nudist is human and you say
yes, and it asks if a plucked chicken is a nudist and you say no, and then
the AI asks whether humans form a species-group of organisms genetically
capable of interbreeding with each other and that's a good additional fuzzy
quality to add to the definition, and you say yes - and so on.
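The idea that many individually leaky properties can jointly pick out one cluster can be sketched in a few lines of code. This is purely my own toy illustration, not anything from the original post: the feature names and scoring rule are invented for the example, and a real system would use far richer properties and a learned similarity measure rather than an unweighted average.

```python
# Toy sketch (illustrative only): several fuzzy properties, none of which is
# individually necessary or sufficient, jointly separate "human" from
# counterexamples like the plucked chicken (bipedal and featherless, but
# little else).

FEATURES = ["wears_clothes", "uses_language", "bipedal",
            "featherless", "interbreeds_with_humans"]

def cluster_score(entity):
    """Fraction of the fuzzy properties an entity satisfies, in [0, 1]."""
    return sum(entity[f] for f in FEATURES) / len(FEATURES)

human = {"wears_clothes": 1.0, "uses_language": 1.0, "bipedal": 1.0,
         "featherless": 1.0, "interbreeds_with_humans": 1.0}
nudist = {"wears_clothes": 0.0, "uses_language": 1.0, "bipedal": 1.0,
          "featherless": 1.0, "interbreeds_with_humans": 1.0}
plucked_chicken = {"wears_clothes": 0.0, "uses_language": 0.0, "bipedal": 1.0,
                   "featherless": 1.0, "interbreeds_with_humans": 0.0}

# The nudist fails one property yet still scores far above the chicken:
# each added property narrows the cluster without any single property
# acting as an Aristotelian necessary condition.
for name, entity in [("human", human), ("nudist", nudist),
                     ("plucked chicken", plucked_chicken)]:
    print(name, cluster_score(entity))
```

The point of the toy is only that adding more fuzzy properties shrinks the set of things that score highly, which is the mechanism the paragraph above describes informally.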
I'm hurriedly skipping over the entire Theory of Empirical Categories and
the Theory of How To Uniquely Locate Empirical Categories For An FAI, a
problem which verges on raising many (though not nearly all) of the issues
inherent in FAI itself. But there's a strong empirical cluster in
thingspace, called "humans", and I'm going to target the FAI on this
cluster to start. Note, incidentally, that the *initial dynamic* works so
long as you can get 99% of the humans - though it should be easy enough to
get all six billion. The cluster is clearly delineated.
> Why should the cognitive processes of the FAI itself
> be exempt from the abstract conception of 'morality'?
> It might be plausible to exclude the FAI if the FAI
> itself is not sentient, but if the FAI is sentient,
> then you have a problem.
With respect to the initial dynamic: Because the FAI doesn't have human
DNA, doesn't have a brain with human anatomy, isn't part of the empirical
"human" cluster in thingspace.
--
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:47 MDT