From: Arona Ndiaye (andiaye@chello.nl)
Date: Mon Apr 22 2002 - 23:38:21 MDT
Warm greetings from Amsterdam,
----- Original Message -----
From: "Will Pearson" <w.pearson@mail.com>
To: <sl4@sysopmind.com>
Sent: Monday, April 22, 2002 9:09 PM
Subject: Re: Different View of IA
> Perhaps a simple FAQ with definitions of key terms for newcomers would be
> useful and easier to update.
>
Check Gordon Worley's page... I'm under the
impression that it's quite accessible and was set up for that very purpose.
> > Similarly, there are constraints on which futures can be envisioned under
> > certain background assumptions; it isn't all just magic. ("Magic" tends to
> > occupy a certain balance between anthropomorphic characteristics and minor
> > departures; a Singularity envisioned as magic will be too anthropomorphic.)
>
> I don't understand the context of this statement. If you
Assuming a certain set of initial conditions, the future is not like a
lottery. The set of initial conditions places limits on what type of future
is likely to 'happen'.
A Singularity envisioned as magic is a Singularity seen in human terms.
> Am I getting confused between the definition of a recursively
> self-improving system and seed AI; are they the same? Here is a question
> that will dispel my confusion: would a seed AI, given the goal of adding
> two numbers in the most optimal way, change itself into a simple adding
> program? Or does the definition of seed AI require Friendliness or other
> goals? I am also under the impression that the AI could choose to commit
> suicide (say, if it knew it would do more harm than good), which means it
> might not grow to transhuman levels; is this right? Is the need to go to
> transhuman levels embedded in the goal? I was confused by reading Ben
> Goertzel's websites about his own view of AI, which don't specify the
> friendliness of the goal, but talk about seed AI. Whatever; in the future I
> shall say recursively self-improving system, instead of seed AI. If that is
> okay?
ONE of the characteristics of a seed AI is that it is a self-improving
system. A seed AI given the goal of adding two numbers would NOT change
itself into an adding program. From that very question, I believe that you do
need to read a bit more background material... I'll give you the same
answer I gave to Jean Nguyen:
I recommend reading ALL of the archives located at:
http://sysopmind.com/archive-sl4/
PS: Of course, reading GISAI and CFAI, located at:
http://www.intelligence.org/CFAI/index.html
&
http://www.intelligence.org/GISAI/index.html
should definitely address a lot of issues and answer a lot of questions.
PPS: Gordon Worley's page is at:
http://homepage.mac.com/sing_rc/
<rest of message was deleted>
Most kind regards,
Arona Ndiaye.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:38 MDT