RE: Goalless SI

From: H C (lphege@hotmail.com)
Date: Wed May 25 2005 - 19:46:36 MDT


>On Tue, 2005-05-24 at 16:22 -0700, Eliezer S. Yudkowsky wrote:
>ornamentation and tinsel. I don't think humans could build an AI that had
>no goal system at all until it was already a superintelligence.
>

>Have you produced any papers or speeches that further clarify or
>validate this thought? Do you know of anyone else who has come to the
>same or similar conclusion independently?

>Richard Kowalski
>Be the Singularity

It seems to me that intelligence definitively requires some subjective,
motivational quality; otherwise, upon what basis would it act?
This archive was generated by hypermail 2.1.5 : Tue Feb 21 2006 - 04:22:56 MST