Re: AI and survival instinct.

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Mon Apr 01 2002 - 20:07:26 MST


Ben Goertzel wrote:
>
> Once you've read his essay, then you can join the ongoing debate Eliezer and
> I have been having on AI goal structures.... In brief: I think that
> Friendliness has to be one among many goals; he thinks it should be wired in
> as the ultimate supergoal in a would-be real AI.
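
As an aside for readers following the thread, here is a minimal sketch of the two goal-structure layouts under discussion. All of the goal names, weights, and scoring functions below are hypothetical illustrations for contrast only, not either author's actual design.

# A minimal sketch, in Python, of the two goal-structure layouts debated above.
# Names, weights, and evaluation functions are hypothetical illustrations.

from typing import Callable, Dict

State = Dict[str, float]        # toy world state, e.g. {"humans_flourishing": 0.9}
Goal = Callable[[State], float] # a goal scores how well a state satisfies it

# Position A (Friendliness as one goal among many): combine several goals,
# e.g. by a weighted sum, so Friendliness can be traded off against the rest.
def utility_many_goals(state: State, goals: Dict[str, Goal],
                       weights: Dict[str, float]) -> float:
    return sum(weights[name] * goal(state) for name, goal in goals.items())

# Position B (Friendliness as the ultimate supergoal): only Friendliness is
# valued for its own sake; anything else matters only as a subgoal serving it.
def utility_supergoal(state: State, friendliness: Goal) -> float:
    return friendliness(state)

if __name__ == "__main__":
    state = {"humans_flourishing": 0.9, "knowledge": 0.5}
    goals = {"friendliness": lambda s: s["humans_flourishing"],
             "learning": lambda s: s["knowledge"]}
    print(utility_many_goals(state, goals,
                             {"friendliness": 0.7, "learning": 0.3}))  # 0.78
    print(utility_supergoal(state, goals["friendliness"]))             # 0.9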

Sigh. Ben, I would never "wire" anyone or anything unless I were willing to
wire myself the same way.

-- -- -- -- --
Eliezer S. Yudkowsky http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence


