From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Mon Feb 12 2001 - 16:24:29 MST
John Smart wrote:
>
> Hi All,
>
> As Eli and several of you know, I'm writing a book on the singularity which
> includes quite a few references on the issues of stability, morality, and
> interdependence in emerging computational complexity, independent of
> substrate. I feel that each of these three issues can be independently
> analyzed,
No.
What?
I must be enormously misinterpreting what you meant by that. You are one
of the last people in the world who I would have guessed would take that
position.
> and consider them to be the heart of the Friendly AI question.
Yes.
-- -- -- -- --
Eliezer S. Yudkowsky http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence