Re: Self-modifying FAI (was: How hard a Singularity?)

From: Eliezer S. Yudkowsky
Date: Wed Jun 26 2002 - 11:51:52 MDT

Eliezer S. Yudkowsky wrote:
> Growing up and becoming independent of the programmers is
> something that depends on understanding *what* the roots are, but this
> is an advanced stage of Friendship growth, which may assume that the AI
> understands the programmers as black boxes.

Argh. This should read as "understands the programmers as more than black
boxes."
Eliezer S. Yudkowsky                
Research Fellow, Singularity Institute for Artificial Intelligence

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:39 MDT