Re: Fighting UFAI

From: Eliezer S. Yudkowsky
Date: Thu Jul 14 2005 - 16:29:37 MDT

Russell Wallace wrote:
> On 7/14/05, Eliezer S. Yudkowsky wrote:
>>No; I was advocating that a generic superintelligence would replicate its
>>control structure with sufficient fidelity to avoid heritable variations
>>that correlate with reproductive speed, and that there wouldn't be frequent
>>death to free up resources, hence no iterated generations, etc.
> Ah! Okay, that conclusion strikes me as doubtful.

Okay. Let me know if you have anything new to contribute to the conversation
that started here:


Eliezer S. Yudkowsky
Research Fellow, Singularity Institute for Artificial Intelligence

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:51 MDT