From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Thu Jul 14 2005 - 16:29:37 MDT
Russell Wallace wrote:
> On 7/14/05, Eliezer S. Yudkowsky <sentience@pobox.com> wrote:
>
>>No; I was advocating that a generic superintelligence would replicate its
>>control structure with sufficient fidelity to avoid heritable variations that
>>correlate with reproductive speed, and that there wouldn't be frequent death
>>to free up resources, hence no iterated generations, etc.
>
> Ah! Okay, that conclusion strikes me as doubtful.
Okay. Let me know if you have anything new to contribute to the conversation
that started here:
http://www.sl4.org/archive/0401/#7483
esp.
http://sl4.org/archive/0401/7515.html
http://sl4.org/archive/0401/7483.html
http://sl4.org/archive/0401/7504.html
http://sl4.org/archive/0401/7513.html
http://sl4.org/archive/0401/7603.html
--
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence