From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Thu Jul 14 2005 - 15:00:37 MDT
Russell Wallace wrote:
>
>>See the past SL4 discussion with the subject line "Darwinian dynamics unlikely
>>to apply to superintelligence".
>
> I did. My position is that a superintelligent AI of the operating
> system type _can_ (with a lot of skill and care) be designed so as not
> to undergo Darwinian evolution, and _needs_ to be so designed (to
> avoid the scenario I mentioned). IIRC, this was also your position in
> that discussion; correct me if I'm wrong?
No; I was arguing that a generic superintelligence would replicate its
control structure with sufficient fidelity to avoid heritable variations that
correlate with reproductive speed, that there wouldn't be frequent deaths to
free up resources and hence no iterated generations, and so on.
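(A toy illustration of the point, not from the original exchange: Darwinian
selection needs heritable variation correlated with replication speed plus
deaths that free resources for new generations. The sketch below assumes a
hypothetical population model; names like copy_error_rate and deaths_per_step
are illustrative only. With perfect copy fidelity and no turnover, the trait
distribution never changes.)

import random

def simulate(generations=100, copy_error_rate=0.0, deaths_per_step=0):
    # Each agent is represented by a "replication speed" trait.
    population = [1.0] * 100   # identical control structures to start
    capacity = len(population)

    for _ in range(generations):
        # Deaths free up resource slots; with none, nothing can replicate.
        for _ in range(min(deaths_per_step, len(population) - 1)):
            population.pop(random.randrange(len(population)))

        # Free slots are claimed by the fastest replicator; copies inherit
        # the parent's speed, occasionally with a small copying error.
        while len(population) < capacity:
            parent = max(population)
            child = parent + (random.gauss(0, 0.1)
                              if random.random() < copy_error_rate else 0.0)
            population.append(child)

    return sum(population) / len(population)

print(simulate())                                         # stays 1.0: no evolution
print(simulate(copy_error_rate=0.05, deaths_per_step=5))  # drifts above 1.0 under selection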
--
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence