From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Mon Feb 02 2004 - 11:54:40 MST
Keith Henson wrote:
>>
>> However, when thinking about the ethics of posthuman digital minds,
>> evolutionary ethics probably aren't so relevant, because these minds
>> won't be evolving by Darwinian natural selection.
>
> I am not so sure. The scale may be off in space and/or time, but it
> seems to me that posthuman digital minds are going to be faced with
> much the same kinds of problems of limited resources human groups were
> faced with. Being able to directly change their nature might not make
> a lot of difference in the long term about the kinds of ethics they
> adopt.
See http://sl4.org/archive/0401/7483.html
("Darwinian dynamics unlikely to apply to superintelligence")
A more compact restatement of that post was as follows:
Eliezer Yudkowsky wrote:
> Perry E. Metzger wrote:
>>
>> That's like trying to get rid of gravitation. So long as there are
>> limited resources and multiple competing actors capable of passing on
>> characteristics, you have selection pressure...
>
> No, so long as you have limited resources
> AND frequent death to free up resources
> AND multiple competing phenotypes with heritable characteristics
> AND substantial variation in those characteristics
> AND substantial variation in reproductive fitness
> AND correlation between heritable characteristics and fitness
> AND this is iterated for many generations
> THEN you have a noticeable amount of selection pressure
I would also add that the heritable characteristics must exhibit long-term
fidelity over the number of generations being considered.
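The conjunction of conditions above can be illustrated with a toy simulation. This is only a sketch with hypothetical parameters (tournament selection standing in for "correlation between heritable characteristics and fitness", Gaussian mutation standing in for imperfect heritability); knocking out death or heritability removes the selection pressure, as the list predicts:

```python
import random

def simulate(generations=200, pop_size=100, death=True, heritable=True, seed=0):
    """Toy model of the listed conditions: limited resources (fixed
    pop_size), frequent death to free up slots, heritable variation in
    a trait, and reproductive fitness correlated with that trait.
    Returns the population's mean trait value after the run."""
    rng = random.Random(seed)
    # Initial population: one heritable trait per individual.
    pop = [rng.gauss(0.0, 1.0) for _ in range(pop_size)]
    for _ in range(generations):
        if not death:
            # No turnover: resources never free up, so nothing is selected.
            continue
        # Fitness correlates with the trait: in each contest of two
        # random individuals, the higher trait value reproduces.
        parents = [max(rng.choice(pop), rng.choice(pop))
                   for _ in range(pop_size)]
        if heritable:
            # Offspring resemble parents, with small mutation.
            pop = [p + rng.gauss(0.0, 0.1) for p in parents]
        else:
            # No fidelity: offspring traits are drawn fresh, so the
            # parents' advantage is not passed on.
            pop = [rng.gauss(0.0, 1.0) for _ in range(pop_size)]
    return sum(pop) / pop_size
```

With all conditions present the mean trait climbs steadily over generations; drop death or heritability and it stays near its starting value. The point is simply that every conjunct in the list is load-bearing, not that digital minds would resemble this model.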
--
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:45 MDT