From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Tue Apr 29 2003 - 21:13:18 MDT
Mike Williams wrote:
>>If, depending on your choice, one person
>>died or a million people died, you'd choose so that only one person died,
>>right? You wouldn't say: "Well, the death event exists either way."
>
> This brings up a question that's been nagging at me for a while. Would an
> FAI make this kind of decision? Assume that the FAI is mature and in
> control of earth's resources.
> 1) If it can act to save a million people by sacrificing one person, would
> it do that?
> 2) If so, then if it could save a million people by sacrificing 999,999
> people, would it do that?

It's very hard to see a situation where a mature FAI would be faced with
that decision. And my own impulse is to reply: "Of course it would."
The human injunction that 'the ends do not justify the means' guards
against our fallibility and our warped political emotions. Change that,
and what's left is only the lives.

But perhaps an FAI would say differently. I can also see an irreplaceable
value in an FAI not killing anyone, ever, throughout the whole of human
history. I'm just not sure that value is greater than the value of a
human life.
--
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence