From: Samantha Atkins (samantha@objectent.com)
Date: Wed Apr 30 2003 - 00:59:03 MDT
Eliezer S. Yudkowsky wrote:
> Mike Williams wrote:
>
>>> If, depending on your choice, one person died or a million people
>>> died, you'd choose so that only one person died, right? You wouldn't
>>> say: "Well, the death event exists either way."
>>
>>
>> This brings up a question that's been nagging at me for a while. Would
>> an FAI make this kind of decision? Assume that the FAI is mature and in
>> control of earth's resources.
>> 1) If it can act to save a million people by sacrificing one person,
>> would it do that?
>> 2) If so, then if it could save a million people by sacrificing 999,999
>> people, would it do that?
>
>
> It's very hard to see a situation where a mature FAI would be faced with
> that decision. And my own impulse is to reply: "Of course it would."
> The human injunction that 'the ends do not justify the means' guards
> against our fallibility and our warped political emotions. Change that,
> and what's left is only the lives.
>
Only the lives? Are you seriously implying that the FAI would be
infallible?
> But perhaps an FAI would say differently. I can also see an
> irreplaceable value in an FAI not killing anyone, ever, throughout the
> whole of human history. I'm just not sure that value is greater than
> the value of a human life.
>
Taking a human life to save a human life is supposed to be a net
gain? It is, at times, for fairly limited beings without a lot
of options. But if an FAI can upload people and/or back up and
restore them, I see no reason at all why it would need to
involuntarily and permanently terminate and erase anyone. It might
need to take some of them offline for a time, or move them to a
more appropriate (copacetic, advantageous) environment.
- samantha