Re: Collapsarity

From: Samantha Atkins (samantha@objectent.com)
Date: Mon Mar 31 2003 - 03:05:01 MST


Eliezer S. Yudkowsky wrote:

> There is a very simple test that can detect a Friendly SI in your
> vicinity. Hold out your hand and say: "I want a banana." If you don't
> get a banana, there are no Friendly SIs capable of helping you. Note that
> you should be careful not to conduct this test in the presence of a human
> altruist if there is a banana nearby, as the "banana test" will then
> produce a false positive.
>
> There is a vast amount of rationalized literature attempting to defend
> various religions from the obvious and perfectly straightforward
> conclusion that any moral God would intervene to stop the tragedies of our
> world. These speculations percolate into the consciousness of many
> vaguely spiritual people and predispose them to rationalize elaborate
> reasons for a Friendly SI allowing medieval torture chambers, or for that
> matter modern torture chambers. I think such explanations are simply
> bull; they are blatantly rationalized, blatantly inconsistent arguments
> concocted to explain an assumption which is simply false. No, it is *not*
> moral for a Friendly SI, or any kind of moral God, to let people suffer
> and die. There simply aren't any Friendly SIs here, and we have
> absolutely no legitimate reason to dream up these elaborate explanations
> for why suffering is good.

Wait just a second. You do support the right of sentients to
self-determination, including the right to tell the Friendly AI
to stay out of their affairs unless they ask for its help, I
believe. If so, then some suffering is perfectly consistent with
a Friendly AI as such. The question then becomes what happens
when a sentient does ask for an end to its suffering. I am not
at all sure that it would be in the sentient's best interest,
and thus truly friendly, for the FAI to simply fix anything and
everything in the sentient's space or nature that led to the
suffering. Remember that much of a sentient's suffering is due
to its own internal characteristics, beliefs, programming,
whatever. To simply remove or change all of those immediately
would likely damage the sentient's identity and/or have many
consequences unforeseen (by the sentient) and not desired. So
again, it is not at all obvious that the FAI would remove all
suffering. Remove a medieval torture chamber, yes; but rewire
brains so they are no longer instrumental in their own
suffering? I have strong doubts that would be unambiguously moral.

- samantha
