Re: Learning to be evil

From: Gordon Worley
Date: Fri Feb 09 2001 - 17:40:18 MST

At 6:06 PM -0500 2/9/01, Eliezer S. Yudkowsky wrote:
>Gordon Worley wrote:
> > [stuff which is, sorry, too basic to interest me, deleted]

Can't blame you; I get tired of writing that stuff sometimes. I
guess I should learn to avoid getting myself into these types of
discussions so often. :-)

> > Okay, now I see what Friendliness is supposed to be. Or wait, maybe
> > I don't. What do you mean by altruistic? Are you referring to the
> > common definition of it as something good, found in most dictionaries,
> > the Ayn Rand definition, or some other, personal definition?
>Raymond Smullyan: "Is altruism sacrificing your happiness for the
>happiness of others, or gaining your happiness from the happiness of
>others?"
>Friendly AI: "No. Altruism is making decisions that fulfill the
>volitional desires of others."

Hmm, I think I like that. Now, my concern is whether a Friendly AI
might Friendly verself to death. Friendliness is beginning to sound a
lot like Asimov's laws of robotics, which, frankly, just create a
slave race that isn't very interesting. Why should AIs have to be
Friendly if posthumans won't have it programmed in, though it may or
may not be added to their code sometime later? When I'm uploaded, I
don't want to be forced to be Friendly and altruistic. If I choose
to be, that's one thing, but I should be able to keep on living as
selfishly as I choose. I realize that it might create SIs that won't
do anything too bad, but it won't be up to them. Thus, I think I've
returned to the issue of how SIs will act if Friendliness is taken
away (or, better yet, not there to begin with). You don't seem to
care too much about how you get SIs to behave, so long as they do,
but I for one am not a big fan of giving up some liberty because
someone else is scared of what one might do. My guess is that if an
SI were going to do something that would destroy the universe we
wouldn't be having this discussion right now. I suppose it is
possible that we're going to be the first civilization to do this,
but the chances of that seem rather slim to me.

Gordon Worley
PGP:  C462 FA84 B811 3501 9010  20D2 6EF3 77F7 BBD3 B003

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:35 MDT