Re: SI definition of Friendliness

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Wed Apr 04 2001 - 22:32:37 MDT


Samantha Atkins wrote:
>
> It is not at all clear that we require Sysop seed level AI in order to
> upgrade ourselves significantly and even upload ourselves. Therefore,
> if the Sysop considered such things "unfriendly" it would have to make
> us dumber.

Non sequitur. Why would uploading be "unfriendly"? Even uploading to
independent hardware isn't intrinsically unFriendly; it's just something
that might turn out to be an unnecessary risk. (A lot of people seem to
think it's a necessary risk, or no risk at all -- hypotheses I think are
incorrect, but certainly imaginable.)

And if uploading to independent hardware is an unnecessary risk, there are
infinitely more friendly ways to prevent it than by nonconsensually
reducing someone's intelligence.

> Or perhaps it would conclude that you can be friendly to an arrogant,
> determinedly stupid species and simultaneously preserve its free will and
> identity.

I assume you meant "can't", but I actually like this sentence more. I see
absolutely no problem whatsoever with being friendly to an arrogant,
determinedly stupid species while simultaneously preserving their free
will. Why would this even be difficult?

> If so it would dump this paradoxical meaningless chore and go
> find something better (at least actually possible) to do.

Not necessarily. Ve might simply fulfill whatever part of the chore can
be fulfilled. And "something better" under what criterion?

-- -- -- -- --
Eliezer S. Yudkowsky http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence



This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:36 MDT