Re: Coercive Transhuman Memes and Exponential Cults

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Mon Jun 11 2001 - 13:25:51 MDT


Durant Schoon wrote:
>
> Imagine transhuman Bill becomes vastly more intelligent than person Alan. So
> much more that Bill can easily influence Alan to do things Alan might not want
> to do on his own.
>
> Fortunately, the Sysop is there to protect Alan by advising against actions
> which might have deleterious outcomes for him.

Actually, the Sysop is there to protect Alan by saying: "You have a
message from Bill which will take over your brain. I advise in the
strongest possible terms that you delete it without looking at it. Do you
want to delete it?"

> Now, imagine Charlie. Transhuman Charlie is sufficiently more intelligent than
> transhuman Bill. Again our favorite Sysop can warn Bill off from doing anything
> he might not want to do, if only he knew all the consequences. But there
> is a new twist with this case. Transhuman Charlie might convince transhuman Bill
> to modify vis own volition so that accepting new memes from Charlie is highly
> desired. Transhuman Charlie might advertise his suggested memes cleverly so that
> transhuman Bill chooses them with no violation of vis volition.

Actually, transhuman Bill automatically flushes all messages from Charlie
in a single thought smoothly integrated with the Sysop API. Charlie can't
convince Bill to modify vis volition in the first place - at least, not
using invalid arguments. And if the arguments are valid from Bill's
perspective, even with the consequences taken into account, what's the
problem? Charlie has to be smarter than the *Sysop* to *covertly* take
over Bill, at least if Bill has a normal relationship with the Sysop.
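
As a purely illustrative aside, the pattern being described here -- the Sysop
advises and filters only as instructed, while the final call always rests with
the recipient -- can be sketched in a few lines of code. The Sysop and its API
are of course speculative; every class, method, and field name below is a
hypothetical stand-in, not anyone's proposed design.

```python
# Toy sketch of the advisory/consent pattern described above.
# All names here are invented for illustration; there is no real "Sysop API".

from dataclasses import dataclass, field

@dataclass
class Message:
    sender: str
    body: str
    flagged_coercive: bool = False  # set by the Sysop's own analysis


@dataclass
class Citizen:
    name: str
    # Rule set by the citizen: e.g. Bill auto-flushes anything from Charlie.
    blocked_senders: set = field(default_factory=set)

    def wants(self, msg: Message) -> bool:
        """Stand-in for asking the recipient; real consent is the citizen's call."""
        return not msg.flagged_coercive


class Sysop:
    """Warns and filters as instructed; never overrides the recipient's volition."""

    def deliver(self, recipient: Citizen, msg: Message) -> bool:
        # Auto-flush rule the recipient set: delete without reading.
        if msg.sender in recipient.blocked_senders:
            return False
        # Advisory warning: the Sysop recommends, the recipient decides.
        if msg.flagged_coercive:
            print(f"Sysop to {recipient.name}: the message from {msg.sender} "
                  "appears designed to take over your mind. I advise, in the "
                  "strongest possible terms, deleting it unread. Deliver anyway?")
        return recipient.wants(msg)


# Example: Bill has told the Sysop to flush everything from Charlie.
bill = Citizen("Bill", blocked_senders={"Charlie"})
sysop = Sysop()
delivered = sysop.deliver(
    bill, Message("Charlie", "join my exponential cult", flagged_coercive=True))
print("delivered:", delivered)  # False: flushed before Bill ever sees it
```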

> The result? Exponential cults win (in a weird way).

Only for people who like exponential cults...

> These seem to me to be the interesting kinds of problems that arise once you've
> chosen volition as the "be all, end all" metric of the universe (and yes, I
> currently subscribe to this view).

Hm, it looks to me like, in a volition-dominated Universe, only people who
want to be taken over by exponential cults wind up in them... isn't that
sort of the point?

-- -- -- -- --
Eliezer S. Yudkowsky http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence
