Re: Fighting UFAI

From: Robin Lee Powell (rlpowell@digitalkingdom.org)
Date: Thu Jul 14 2005 - 13:56:40 MDT


On Thu, Jul 14, 2005 at 03:28:17PM -0400, Ben Goertzel wrote:
>
>
> > > Eli, this clearly isn't true, and I think it's a
> > > poorly-thought-out statement on your part.
> > >
> > > For instance, consider
> > >
> > > Goal A: Maximize the entropy of the universe, as rapidly as
> > > possible.
> > >
> > > Goal B: Maximize the joy, freedom and growth potential of all
> > > sentient beings in the universe.
> >
> > Saying "sentient beings" instead of "humanity" is a cop-out,
> > Ben. For our purposes, they are identical.
>
> Hmmm...
>
> Well, I think that when Eliezer said "humanity" he probably really
> meant "humanity." So I won't take your reply as a proxy for
> his...
>
> How about
>
> Goal C: Migrate to the Andromeda galaxy and use all the
> mass-energy there to advance mathematics, science and technology
> as far as possible; but leave the Milky Way galaxy alone, ensuring
> that it evolves over time much as it would have if no superhuman
> AI had ever existed.
>
> This goal also doesn't mention humanity explicitly, yet seems much
> less dangerous than Goal A.

(yes, I know this is a reversal:)

Less dangerous to us, but what about all the sentient beings in the
Andromeda galaxy? :-)

> Of course, you could argue that the Milky Way here is serving as a
> proxy for humanity; but, for sure, humanity is not being
> explicitly mentioned...

True.

-Robin

-- 
http://www.digitalkingdom.org/~rlpowell/ *** http://www.lojban.org/
Reason #237 To Learn Lojban: "Homonyms: Their Grate!"
Proud Supporter of the Singularity Institute - http://intelligence.org/