Re: SIAI's flawed friendliness analysis

From: Barkley Vowk (
Date: Thu May 29 2003 - 15:54:12 MDT

I think they have a word for your condition; if I'm not mistaken, it
is "paranoia". Some people worry about actual things coming to get them;
you've created a little dream world where future bad things are coming to
get you. You have a gifted imagination.

Barkley C. Vowk -- Systems Analyst -- University of Alberta
Math Sciences Department -
Office: CAB642A, 780-492-4064

Opinions expressed are the responsibility of the author and
may not reflect the opinions of others or reality.

On Thu, 29 May 2003, Eliezer S. Yudkowsky wrote:

> Ben Goertzel wrote:
> >
> > I think that Eliezer and Bill are interpreting the term "human
> > happiness" differently. I think Eliezer is assuming a simple
> > pleasure-gratification definition, whereas Bill means something more
> > complex. I suspect Bill's definition of human happiness might not be
> > fulfilled by a Humanoids-style scenario where all humans are pumped up
> > with euphoride, for example ;-)
> >
> > I'm not necessarily taking Bill's side here -- I don't think that "human
> > happiness" in any reasonable definition is going to be the best
> > supergoal for an AGI -- but, I suspect Bill's proposal is less absurd
> > than it seems at first glance because of his nonobvious definition of
> > "happiness".
>
> "Happiness in human facial expressions, voices and body language, as
> trained by human behavior experts".
>
> Not only does this one get satisfied by euphoride, it gets satisfied by
> quintillions of tiny little micromachined mannequins. Of course, it will
> appear to work for as long as the AI does not have the physical ability to
> replace humans with tiny little mannequins, or for as long as the AI
> calculates it cannot win such a battle once begun. A nice, invisible,
> silent kill.
>
> If you want an image of the future, imagine a picture of a boot stamping
> on a picture of a face forever, and remember that it is forever.
> --
> Eliezer S. Yudkowsky
> Research Fellow, Singularity Institute for Artificial Intelligence

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:42 MDT