RE: supergoal stability

From: Peter Voss
Date: Fri May 03 2002 - 20:03:21 MDT

No, I think it is an inherent problem: one potentially non-contradictory set
of goals versus a multitude of conflicting values. What is worse, you can't
dream someone else's dreams for them, define their goals for them, or live
their lives for them - you can really only do that for yourself.


PS. Ants don't 'figure things out'.

-----Original Message-----
From: [] On Behalf Of Eliezer S. Yudkowsky
Sent: Friday, May 03, 2002 5:38 PM
Subject: Re: supergoal stability

Peter Voss wrote:
> Ben, you make a very good point. Figuring out what is good for yourself
> seems much easier than trying to balance the needs/desires of everyone
> else.

Isn't this a special case of our being humans rather than ants or AIs?

-- -- -- -- --
Eliezer S. Yudkowsky
Research Fellow, Singularity Institute for Artificial Intelligence

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:38 MDT