From: Peter Voss (firstname.lastname@example.org)
Date: Fri May 03 2002 - 20:03:21 MDT
No, I think it is an inherent problem: one potentially non-contradictory set
of goals versus a multitude of conflicting values. What is worse, you can't
dream someone else's dreams for them, define their goals for them, or live
their lives for them - you can really only do that for yourself.
PS. Ants don't 'figure things out'.
From: email@example.com [mailto:firstname.lastname@example.org] On Behalf
Of Eliezer S. Yudkowsky
Sent: Friday, May 03, 2002 5:38 PM
Subject: Re: supergoal stability
Peter Voss wrote:
> Ben, you make a very good point. Figuring out what is good for yourself
> seems much easier than trying to balance the needs/desires of everyone
Isn't this a special case of our being humans rather than ants or AIs?
-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence
This archive was generated by hypermail 2.1.5 : Sat May 18 2013 - 04:00:25 MDT