Re: When Subgoals Attack

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Wed Dec 13 2000 - 15:36:57 MST


Durant Schoon wrote:
>
> I might be reading too much into Eliezer's reply, but his solution
> sounds to me like: "You can't trust any smart subjects, so don't have
> any. Only create dumb automatons to follow out your orders exactly
> (the details of which, you have considered extremely carefully)."

Remember, I don't think in terms of observer-biased goal systems. So my
reply is not "Create only dumb automatons", but rather "Create only
subjects which share your own supergoals, regardless of which subgoal they
are tasked to implement."

-- -- -- -- --
Eliezer S. Yudkowsky http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence
