From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Tue Jul 16 2002 - 15:50:34 MDT
Rafal Smigrodzki wrote:
> Why should any sentient procure an SAI *not* grounded in vis own
> programmer-sensitive morality?
Because their morality calls for a respect for human destiny which is
incompatible with the hijacking of that destiny by any individual, or
group, or for that matter transient civilization? Because their basic
philosophy of morality holds that morality is something that exists
beyond the wishful thinking of any one individual, grounded in humanity,
or better yet sentient life, or better yet the nature of reality itself,
and therefore even the programmer's own morality should strive to free
itself from sensitivity to initial conditions?
> I guess there might be goal-systems encompassing a yearning for a
> non-sequitur. FAI is programmer-sensitive, even if only by the
> six-degree-of-separation expedient of panhuman (or even pansentient)
> motivation analysis.
If an AI comes to the conclusion that 2 + 2 = 4, that conclusion may
depend on the fact that the programmer built an AI, and even on the
programmer having built an AI that tries to come to
programmer-insensitive conclusions about arithmetic, but this programmer
dependency is not programmer sensitivity.
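A minimal Python sketch of the dependency-versus-sensitivity distinction
(purely illustrative; the function names are hypothetical and not drawn
from any actual AI design):

    # A conclusion can depend on the programmer's having written the code
    # without being sensitive to who the programmer was or what ve wanted.

    def arithmetic_conclusion():
        # Programmer-dependent: this function exists only because someone
        # wrote it. Not programmer-sensitive: every correct implementation
        # returns the same answer, whoever the implementer is.
        return 2 + 2

    def wired_in_conclusion(programmers_preference):
        # Programmer-sensitive: the "conclusion" is whatever value the
        # builder happened to wire in, so it varies with initial conditions.
        return programmers_preference

    print(arithmetic_conclusion())     # 4, for any builder
    print(wired_in_conclusion(37))     # 37, but only for this builder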
There may be goal systems that encompass the yearning for a
non-sequitur, but we cannot possibly be sure enough that it really *is*
a non-sequitur to leave that yearning out of what we pass on to an AI.
-- 
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence