From: Rafal Smigrodzki (rms2g@virginia.edu)
Date: Tue Jul 16 2002 - 18:37:11 MDT
Eliezer S. Yudkowsky wrote:
Because their morality calls for a respect for human destiny which is
incompatible with the hijacking of that destiny by any individual, or
group, or for that matter transient civilization? Because their basic
philosophy of morality holds that morality is something that exists
beyond the wishful thinking of any one individual, grounded in humanity,
or better yet sentient life, or better yet the nature of reality itself,
and therefore even the programmer's own morality should strive to free
itself from sensitivity to initial conditions?
### Your words above seem to be a restatement of my following sentence:
> FAI is programmer-sensitive, even if only by the
> six-degree-of-separation expedient of panhuman (or even pansentient)
> motivation analysis.
-------
If an AI comes to the conclusion that 2 + 2 = 4, it may be dependent on
the fact of the programmer having built an AI, and even dependent on the
programmer having built an AI that tries to come to
programmer-insensitive conclusions about arithmetic, but this programmer
dependency is not a programmer sensitivity.
### I fail to see the difference between "programmer sensitivity" and
"programmer dependence".
Rafal