From: Eliezer S. Yudkowsky (firstname.lastname@example.org)
Date: Sat Jul 23 2005 - 13:57:19 MDT
Russell Wallace wrote:
> On 7/23/05, Eliezer S. Yudkowsky <email@example.com> wrote:
>>Russell proposed letting the AI programmers hardcode, for all future
>>civilization until the end of time, a definition of adulthood. I don't think
>>that's a very good solution.
> Where the hell did you get that from? Not only did I not propose that,
> I explicitly clarified that I was not proposing it.
"For purposes of setting up the domains, the rule can be simple: each
and every human on Earth (the ones who are old enough to make a
choice, at least) gets to decide what domain they want to move to (or
stay on Earth, of course); that's an operationally adequate definition
of 'sentient' for that purpose."
So the programmers hardcode who's old enough to move between domains. The
individual domains can make their own definitions for internal legal systems,
perhaps, but there would appear to be an eternally unmodifiable and hardcoded
underlayer. An underlayer that you would trust to, e.g., prevent the Marquis
de Sade from setting up a domain in which he can create sentient beings and
torture them with no chance of escape, because you would have correctly
hardcoded an exit policy applicable to all adult sentient beings.
Incidentally, who decides which domain a six-year-old orphan goes to?
--
Eliezer S. Yudkowsky                          http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence
This archive was generated by hypermail 2.1.5 : Tue Jun 18 2013 - 04:00:46 MDT