Re: [SL4] Programmed morality

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Fri Jul 07 2000 - 01:24:18 MDT


petervoss1 wrote:
>
> I agree. However, there also has to be something more fundamental like
> 'pain/pleasure' (i.e., stop doing this / do more of this)

"Tension," *maybe*. Pain and pleasure are still too complex.

> > ... I've heard arguments that say, once something is intelligent it
> > naturally does the 'right' thing. That's rather naive, I think. You can have
> > intelligent dictators....
>
> Yes, this is important. There is no absolute 'platonic' right. 'Right' is
> always with respect to a specified or implied goal and beneficiary.

... as far as we know.

This is one of those cases where it's important to accept both possible
outcomes. (I'm not just saying that to be ostentatiously rational. I
really don't know.)

> > ... Asimov's Laws are equally naive...
>
> True. We can perhaps guide (bias) an AI's value system (like we can a
> child's), but cannot ultimately prevent it from being overridden or
> reprogrammed.
[snip]
> I am very concerned about the risks of runaway AI (unlike Eli I *do* care
> what happens to me). I'm desperately searching for ways of trying to predict
> (with whatever limited certainty) what goal system an AI might choose. Any
> ideas?

I do care what happens to Peter Voss, and to the rest of humanity. But
to think of myself as being in opposition to the AI, even for humanity's
sake, would be to create a distinction between subject and object. It
would reduce every programmed instinct or taught piece of philosophy to
the fact that Eliezer tried to fool an AI and failed.

Mystical? Perhaps. I could be wrong. It's something that I'm still
thinking through. But I do think that the "instinct" theory of Asimov's
Laws is still anthropomorphic. Only a human sees "arguments" or
"instincts". An AI sees cause and effect. On some level, programming a
seed AI with an instinct ultimately bears no more information than the
flat statement "Peter Voss wants you to have this instinct".

-- 
        sentience@pobox.com    Eliezer S. Yudkowsky
              http://intelligence.org/beyond.html

