Re: Self-modifying FAI (was: How hard a Singularity?)

From: James Higgins (jameshiggins@earthlink.net)
Date: Wed Jun 26 2002 - 12:28:53 MDT


At 12:47 PM 6/26/2002 -0400, Eliezer S. Yudkowsky wrote:
>James Higgins wrote:
>>much greater knowledge and intelligence, what we attribute to friendly
>>behavior may end up looking quite different.
>>
>>Your definition of ethics is a good example. If an alien landed tomorrow
>>and the first person it met was a fantastic salesman, the salesman might
>>appear to be exceedingly friendly, when in fact their only goal is to
>>open up a new trade route and they don't care one iota about the alien,
>>only the result! ;)
>
>Which problem are we discussing here? The idea that a hostile AI could
>deliberately lie in order to masquerade as Friendly? Or the assertion
>that a Friendship programming team would wind up with a hostile AI that
>appears as Friendly because the specification was ambiguous? These
>problems are very different structurally!

I was trying to provide a concrete example of how observed "friendliness"
may, in fact, not be friendly at all. I used an alien because pretty much
everyone on Earth understands how salespeople work. There are more subtle
examples of this out there, but I figured this one would be less ambiguous.

>>We may *think* we are defining friendliness via external reference points
>>but actually be defining only the appearance of friendliness or something
>>similar. Thus the SI would only need to appear friendly to us, even
>>while it was planning to turn the planet into computing resources.
>
>That's why you discuss (anchor externally) the *reasons* for decisions,
>not just the decision outputs. You aren't anchoring the final output of
>the causal system, you're anchoring *all* the nodes in the system.

And we just hope that every one of those nodes actually gets anchored
correctly?

>>To use Eliezer's method, while I may not be correct, I'm quite certain you
>>are wrong. (Does that make me an honorary Friendship Programmer?)
>
>No, that makes you a perfectionist by successive approximation. But
>asking structural questions about metawish construction scores
>points. (Whee! Eugenese!)

Well, no argument there; I am a perfectionist. Not a bad trait to have,
IMHO, when dealing with something like a Singularity...

James Higgins


