From: Gordon Worley (redbird@rbisland.cx)
Date: Mon Jun 17 2002 - 09:04:24 MDT
On Monday, June 17, 2002, at 10:39 AM, Eugen Leitl wrote:
>> something to be taken lightly and done on a whim. However, if we had a
>> FAI that was really Friendly and it said "Gordon, believe me, the only
>> way is to kill this person", I would trust in the much wiser SI.
>
> Yeah, that's some really Friendly AI. "Trust me, I'm a FAI! Kill that
> person, Gordon!"
An FAI most likely would not say that such a thing had to be done. At
the same time, I'm not an SI, so there is the possibility that it might
want to do such a thing even if, right now, we can't see why it would.
I hope an FAI won't want to go around killing people, and you're right,
it's probably not really Friendly if it's asking for a killing. Still,
it's not an impossible situation.
Unless we get into some extreme case, I think many of us would reach
the same conclusions, even if I arrive at mine differently.
--
Gordon Worley                     `When I use a word,' Humpty Dumpty
http://www.rbisland.cx/           said, `it means just what I choose
redbird@rbisland.cx               it to mean--neither more nor less.'
PGP: 0xBBD3B003                                      --Lewis Carroll