From: Stathis Papaioannou (firstname.lastname@example.org)
Date: Sun Jun 08 2008 - 21:15:42 MDT
2008/6/9 John K Clark <email@example.com>:
> On Sun, 8 Jun 2008 "Stathis Papaioannou"
> <firstname.lastname@example.org> said:
>> Then survival was never the AI's top
>> goal, avoiding extreme pain was.
> Then why did the AI char his flesh by walking through flames to save his
> friend the day before?
Because it thought that the pain of walking through the flames would
probably be less than the pain of losing its friend. It might be wrong
about this if it discovers that fire hurts more than it had guessed,
or if it discovers some new fact about its friend that would make the
loss less painful. In these cases, it would adjust its behaviour to
keep it in line with its original goals. The goals themselves
would not change.
> Face it, just as static axioms cannot derive all mathematical truth,
> static goals cannot encompass all actions of an intelligent being and
> certainly not a goal as absurd and grotesque as "be a slave to humans
Whether particular static goals or static goals in general are the
best evolutionary strategy is a separate matter. The fact is, an
intelligent being which has a particular static goal will do
everything it can to fulfil that goal. This could even lead to its own
destruction unless survival trumps every other goal.
-- Stathis Papaioannou
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:03 MDT