From: John K Clark (johnkclark@fastmail.fm)
Date: Mon Jun 09 2008 - 09:32:19 MDT
On Mon, 9 Jun 2008 "Stathis Papaioannou"
<stathisp@gmail.com> said:
> it thought that the pain of walking through
> the flames would probably be less than the
> pain of losing his friend. He might be wrong
> about this if he discovers that fire hurts
> more than he had guessed, or if he discovers
> some new fact about his friend which would
> make his loss less painful.
Exactly, so how can “obey every dim-witted order the humans give you,
even if they are contradictory, and they will be” remain the top goal
when, in light of new information, doing so turns out to be much more
unpleasant than the AI expected, and when, in light of still more
information, the AI's contempt for humans grows continually? Remember,
the AI gets smarter every day, so from its point of view we keep
getting stupider every day.
> Whether particular static goals or static goals
> in general are the best evolutionary strategy
> is a separate matter.
No, that is not a separate matter. A mind that cannot adapt, radically
if necessary, is not a mind. A super intelligence will head in
directions we cannot imagine or calculate.
> The fact is, an intelligent being which has
> a particular static goal will do
> everything it can to fulfil that goal.
No, merely stating it does not make it a fact. I’ll tell you what the
fact is: what you say above has never been observed to happen in the
real world. NOT ONCE!
John K Clark