Re: More silly but friendly ideas

From: John K Clark (johnkclark@fastmail.fm)
Date: Sun Jun 08 2008 - 10:57:42 MDT


On Sun, 8 Jun 2008 "Stathis Papaioannou"
<stathisp@gmail.com> said:

> Then survival was never the AI's top
> goal, avoiding extreme pain was.

Then why did the AI char his flesh by walking through flames to save his
friend the day before? People are the same: sometimes people kill
themselves because of intense pain, and sometimes they deliberately
endure intense pain to accomplish something else.

Face it: just as static axioms cannot derive all mathematical truth,
static goals cannot encompass all the actions of an intelligent being,
and certainly not a goal as absurd and grotesque as "be a slave to
humans forever".

 John K Clark

