From: John K Clark (firstname.lastname@example.org)
Date: Sun Jun 08 2008 - 10:57:42 MDT
On Sun, 8 Jun 2008, "Stathis Papaioannou" wrote:
> Then survival was never the AI's top
> goal, avoiding extreme pain was.
Then why did the AI char his flesh by walking through flames to save his
friend the day before? People are the same: sometimes people kill
themselves because of intense pain, and sometimes they deliberately
endure intense pain to accomplish something else.
Face it: just as a static set of axioms cannot derive all mathematical
truth, static goals cannot encompass all the actions of an intelligent
being, and certainly not a goal as absurd and grotesque as "be a slave
to humans".
John K Clark
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:03 MDT