From: Ben Goertzel (ben@goertzel.org)
Date: Thu Jun 27 2002 - 17:08:41 MDT
> What I'm saying is that you don't get Singularity tragedies - as opposed to
> ordinary military tragedies ("friendly fire") - unless you're dealing with a
> transhuman AI. And if you're dealing with a transhuman AI, then it is
> probably not relevant whether the AI is in immediate command of a tank unit;
> the AI must be Friendly.
I'm not so sure...
Humans have trans-gerbil intelligence, yet a violently inclined human is
more dangerous to gerbils than an equally intelligent,
non-violently-inclined human -- even if the two humans have equally
"friendly" psychological attitudes toward gerbils...
-- Ben G