From: Phil Goetz (philgoetz@yahoo.com)
Date: Fri Jul 22 2005 - 15:17:03 MDT
> Well, if you want to interpret his statement that way, you at
> least need to make the exception "Except for transhumans who
> were created (or who advanced to transhumanity) via a process
> that has been proven to have very high odds of leading to a
> Friendly transhuman."
This is my fault for jumping into the middle of SL4. I suppose
this is in the context of SL4 having had many long discussions
on whether such processes exist, whereas I am still coming from
the view that being significantly more intelligent than humans
will entail

(1) either (a) not placing the pleasure of ordinary humans at
the very top of your goals list, or (b) having very different
opinions than humanity does about what is good for humanity; and

(2) being very good at achieving the things at the very top of
your goals list when competing with humans.

Combine that with a Darwinian / Adam-Smithian view of the
universe, and it's hard to see how anything worthy of the name
"transhuman" could operate in a way that humans would not
interpret as hostile.
I think you would have to be talking about something much
smarter than humans in some ways, but more limited than humans
in other ways that help ensure its safety to humans. Do we have
a word for that?
- Phil