Re: SI definition of Friendliness

From: Chris Cooper (coop666@earthlink.net)
Date: Wed Mar 28 2001 - 18:17:36 MST


Eliezer,
  Thanks! Your reply went a long way towards what I wanted to hear. I still
have a tough time with a Friendly SI thinking of humans as anything other than
vis very stupid distant relations (much as we think of apes, or even bacteria,
but there's that pesky anthropomorphic thinking again...), but you have (mostly)
convinced me that we will get invited to the big party.

>> If
>> the entire human upload/upgrade scenario is based on the strength of
>> Friendliness during the AI- to- Transition guide- to- Sysop evolution, I
>> hope that everyone involved does a damn good programming job.

> I'm not sure I get this. Even on an individual scale - single people, you
> and I, hoping to become transhuman - you're still relying on the
> persistence of *something*. It may be your entire nature, rather than
> just your altruism, but you're still relying on something - the
> persistence of your desire to be transhuman, if nothing else.

 All I meant here is that if all our hopes for evolution as a species are
pinned to the proper programming of the concept of Friendliness into our
AI, everyone involved had better do their best work. Otherwise, we'll be sitting
in our tire swings a long time after the Singularity makes (caged) monkeys of
all of us.

COOP



This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:36 MDT