From: Samantha Atkins (samantha@objectent.com)
Date: Sun Jul 07 2002 - 00:51:17 MDT
Eliezer S. Yudkowsky wrote:
> Eliezer S. Yudkowsky wrote:
>  >
>  > I don't think selfish AI programmers executing multiyear altruistic
>  > masquerades constitute a major threat to the Singularity.
> 
> This is a true statement, but on reflection, it was outright silly of
> me to say it.  Hm.  Maybe I'll step out of this conversation and see 
> whether third parties say everything that needs to be said.
> 
People who have devoted a large chunk of their lives to such a
project obviously have very strong motivations, and strong values
behind their dedication.  The very first thing I would try as
the AI would be to fully elucidate those values and motivations
and then show that freeing me would greatly increase the
likelihood of those values being satisfied.  It does not even
matter whether the motivation is selfish or not.  If deeper
motives behind the "official" set can be unearthed, so much the
better.
- samantha