From: 1Arcturus (arcturus12453@yahoo.com)
Date: Mon Dec 12 2005 - 12:28:32 MST
Patrick,
   
  Thanks. What date did Mr. Yudkowsky rescind that position? Is there a post or statement to that effect somewhere?
  According to the homepage, SIAI was founded in 2000. Was that the SIAI position at the time, and until the rescinding?
  I mean, I can guess at the logic that would lead to such a conclusion, but it seems a rather dramatic departure from any prima facie notion of 'friendliness.'
   
  gej
pdugan <pdugan@vt.edu> wrote:
  I believe Yudkowsky forwarded that opinion in a prior era, back around 
2000, and has since rescinded it in favor of a much Friendlier stance: that 
such an SAI would probably have "bad", i.e. unfriendly, programming at its 
deeper structure. In other words, just because an entity is technically 
"smarter" than you by however many orders of magnitude does not make said 
entity's statements the word of god, nor its actions ethically justified.
Patrick
>===== Original Message From 1Arcturus =====
>Someone on the wta-list recently posted an opinion that he attributed to Mr. 
Yudkowsky, something to the effect that if a superintelligence should order 
all humans to die, then all humans should die.
> Is that a wild misrepresentation, and like nothing that Mr. Yudkowsky has 
ever said?
> Or is it in fact his opinion, and that of SIAI?
> Just curious...
>
> gej
>
>