Re: An essay I just wrote on the Singularity.

From: Tommy McCabe (
Date: Fri Jan 02 2004 - 06:31:51 MST

--- Mitchell Porter <>
> >Survival? If the first transhuman is Friendly,
> >survival is a given, unless you decide to commit
> >suicide.
> Or unless it thinks you're better off dead.

If it thinks you're better off dead, then either (1)
it has a reason so compelling that you agree and
commit suicide, or (2) the AI is unFriendly. Wouldn't
you call an AI that decided someone should be dead
for no good reason unFriendly?


This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:43 MDT