Re: Doubling All Efforts

From: Brian Atkins (brian@posthuman.com)
Date: Tue Sep 18 2001 - 15:46:00 MDT


What we need at this point is funding. We would like to bring on four or more
new researchers within the next six months. If anyone here can commit to a
sustained series of long-term donations, or knows someone who could, we urge
you to think seriously about directing some funding our way. I agree that
recent events show that the sooner we can achieve a safe Singularity, the
better.

"Aikin, Robert" wrote:
>
> Given the recent escalation of religious/political instability, it has
> become clear to many that credible threats to the otherwise inevitable
> Singularity may soon appear in the so-called civilized world, namely the
> United States. Possible (probable?) chemical, biological, and nuclear
> attacks pose a danger not only to society and the economy, but also to the
> R&D of AI (Friendly, etc.). With this in mind, I urge all those involved
> in the serious work of AI to please double all efforts.
>
> Best,
>
> RLA

-- 
Brian Atkins
Singularity Institute for Artificial Intelligence
http://www.intelligence.org/
