From: Samantha Atkins (email@example.com)
Date: Wed Sep 19 2001 - 01:06:05 MDT
"Aikin, Robert" <firstname.lastname@example.org> writes:
> In light of the fact that religious/political instability has recently
> escalated, it has become clear to many that reasonable threats to the
> otherwise inevitable Singularity may soon appear in the so-called civilized
> world, namely, the United States. Possible (probable?) chemical,
> biological, and various nuclear attacks pose a danger not only to society
> and the economy, but to R&D of AI (Friendly, etc.). With this in mind, I
> would like to urge all those involved in the serious work of AI to please
> double all efforts.
I would like to second that. It would also be a very good idea, to the extent
practical, to back up your work and to distribute both your backups and your
facilities as widely as possible. Some might want to consider living in areas
that look a bit less like possible targets.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:37 MDT