From: xgl (xli03@emory.edu)
Date: Sat Sep 16 2000 - 23:19:22 MDT
On Sat, 16 Sep 2000, Josh Yotty wrote:
> Nuke the sucker. Make sure it isn't mobile. Riddle it with bullets.
> Melt it. EVAPORATE IT. Unleash nanobots. Include a remote fail-safe
> shutoff the AI can't modify. Don't give it access to nanotech. Make
> it human (or upgraded human) dependent in some way so it doesn't
> eradicate us.
>
so, we make a seed, and then we kill it -- what was the point of
this exercise again? or do you mean to kill it only _in case_ it goes
haywire? a seed smart enough to eradicate us is also smart enough to gain
our trust; and when it makes its move, we won't know what hit us. the seed
doesn't have a patience problem (one of the big stumbling blocks for all
those bad guys in the movies); we do. in short, if the seed wants our
asses, the seed has our asses.
-x