From: Brian Atkins (brian@posthuman.com)
Date: Sun Jun 30 2002 - 13:49:40 MDT
Ben Goertzel wrote:
>
> > Right, we need some way to compare between them all.
>
> And I don't see any way to make reasonably solid probability estimates about
> *any* of these risks... the risk of the human race bombing itself to
> oblivion OR the risk of a certain apparently friendly AI going rogue...
>
Does anyone know of any specific attempts to nail down probabilities
of various existential risks? I guess we have some rough estimates for
asteroid impacts... Could we use the fact that, apparently, if you give
a few thousand (supposedly ethical) army scientists access to anthrax,
the odds are high that one of them will abuse that access?
I guess that doesn't bode well for making any kind of advanced bio or
nanotech available to large numbers of scientists.
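
As a back-of-the-envelope check on that base-rate argument, here is a minimal sketch (the per-scientist misuse probability and the head count are both made-up, illustrative numbers, not estimates from any source) of how quickly "at least one bad actor" becomes likely:

# Hypothetical numbers: p is an assumed per-scientist chance of misuse,
# n is roughly "a few thousand" scientists with access.
p = 0.001   # assumed 0.1% chance that any one scientist goes bad
n = 3000    # assumed head count

# Probability that at least one of them abuses the access,
# treating each scientist as an independent trial.
p_at_least_one = 1 - (1 - p) ** n
print(f"P(at least one misuse) = {p_at_least_one:.2f}")  # ~0.95

Even with a much smaller per-person probability, that "at least one" figure climbs uncomfortably fast as n grows, which is the worrying part of handing advanced bio or nanotech to large groups.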
--
Brian Atkins
Singularity Institute for Artificial Intelligence
http://www.intelligence.org/