From: Charles Hixson (charleshixsn@earthlink.net)
Date: Tue Jul 22 2008 - 15:44:35 MDT
On Monday 21 July 2008 04:20:23 pm Scott Dickey wrote:
> A question for all,
>
> What is your current estimate of the probability that humanity will survive
> to see a positive singularity?
>
> Where "survive" means to avoid existential threats as outlined by Nick
> Bostrom (1), and where "positive singularity" is the ideal espoused by SIAI
> (2)(3).
>
> -Scott
>
> (1) http://www.nickbostrom.com/existential/risks.html
> (2) http://www.intelligence.org/overview/whatisthesingularity
> (3) http://www.intelligence.org/overview/whyworktowardthesingularity
Now what's the probability of humanity surviving if a positive singularity
does *not* occur? There has already been at least one documented case where
we were within 30 seconds of full-scale thermonuclear war. I doubt that
humanity would survive that. Even without a singularity, the number of people
who have their finger on one trigger or another has been increasing. More
slowly than I expected, I admit, but still increasing.
I rate the probability of humanity surviving in its current form as extremely
close to zero. About the only way that happens is an extremely destructive
war that wipes out civilization without quite killing off all people. All of
the paths through the future that I see as continuing to contain something
that could reasonably be called civilized humanity lead through a positive
singularity. Nearly all of them include uploads, and most of them include
aggressive intelligence amplification. (In the ones that don't include
either, people are merely pets. Few of those continue to contain people in
the long run.) For most of these projected futures I can't assign even a
guesstimated probability.
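To make the structure of this argument explicit (my own formalization, not
part of the original post), it amounts to the law of total probability with
the no-singularity term taken to be negligible:

    % S = "a positive singularity occurs". Both conditional terms are
    % assumptions drawn from the reasoning above, not computed quantities.
    P(\text{survive}) = P(\text{survive} \mid S)\,P(S)
                      + P(\text{survive} \mid \neg S)\,P(\neg S)
                    \approx P(\text{survive} \mid S)\,P(S),
    \quad \text{since } P(\text{survive} \mid \neg S) \approx 0.

On this reading, answering Scott's question reduces to estimating the first
term, which is exactly the part the post declines to put a number on.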