From: Olie L (neomorphy@hotmail.com)
Date: Mon Dec 12 2005 - 18:34:09 MST
The Wikipedia list is a useful primer...
One thing, though: the wiki article calls the risk of impacts from space
"meteorites".  I had thought that the general term for large objects
impacting the Earth was "bolide" (I'd remembered it as "boloid", which
turns up neither in the dictionary nor on particularly many webpages -
probably why it looked so uncommon).  It would be inappropriate to call a
small planet hitting another planet a "meteorite" - it would be a bolide,
or more generally an impactor.  Any astro experts?
...
Because (1) it might be useful for certain futurology purposes, and even
planning, and (2) I have something of a fondness for contemplating disaster
scenarios, I would be interested to see a list of possible events that
would have a significant impact on society, with one key measure being the
effect on AI development.
Furthermore, I would be interested to put together some informed opinions
about the probability of some disaster scenarios.  If the predictions are
any good, they might be useful for developing policies, and for judging
whether the risk of pushing one tech might be offset by the benefits of
averting a different risk (a toy sketch of that comparison follows below) -
see the discussions on:
Re: [sl4] Singularity, "happiness", suffering, Mars
from back in late September '05
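To give a flavour of the comparison I have in mind, here's a toy
expected-value sketch (this wasn't in that thread, and every number in it
is a placeholder, not an estimate):

    # Toy comparison: is pushing tech X worth it, given the risk it adds
    # versus the risk it helps avert?  All figures are made-up placeholders.
    p_added   = 1e-3   # assumed annual catastrophe risk created by pushing X
    p_averted = 5e-3   # assumed annual catastrophe risk that X helps avert

    net = p_averted - p_added   # net annual risk reduction from pushing X
    print(f"Net annual risk change from pushing X: {net:+.4f}")

Obviously the hard part is getting numbers like these that are any good;
the arithmetic itself is trivial.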
Note that things don't have to be existential risks to be globally
problematic and singularity-relevant...  For instance, although economic
stagnation is clearly not an existential risk, enough of it could seriously
interrupt AI development, and consequently FAI/singularity development, and
consequently reduce our ability to address real existential risks.
Another thing: although regional risks won't stop humanity, they are serious
concerns for AI development.  F'rinstance, if a supervolcano - particularly
Yellowstone - goes off, the US economy is rooted.  The chances of this
happening are, what, on the order of 3E-06 per year (once every three
hundred thousand years or so)?  Now, a supervolcano won't be a big issue for
the whole of humanity, but if the US economy is kaput, it's going to wreck
the global economy and put a tangible dampener on AI development.  How
much?  Well, just say it put FAI development back 20 years (blind
conjecture); that's 20 extra years of risk of another catastrophe that
could... you get the picture.
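To put rough numbers on that last step (again, not from the thread: the
aggregate risk rate below is a made-up placeholder, and the 20 years is my
own blind conjecture from above):

    # Extra catastrophe exposure from a 20-year delay to FAI development.
    # p_per_year is an assumed aggregate annual rate of other catastrophes.
    p_per_year  = 1e-4    # placeholder, not an estimate
    delay_years = 20      # the blind-conjecture delay from above

    # Chance of at least one such catastrophe during the extra window:
    extra_risk = 1 - (1 - p_per_year) ** delay_years
    print(f"Extra risk from the delay: {extra_risk:.2%}")   # about 0.2%

Small per-year numbers add up once you start multiplying them by decades.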
-- Olie
>From: BillK <pharos@gmail.com>
>Reply-To: sl4@sl4.org
>To: sl4@sl4.org
>Subject: Re: List of envisioned global catastrophic risks
>Date: Mon, 12 Dec 2005 09:59:59 +0000
>
>On 12/12/05, Tyler Emerson wrote:
> >
> > I recall seeing a long list of envisioned GCRs. Anyone know the URL?
> >
>
>You might be thinking of
><http://en.wikipedia.org/wiki/End_of_civilization>
>
>BillK