Re: Strong AI Takeoff Scenarios

From: Daniel Burfoot
Date: Sun Sep 16 2007 - 20:04:37 MDT

On 9/17/07, Rolf Nelson <> wrote:
> 2. Giant disaster or war, say on a scale that kills off more than 20%
> of humanity in a single year.

I don't think it would require such a huge disaster. It may be that the
conditions under which technological advance and economic development of the
kind we've seen for the last 300 years or so can occur are relatively
fragile. Imagine a major economic meltdown that reduces most of humanity to
subsistence living.

Say terrorists get their hands on a nuke and blow up New York. Global chaos
results, setting off a chain reaction of governments and financial
institutions defaulting on their debts. The economic system grinds to a
halt, millions starve, and so on.

Also, note that history has shown that civilization can recede as well as
advance. After the Roman Empire fell, there was no comparably advanced
civilization in Europe for more than 1000 years.


This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:58 MDT