Re: [sl4] What is the probability of a positive singularity?

From: Charles Hixson
Date: Wed Jul 23 2008 - 18:00:06 MDT

On Wednesday 23 July 2008 02:04:32 pm Matt Mahoney wrote:
> --- On Tue, 7/22/08, Charles Hixson <> wrote:
> > Now what's the probability of humanity surviving if a
> > positive singularity does *not* occur?
> ...
> takes.
> A singularity means the end of humanity's reign as the most intelligent
> species. Whether humans survive depends on the ethical system of the
> dominant lifeform, which we can neither control nor predict.
> -- Matt Mahoney,

True. But I feel that we already have sufficient knowledge and power to
exterminate ourselves, and that the control of such power is in an increasing
number of independently acting hands. As such, without a singularity, it's
fairly clear that we won't long survive. (I consider it unlikely that even a
few people would survive for more than a few decades after the initial
catastrophe.) A lesser war, one that killed, say, half to three-quarters of
humanity, wouldn't slow progress toward the singularity by more than a few
decades. A global dictatorship might slow it further, but only slow it.
("Dark ages" commonly include a great deal of progress in applied science.
The Middle Ages yielded, among other things: improved metalworking, as
evidenced by plate armor; the horse collar, which allowed horses to replace
oxen in plowing [horses are far superior]; and the moldboard plow. Also, of
course, movable type, which ended the Middle Ages. Note that this is far from
a complete list; I'm merely naming major transformative technologies.)

OTOH, saying that something recognizably human will survive a positive
singularity is almost a tautology: what singularity that nothing recognizably
human would survive would you call positive?

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:03 MDT