From: ben goertzel (firstname.lastname@example.org)
Date: Wed May 22 2002 - 10:21:47 MDT
> But the Singularity is a double-edged sword. Some of the potential outcomes
> of a Singularity might just as well be the obliteration of the human race.
> (This would in fact end war forever, but it was probably not what you had
> in mind.)
Actually it probably would NOT end war forever. There could still be wars
among advanced AI minds, or in time among superevolved ubercockroaches...
We humans are violent and nasty at times, but are not actually the SOLE
repository of violence and nastiness in the cosmos ;>
Yes, I agree. If no spectacular breakthrough comes about, conservative
people would probably not think it possible at all.
And even if a spectacular breakthrough occurs, it will quite likely not be
enough. The step from an almost-human-level AI to a self-modifying AI with
exponentially increasing intelligence rapidly becoming superhuman is
probably a step that most people won't be willing to take even when they
see the almost-human-level AI ...
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:39 MDT