From: Brian Atkins (brian@posthuman.com)
Date: Tue Jun 19 2001 - 18:26:46 MDT
"Christian L." wrote:
>
> Just out of curiosity: has the SIAI discussed these matters? Worst case
> scenarios?
>
There has been a little bit of discussion, but this really isn't our
area beyond simply making sure of our own safety... we try to
support other organizations like ExI, and now Pro-Act, which was
announced at Extro 5.
As for the Singularity, I hold that it is inevitable except in the
case where humanity wipes out all life completely. For starters, Kurzweil
made a nice point in his Friday night talk showing how little effect
the Great Depression and WW2 actually had. Secondly, even if the Luddites
were somehow to take us back to the dark ages, we would eventually work
our way back up to this level again... it is the nature of intelligent
beings to drive the Singularity. So in the long-term sense, I do see it
as the inevitable result of evolution and intelligence.
But of course we want to get there sooner rather than later.
--
Brian Atkins
Director, Singularity Institute for Artificial Intelligence
http://www.intelligence.org/
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:36 MDT