From: Ben Goertzel (firstname.lastname@example.org)
Date: Sat May 03 2003 - 09:02:17 MDT
> it seems that it'd be nice if we could
> help out,
> instead of just waiting for Eliezer to make the singularity a reality by
> himself. Any suggestions?
> Mike W.

Eliezer is the best one to speak to this point, but I'll open my mouth
anyway because your statement implicitly relates to my own work as well.
It seems to me that Eliezer is not at all trying to make the Singularity a
reality by himself. Rather, he seems to be focusing his efforts on trying
to figure out how to make the Singularity, when it happens, be a benevolent
one for humans and other sentients generally.

There are many others (on and off this list) who are devoting more time than
Eliezer to trying to make the Singularity happen sooner (e.g., Peter Voss and
me; we direct AI projects that are explicitly aimed at eventually achieving
world-transforming levels of computational intelligence). So far as I know, no
one is devoting more time than Eliezer
to thinking about how to make sure that, when a seed AI launches the
Singularity, the Singularity in question is a "good" one.
-- Ben G
This archive was generated by hypermail 2.1.5 : Sat May 18 2013 - 04:00:34 MDT