From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Fri May 03 2002 - 14:28:29 MDT
Max Comess wrote:
>
> It's not really about a website itself becoming self aware. It's about
> making those who use it aware. That is the whole point of IA. Friendliness
> and morality would be designed into the ultimately resulting Real AIs because
> these principles would become a part of the collective intelligence of this
> website, which would of course include the Singularity Institute and other
> institutions like it. I just want to see all these different groups' efforts
> being channeled in the most effective way possible, in the most direct way
> possible, toward the singularity. If Pflop-level computers exist and IA
> exists then I think that we can, and should, finally get off our collective
> butts and make it happen as soon as possible.
I don't know how to create Friendliness in a planetary-sized soup of
self-modifying heuristics, and I am not confident that I will learn how to do
so at some future date.
And others aren't going to fall into line just because you issue a wake-up
call. You have to get off your own personal butt in order to put in the
lifetime of work it takes to get other people to get off their collective
butts.
-- -- -- -- --
Eliezer S. Yudkowsky http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence