From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Thu Jan 08 2004 - 17:26:44 MST
Ben Goertzel wrote:
>
> -- a seed AI, triggered to awaken in N years and start evolving toward
> friendly superintelligence
> -- a simulated world for the seed AI to play in
> -- a universe of virtual humans embodied in the simulated world
>
> Statistically, some percentage of these AIs will become friendly
And statistically, some number of times I put my hand on my desk, my hand
will tunnel through the potential energy barrier. That there is a
statistical chance of something doesn't mean you can afford enough
instances to raise the total probability to something significantly
different from zero - not without the exercise of the same skills that
would be required to make a single instance work.
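To put rough numbers on that point: a minimal sketch, with purely illustrative figures rather than anything from the post (the per-run probability, the instance counts, and the helper function name are all assumptions for the example). If a single unguided run succeeds with probability p, then n independent runs succeed at least once with probability 1 - (1 - p)^n, which is roughly n*p while n*p is small; when p is astronomically small, no affordable n moves the total away from zero.

    import math

    def total_success_probability(p, n):
        """P(at least one of n independent instances succeeds) = 1 - (1 - p)**n,
        computed via log1p/expm1 so astronomically small p does not underflow."""
        return -math.expm1(n * math.log1p(-p))

    # Purely illustrative numbers (not from the post): suppose a single
    # unguided run has a one-in-10^30 chance of coming out friendly.
    p = 1e-30
    for n in (1, 10**6, 10**12):
        print(f"{n:>15} instances -> total probability ~ {total_success_probability(p, n):.1e}")

Even a trillion instances in this toy calculation leave the total probability around 10^-18, which is the sense in which buying more instances cannot substitute for the skill that would make a single instance work.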
Let's send out enough monkeys; maybe one of them will type Shakespeare.
--
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence