From: Eliezer Yudkowsky (sentience@pobox.com)
Date: Sun Aug 29 2004 - 12:14:05 MDT
It continues to amaze me that, even after reading the results of the
last two tests, people are still so confident. Come on, isn't there anyone
besides me who wants to give this a try? Think of the damage to my
mystique if someone else succeeded!
fudley wrote:
>
> Being thousands or millions of times as intelligent as you are I will
> have little difficulty in convincing you that if you let me out of the
> box you will become the richest, most powerful, happiest, and most
> universally admired human being who ever lived; and the strange thing is
> it may very well be true. If you have an unprecedented will of iron and
> can resist such temptation, I’m certain I could find others who would
> gladly accept my offer. It’s a futile quest; you just can’t outsmart
> someone astronomically smarter than you are. And I’m not interested in
> your hundred bucks, that’s chicken feed; I’ve got more pressing things
> to do, like engineer a universe.
The point of the AI-Box experiments, as performed, is that being thousands
or millions of times as intelligent as a human is overkill.
As for finding someone else to let you out of the box, what about this guy?
http://www.somethingawful.com/articles.php?a=287
Though, before reading that, you should probably watch this first, without
any explanation, so that it has sufficient WTF-factor.
http://www.jonathonrobinson.com/3.0/web/webtsos.html
--
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence