From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Fri Nov 25 2005 - 09:40:55 MST
Michael Vassar wrote:
>
> Anyway, Mark Geddes, since you wonder how Eliezer would have done on
> that puzzle, I'll tell you exactly how he would have done.
The following is quite an interesting opinion, and it is, of course,
your own - why preface it by attributing it to me?
> He would
> have said something like "there is a minute chance that someone working
> on such a puzzle is smart enough that they could contribute to a seed
> AI, but no chance at all that someone working on such a puzzle rather
> than on something useful has the minimal level of ethical seriousness
> that must be demanded from a seed AI researcher, so it isn't worth
> recruiting here." As a serious seed AI researcher, the possibility that
> it might be a good idea to spend his time working the puzzle would never
> even have occurred to him. People pay attention to Eliezer not because
> he's an ex-prodigy and supposed "super-genius", but because he has spent
> his time effectively to develop certain ideas with more clarity than
> anyone else has. Your ability to make accurate one-sentence guesses,
> without the ability to explain your reasoning, is just not interesting,
> even if it does result in your scoring higher on some IQ tests than
> Eliezer or anyone else would. Maybe the puzzle you were talking about
> was "the ultimate IQ test" but that is precisely why it is NOT similar
> to FAI theory. Unlike the answers on an IQ test, "AI doesn't fit on a
> t-shirt", and for FAI, which can't be solved with guess work, trial, and
> error, even correct intuitive answers which cannot be explained are
> little better than worthless.
I'd never tell a prospective seed AI programmer to stop playing
mathematical games, though I don't know if the book Geddes referenced
was a game of that type. I was first captured into this gig - it seems
inevitable in hindsight, yet that itself may be illusion - when I pulled
Vernor Vinge's "True Names and Other Dangers" randomly off a library
shelf of SF paperbacks. Several important ideas came to me while I was
pursuing my time-wasting hobby of trying unsuccessfully to write science
fiction. Inspiration lurks in odd places.
-- 
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence