From: Eliezer S. Yudkowsky (email@example.com)
Date: Wed May 08 2002 - 00:07:59 MDT
Ben Goertzel wrote:
> Meanwhile, Y has been thinking about the mountain for perhaps half as long,
> or a little less. Instead of leading expeditions up the mountain, however,
> he prefers to sit at the base, and write treatises about the philosophy of
> mountain climbing, climbing-equipment design, and the nature of the
> spaceship at the top. Each time someone attempts an ascent, he nods
> knowingly and with an aura of wisdom beyond his years, "You will fail to
> reach the spaceship. You have underestimated the difficulty of the task.
> This is no mere ordinary mountain. This is the mountain with the Great
> Spaceship at the top!"
I am trying to gather an expedition, please note. While waiting to collect
the funding for the expedition, I am doing as much advance work as possible
at the base to prepare for the ascent. Occasionally someone comes up and
says cheerily: "What's with all this preparation? Why haven't you gotten
started yet? This mountain looks quite easy; you ought to be able to climb
it on your own. Why, I'm setting out on this trip with nothing but a Kevlar
tent and a backpack of X-Men comics." And *that* is the point when I launch
into my the-mountain-is-bigger-than-that speech.
The total amount of time we have respectively been thinking about the
mountain is as irrelevant as our respective degrees of commitment; if you
think that your total time expended has resulted in a better theory, then
you should be able to win simply by defending the better theory without
needing to refer to time spent.
If the people climbing the mountain have managed to cast a shadow over the
entire mountain-climbing field through repeated failures to reach the top
*after cheerily predicting* that the top was within reach, then not all
those who attempt to reach the top are performing a public service by doing
so.
Finally, even without deconstructing the rest of the parable, you can flip
its moral lesson quite easily by assuming that the mountain *is* much larger
than Y thinks. And historically this would appear to be the heuristic
lesson of AI.
-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence
This archive was generated by hypermail 2.1.5 : Thu May 23 2013 - 04:00:30 MDT