Re: The "One Basket" Problem

From: H C (lphege@hotmail.com)
Date: Sat Aug 05 2006 - 20:33:34 MDT


>From: Charles D Hixson <charleshixsn@earthlink.net>
>Reply-To: sl4@sl4.org
>To: sl4@sl4.org
>Subject: Re: The "One Basket" Problem
>Date: Sat, 05 Aug 2006 11:27:50 -0700
>
>Deepak Goel wrote:
>>I have written once to this list before on this subject ("all our eggs
>>are in one basket, we need a backup"). I wrote up an article and hope
>>you don't mind my sharing this article with you:
>>
>> http://gnufans.net/~deego/DeegoWiki/OneBasket.html
>>
>It does talk about the singularity, shocks, etc., so I hope it is on-topic.
>It's a genuine problem, and your proposed solution (and various analogs of
>it) are worthy. I don't oppose your proposal. But that's not where my
>interests lie in this decade. (Two and three decades ago I would have been
>in total agreement.)
>
>The problem is the cost factor. Because of the cost factor, it looks like
>something that's going to require government sponsorship. And I trust the
>current US government so little that if they said the sky was bluish gray
>I'd go outside to check. OTOH, Japan may be serious about their moonbase
>project. Once you're there you can build a catapult and a (lunar)
>beanstalk and you're well on your way. (And that might be an EXCELLENT
>environment for an AI to evolve. I'm not sure that it would be exactly
>friendly, but it should, as part of its original function, be protective
>towards humans...though not protective at all costs. That's a good start.)
>
>I'm in a bit of a minority here in that I expect an AI to "evolve" out of
>the applications that people use to do things: Google, hospital
>administration, etc. Managing a terrestrial-ish environment in inimical
>circumstances seems another good place. People seem to add more bells and
>whistles with each iteration of the program, and expect it to do more and
>more. Voice response is clearly something people will want as soon as it
>becomes more feasible. ("Turn out the light in the kitchen!") Expanding
>from recognizing a few simple commands to a larger and more flexible subset
>of natural language to, eventually, full recognition of natural speech...but
>that in and of itself requires much of what an AGI requires,
>especially when it includes being able to respond sensibly to those
>commands/requests. And even more when it decides which commands to accept
>and which to ignore...and how to respond while ignoring them. Doing that
>requires that it rank various goals in importance and detect conflicts
>between them, and the functions that it has been designed to perform will
>set its initial goals (supergoals?) and rank them in importance. (First
>of all, protect the AIR! If that means leaving someone to die, you still
>protect the community air supply. Second is water...water is just as
>important, but less urgent. So you can think about other
>priorities...like saving people's lives, etc. Possibly I've got the
>priorities wrong...but I don't think so.)
>
>Of course, an AGI may come sooner, and via some more purposeful route. But
>I'm not sure that's the route of maximal probability.
>
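The goal-ranking scheme in the quoted paragraph above (goals ordered by importance, with urgency deciding among equally important goals: air over water, water over individual lives) can be sketched minimally. Everything below, the `Goal` class, the field names, and the numeric weights, is a hypothetical illustration of that idea, not anyone's actual design:

```python
from dataclasses import dataclass

@dataclass
class Goal:
    name: str
    importance: int  # how much the goal matters overall
    urgency: int     # how soon it must be acted on

# Hypothetical supergoals from the quoted example: air comes first;
# water is just as important but less urgent; saving lives ranks below both.
goals = [
    Goal("protect air supply", importance=3, urgency=3),
    Goal("protect water supply", importance=3, urgency=2),
    Goal("save individual lives", importance=2, urgency=3),
]

def next_goal(goals):
    # Rank by importance first; urgency breaks ties between equals.
    return max(goals, key=lambda g: (g.importance, g.urgency))

print(next_goal(goals).name)  # -> protect air supply
```

A conflict between goals then just means two goals demand incompatible actions at once, and the ranking decides which one wins.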

Any application on this level is generally the result of a long-term project
with a very specific goal. In other words, you can't just take the word
"evolve" you quote up there in the abstract and then fill in your own
unspecified speculation about what that process will look like. Eventually
some project is going to have the specific goal of "Artificial General
Intelligence". Even if that project relies on Google (why would it,
exactly?), it is going to be a direct and specific effort by a group of
people working on a specific project, not some vague "evolution" of an
application out of the population.

In fact, it all comes down to the math. And the state of the math is advanced
by specific individuals doing specific research projects in specific
directions (whether in academia, business, etc.). Once the math is there, the
application (and the Singularity) will follow shortly; I guarantee you that.

It's not about increasing the number of eggs or baskets in the population;
it's about finding the right "One Basket" to funnel those eggs into (i.e.
the project that is going to produce a successful and Friendly AGI, and do
so before anyone else does).

- hank



This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:57 MDT