RE: How hard a Singularity?

From: James Higgins (jameshiggins@earthlink.net)
Date: Wed Jun 26 2002 - 16:51:32 MDT


At 04:29 PM 6/26/2002 -0600, Ben Goertzel wrote:
>Now this conversation gets interesting...

And it's making progress...

>James Higgins wrote:
> > The committee is there for RISK MANAGEMENT. A task which should
> > very much
> > be done thoroughly on such a task as creating a Singularity. They do not
> > have to, collectively, understand all the inner working of the
> > design. They simply have to be convinced to a reasonable degree that the
> > design, as a whole, is safe. There are many such examples of this in
> > present day life, where an entity is responsible for ensuring safety. If
> > it is impossible for a group of 10 intelligent people to agree that it is
> > safe to launch a Singularity then, frankly, it shouldn't be launched.
>
>James, I can see your heart's in the right place... but, I'm afraid you may
>be overestimating the rationality of humans and human groups...

Maybe so. [Added after completing the remainder of the email: I don't think
so; I think we're at least in the same ballpark.]

>What if Bill Joy is on the committee? He's intelligent....

Obviously the members of the committee would be chosen very carefully. I
don't suggest that it be a public event or that just any person may
apply. Eliezer and yourself would be obvious choices. Actually, I think
if the two of you choose the remaining members I'd be ok with that. As
long as neither of you tried to load the deck and the committee didn't end
up filled mostly with people from your two projects, that is.

>The Singularity is a highly emotionally charged issue. There may be humans
>who are simply emotionally opposed to the Singularity, and unable to
>rationally balance the risk of not going forward vs. the risk of going
>forward.

Ah, which is why I said that members of the committee would have to WANT
(very much) to see the Singularity occur.

> > > The committee
> > >will pick out a set of Asimov Laws designed by Marvin Minsky in
> > accordance
> > >with currently faddish AI principles.
>
>Minsky is a tremendously smart guy -- he's a bit egomaniacal and cussed, but
>he's also basically a techno-optimist without a strong anti-Singularity
>bias. I have a great respect for him in pretty much all ways. I think he'd
>be a great choice for such a committee.

I've heard a bit about Minsky from another individual I know. So start
the committee off with the three of you. New members would be added by a
2/3 vote of you three, with a desired minimum membership of at least 5, I'd
say. Maximum to be decided by you three, but I'd say 25 would be way too high.

>My overall reaction to the "committee" idea is as follows.
>
>1) I don't think a governmentally-appointed committee is likely to work, for
>reasons similar to (but milder than) the ones Eliezer states. Such a
>committee's membership would be formed by "politics as usual," it would end
>up with a few Joy types on it as well as a few knee-jerk pro-Singularity
>types and some moderates -- and as such it would never reach a consensus on
>anything, ever, though it might lead to some interesting discussions.

For the record, I NEVER thought a government-formed, or even
industry-formed, committee would or could be effective. The committee would
have to be assembled carefully by people who believe strongly in the
Singularity, are intelligent, and whom we know.

>2) I do think that the decision of whether to launch the Singularity is too
>big for any one person, or any one dedicated research team. For one thing,
>it's just a HUGE decision; for another thing, there will be a tendency for
>any team to want to see their own AI "go all the way" -- an emotional factor
>that will be hard to overcome for anyone, no matter how mature they are.

Which is why, for such a significant event in human history, they should
not be the ones to decide.

>Because of factors 1) and 2), I think that whatever individual or group
>reaches near-human-level-AI is going to have to take it upon themselves to
>assemble an advisory board composed of individuals combining reasonable
>technical expertise and reasonable general wisdom. If I were in this
>position, I would certainly choose Minsky and Kurzweil, for example, but
>probably not Joy (although I can't say for sure, because I don't know him,
>and for all I know his real views might be milder than the ones he expressed
>in his polemical article and associated interviews). I do know Jaron
>Lanier, who holds Joy-ish views, and who I would *not* choose for my
>committee, because in spite of his intelligence I feel he holds irrationally
>"knee-jerk" anti-Singularity views. (And, yeah, Ok, I would choose Eliezer
>as well!)

It would be better on the whole to have a single committee if at all
possible. And, yes, I do understand the complications and reasons why this
is unlikely to happen. But that doesn't mean we should not try. If such a
committee were formed I would not have any issues with it being attached to
an entity such as the Singularity Institute, as long as most of its members
weren't answerable to or involved in their development work.

Such a committee should be formed and at least made available for other
projects to utilize.

>I think that a broader discussion group should *also* be assembled,
>involving the more articulate and rational of the rabid
>anti-Singularitarians (Joy, Lanier, etc.) as well as pro-technology people.
>This committee should be assembled in order to gather its opinions only,
>without a view toward decision-making.

An excellent idea.

>Government is not going to solve this problem. And I say this as someone
>with fairly democratic-socialist tendencies, not as a typical extropian
>libertarian. If government tries to manage the Singularity, it's just going
>to drive real Singularity development work underground or overseas. The
>problem has to be solved by maturity and responsibility on the part of the
>people doing the development, I feel. This is scary, but to me, it's less
>scary than thinking about the government handling something of such
>importance.... While it's true that gov't is generally good at halting
>action from occurring, by bogging it down in endless bureaucracy, gov'ts
>have also been responsible for a hell of a lot of fanatically unwise
>actions -- governmental involvement is far from a prescription for wisdom!!

I don't believe there is ANY POSSIBILITY that government could positively
contribute to this problem, much less solve it. I don't even want them to
try, even by providing grants (since money always has strings attached).

James Higgins



This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:39 MDT