From: Ben Goertzel (firstname.lastname@example.org)
Date: Wed Jun 26 2002 - 16:29:48 MDT
Now this conversation gets interesting...
James Higgins wrote:
> The committee is there for RISK MANAGEMENT. A task which should
> very much
> be done thoroughly on such a task as creating a Singularity. They do not
> have to, collectively, understand all the inner working of the
> design. They simply have to be convinced to a reasonable degree that the
> design, as a whole, is safe. There are many such examples of this in
> present day life, where an entity is responsible for ensuring safety. If
> it is impossible for a group of 10 intelligent people to agree that it is
> safe to launch a Singularity then, frankly, it shouldn't be launched.
James, I can see your heart's in the right place... but I'm afraid you may
be overestimating the rationality of humans and human groups...
What if Bill Joy is on the committee? He's intelligent....
The Singularity is a highly emotionally charged issue. There may be humans
who are simply emotionally opposed to the Singularity, and unable to
rationally balance the risk of not going forward vs. the risk of going
forward...
Similarly, there of course will be humans who are emotionally in love with
the Singularity, and are so biased in this way that they have a hard time
rationally balancing the risk of going forward versus not going forward...
> > The committee
> >will pick out a set of Asimov Laws designed by Marvin Minsky in
> >accordance with currently faddish AI principles.
Minsky is a tremendously smart guy -- he's a bit egomaniacal and cussed, but
he's also basically a techno-optimist without a strong anti-Singularity
bias. I have a great respect for him in pretty much all ways. I think he'd
be a great choice for such a committee.
My overall reaction to the "committee" idea is as follows.
1) I don't think a governmentally-appointed committee is likely to work, for
reasons similar to (but milder than) the ones Eliezer states. Such a
committee's membership would be formed by "politics as usual," it would end
up with a few Joy types on it as well as a few knee-jerk pro-Singularity
types and some moderates -- and as such it would never reach a consensus on
anything, ever, though it might lead to some interesting discussions.
2) I do think that the decision of whether to launch the Singularity is too
big for any one person, or any one dedicated research team. For one thing,
it's just a HUGE decision; for another thing, there will be a tendency for
any team to want to see their own AI "go all the way" -- an emotional factor
that will be hard to overcome for anyone, no matter how mature they are.
Because of factors 1) and 2), I think that whatever individual or group
reaches near-human-level-AI is going to have to take it upon themselves to
assemble an advisory board composed of individuals combining reasonable
technical expertise and reasonable general wisdom. If I were in this
position, I would certainly choose Minsky and Kurzweil, for example, but
probably not Joy (although I can't say for sure, because I don't know him,
and for all I know his real views might be milder than the ones he expressed
in his polemical article and associated interviews). I do know Jaron
Lanier, who holds Joy-ish views, and who I would *not* choose for my
committee, because in spite of his intelligence I feel he holds irrationally
"knee-jerk" anti-Singularity views. (And, yeah, OK, I would choose Eliezer.)
I think that a broader discussion group should *also* be assembled,
involving the more articulate and rational of the rabid
anti-Singularitarians (Joy, Lanier, etc.) as well as pro-technology people.
This group should be assembled in order to gather its opinions only,
without a view toward decision-making.
Government is not going to solve this problem. And I say this as someone
with fairly democratic-socialist tendencies, not as a typical extropian
libertarian. If government tries to manage the Singularity, it's just going
to drive real Singularity development work underground or overseas. The
problem has to be solved by maturity and responsibility on the part of the
people doing the development, I feel. This is scary, but to me, it's less
scary than thinking about the government handling something of such
importance.... While it's true that gov't is generally good at preventing
action from occurring, by bogging it down in endless bureaucracy, gov'ts
have also been responsible for a hell of a lot of fanatically unwise
actions -- governmental involvement is far from a prescription for wisdom!!
-- Ben G
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:39 MDT