RE: How Kurzweil lost the Singularity

From: Ben Goertzel (ben@goertzel.org)
Date: Sat Jun 15 2002 - 11:31:11 MDT


hey Eliezer,

FIRSTLY, I'm about to leave for a week-long camping trip, so I'll be offline
for the next week, starting in a few hours.... Thus I may not read your
reply to my e-mail for a week ;>

Now... I think you oversimplify things a little bit.

But ultimately, which one of us has better inferred Kurzweil's inner psyche
is not very important, huh?? ;>

> On every occasion in which I have spoken to
> Kurzweil, the concept of influencing the Singularity in any way
> is met with
> blank incomprehension.

I don't think so. My impression from talking to him is that he considers it
possible for individuals to affect the course of the Singularity. I think
he just takes a kind of "grand historical" perspective, rather than an
individually-focused perspective.

To take an example from another domain, in the grand historical perspective,
one might say "the emergence of quantum physics was inevitable." And in a
way it *was* inevitable, regardless of whether the particular individuals
Heisenberg, Schrödinger, Planck, etc. took up physics or gardening. On the
other hand, it was important that SOME individuals did the actual physics!
And it took some pretty smart individuals...

Similarly, from the grand historical perspective, it doesn't matter what you
or I personally do for the Singularity; it's gonna happen anyway. Yet it's
important that SOMEBODY does the things we're doing...

The problem is that this grand historical perspective is a little more
applicable to the emergence of a scientific theory than to the Singularity,
because the Singularity may have a much more "sensitive dependence on
initial conditions"....
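
To be concrete about what "sensitive dependence" means, here is a toy
Python sketch (the logistic map, the standard textbook example -- the
numbers are arbitrary and have nothing to do with the Singularity per se):
two trajectories that start one part in a billion apart end up in
completely different places.

    # Toy illustration of "sensitive dependence on initial conditions":
    # two logistic-map trajectories starting one part in a billion apart.
    x, y = 0.4, 0.4 + 1e-9
    for _ in range(60):
        x = 3.9 * x * (1 - x)  # r = 3.9 puts the map in its chaotic regime
        y = 3.9 * y * (1 - y)
    print(x, y)  # after 60 steps the two trajectories bear no resemblance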

Sure, if Wallace instead of Darwin had been seen as the main champion of
evolutionary theory, Neo-Darwinism might be a better theory today (more
focus on cybernetics). And if Einstein had not existed, general relativity
might have emerged only much later, so that the GUTs of today would be even
more strongly field-theory-focused rather than gravity-focused. So there is
*some* dependence on initial conditions in the development of a scientific
theory.

But with the Singularity, a different "twist" in the launch conditions could
lead to unrecoverable disaster.... This is less likely to happen in the
evolution of science. Hence the "grand historical perspective" that
Kurzweil favors is more problematic in regard to the Singularity than in
regard to most other historical phenomena...

> As far as Kurzweil is concerned, he wins the
> argument when he convinces the audience that the Singularity will happen.

Sure, but his choice of what argument to have with the public right now
does not tell you what his whole worldview is...

He's doing PR, and he knows how to do it better than either of us does.

> Kurzweil wants to believe in the benevolence and inevitability of the
> Singularity and any argument of the form "You can do X and it will improve
> your chances of (a Singularity) / (a positive Singularity)" appears to him
> to be a vulnerability in his argument: "The Singularity *could*
> (go wrong)
> / (not happen) if not-X." Kurzweil will therefore argue against it.
> Kurzweil's entire worldview prohibits the possibility of
> Singularity activism.

I think this is rather an overstatement.... I don't think you're fully
appreciating the nature of the "grand historical perspective."

Saying that "the emergence of X is inevitable as a consequence of countless
human actions" is not implying that "these countless human actions are
unimportant."

I think that *part* of what gets your goat about Kurzweil is that he
doesn't value YOUR actions any more highly than those of 100,000 other
scientists and engineers working on generally Singularity-focused advanced
technology ;>

> In fact, having watched Kurzweil debate Vinge, I've come to the conclusion
> that Kurzweil's worldview prohibits Kurzweil from arriving at any real
> understanding of the basic nature of the Singularity. Over the
> course of my
> personal interaction with Kurzweil, I've seen him say two really bizarre
> things. One was during the recent chat with Vinge, when Kurzweil
> predicted
> superhuman AI intelligence in 2029, followed shortly thereafter by the
> statement that the Singularity "would not begin to tear the
> fabric of human
> understanding until 2040".

He has a different estimate of the growth curve of intelligence in the
near-superhuman realm than you do.

He understands the idea of exponential intelligence increase through AI
self-modification; he just thinks the exponent will be smaller than you
think it will be.

I think he's overpessimistic and you're overoptimistic in this particular
regard, but we're all just grabbing exponents out of our asses here,
basically...
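
Just to show how much the choice of exponent matters, here is a toy
calculation (every number is made up -- the "fabric-tearing" threshold and
both growth rates are assumptions for illustration, not anybody's actual
estimate). With intelligence growing as exp(k * t) from human level in
2029, a small exponent takes about a decade to reach the threshold, while
a large one gets there in under a year:

    import math

    # Toy model: intelligence I(t) = exp(k * t), human level (I = 1) in 2029.
    # The threshold and both exponents below are made-up numbers.
    threshold = 1000.0  # assumed "fabric-tearing" multiple of human level
    for label, k in [("small exponent", 0.63), ("large exponent", 7.0)]:
        years = math.log(threshold) / k
        print(label, "-> threshold reached around", round(2029 + years, 1))

Same qualitative story ("exponential growth"), yet roughly an eleven-year
difference in when the threshold is crossed -- which is about the gap
between Kurzweil's 2029 and 2040 figures.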

> The second really bizarre thing I've heard
> Kurzweil say was at his SIG at the recent Foresight Gathering,
> when I asked
> why AIs thinking at million-to-one speeds wouldn't speed up the
> development
> of technology, and he said "Well, that's another reason to expect Moore's
> Law to remain on course."

I don't get this one... sounds like a miscommunication...

> What
> Kurzweil calls the "Singularity" is the inevitable, inexorable,
> and entirely
> ordinary progress of technology, which, in Kurzweil's world, *causes*
> developments such as transhumanity, but is not *changed* by transhumanity
> except in the same ways that industry has been changed by previous
> technological developments.

I do not think his understanding is this shallow, though I admit he may not
have thought through the dramatic implications of the Singularity as
thoroughly as some of us have.

> What Kurzweil is selling, under the brand name of the
> "Singularity", is the
> idea that technological progress will continue to go on exactly as it has
> done over the last century, and that the inexorable grinding of
> the gears of
> industry will eventually churn out luxuries such as superintelligent AIs,
> brain-computer interfaces, inloading, uploading, transhuman
> servants, and so
> on.

I think he's saying a bit more than that! I too am a bit disappointed by
his choice of emphasis in his Singularity writings -- but still, I don't
think his writings are as bad as you imply.

It is true that he does not adequately focus on the fact that,
post-Singularity, we're going to be in a TOTALLY UNKNOWN region, in which
reality and experience as we now know them MAY become totally irrelevant. I
think he does understand this, to some extent, but chooses not to focus on it.

On the other hand, I think you tend to downplay the possibility that *limits
to intelligence and progress* which we do not now understand or suspect might
be discovered.... I hope such limits are not found (or, if they are, that
they're not severe), but the possibility of such limits is part and parcel
of accepting that we are moving into a totally unknown region!

-- Ben G


