Re: Egan dismisses Singularity, maybe

From: Evan Reese
Date: Mon Apr 29 2002 - 05:44:13 MDT

----- Original Message -----
From: "mike99"
Subject: RE: Egan dismisses Singularity, maybe

> Right, Evan, that is the issue. Even the governing AIs for the
> upload "polis" domains in DIASPORA are not qualitatively different
> in intelligence from the upload-minds inhabiting those polises (at least,
> as far as I can tell).
> If Egan believes what his characters say in his latest book (as quoted by
> Damien B.) then apparently Egan views general intelligence as a process
> that can be speeded up and given more data processing resources without
> becoming qualitatively different. In this view, there will *not* be any
> emergent phenomena from faster thought processes that have denser data
> resources which could accurately be termed "superintelligence."
> No superintelligence, no Singularity.
> I certainly disagree with that view. It seems to me that Nature shows many
> examples of complex systems exhibiting new behaviors that one would not
> expect by a simple analysis of their component parts/subsystems. I believe
> this will probably also be true of superhuman levels of intelligence.

Unfortunately, Egan seems unwilling - or unable? - to make a real break with
the known. Take a look at the end of _Quarantine_ for a good example. He
had everything all set up for a transition into something radically
different, and SOMEHOW the whole thing collapses. I was majorly
disappointed. If anyone could depict really radical change, I would think
it would be Egan; but even in his seemingly most imaginative work, he holds
back.

This isn't just a problem for Egan. Benford's _Sailing Bright Eternity_ is
the most disappointing book I have ever read. He not only had everything
all set up for a transhuman emergence, he even promised it, telling us that
Mech/human cooperation was necessary for their survival in the far future.
(True, he didn't actually mention a merger of the two, but he does tell us
that the so-called Highers evolved from Mechs and Naturals. So why no such
evolution for humans?) He even broke his own word in the process of
destroying the series: in his afterword to _Artifact_ he says that he is no
longer going to write - or is no longer interested in writing about, I
forget exactly, but it doesn't matter for my point - a 'cozy cosmos'. I
remember that phrase well enough. But at the end of SBE, he has the Highers
settling the humans in 'a comfortable lane in the esty'. I guess when the
rubber hit the road, he wasn't up to it.

In Clarke's _Profiles of the Future_, he cites 'failure of imagination' and
'failure of nerve' as the two reasons why people so often underestimate the
future. I really have trouble ascribing failure of imagination to either of
these authors. I think the real trouble is that writers who consider
themselves 'hard sf' writers just don't deal with transhumanity or
singularities. Perhaps, in their view, that sort of thing is closer to
mysticism than to hard science. Vinge makes this point explicitly in
_Marooned in Realtime_ when Della is telling Wil about her view of the
Singularity. That's true of Benford, anyhow; I don't know whether Egan
considers himself one of these writers or not. So then, if you consider
yourself a hard SF writer, you gotta go through contortions - or just
ignore your own previous statements - to explain why the far future looks
so little different from our present. Sure, it's difficult to describe
superintelligence, or any other kind of break with the current state; but
people like Bear, Vinge and Clarke at least take their shots at it. They
acknowledge that truly radical breaks with the known are not only possible
but likely, on the simple assumption that since such breaks have happened
in the past, they will probably happen in the future. As time goes by, the
attempts at avoiding these breaks look increasingly unrealistic, and in
Benford's case at least, truly absurd.

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:38 MDT