From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Sat Jun 22 2002 - 04:26:27 MDT

Smigrodzki, Rafal wrote:
> Eliezer S. Yudkowsky wrote:
>
>> Vernor Vinge's original stories of
>> the Singularity were based around the blazing disruptive power of
>> smarter-than-human intelligence
>
> ### You build your business plan on the idea of a hard-takeoff Singularity.
> I agree this is a prudent decision - the "slow" variant is much less
> dangerous, and would involve a huge number of contributors, with
> potentially less opportunity for a small group to make a difference. It
> is reasonable to concentrate on the scariest scenario, even if it isn't
> the most likely one.

I do not consider a soft Singularity to be any less scary than a hard
Singularity. I think this is wishful thinking. A Singularity is a
Singularity; the Singularity doesn't come in a "soft" version that lets you
go on a few dates before deciding on a commitment. That option might be
open to individual humans (or not), but it is not a real option for humanity.

> While I also believe that the fast Singularity is a distinct possibility,
> I don't have an intuition which would help me decide which variant is
> more likely and by what odds.
>
> What is your intuition? Is it a toss-up, a good hunch, or (almost) dead
> certainty?

I would call it dead certain in favor of a hard takeoff, unless all the
intelligences at the core of that hard takeoff unanimously decide otherwise.
All economic, computational, and, as far as I can tell, moral indicators
point straight toward a hard takeoff. The Singularity involves an inherent
positive feedback loop; smart minds produce smarter minds which produce
still smarter minds and so on. Furthermore, thought itself is likely to
fall through to a much faster substrate than our 200Hz neurons. The closest
we might come to a slow Singularity is if the first transhumans are pure
biological humans, in which case it might take a few years for them to build
AI, brain-computer interfaces, or computer-mediated broadband telepathy with
64-node clustered humans, but my guess is that the first transhumans would
head for more powerful Singularity technologies straight out of the gate.
Beyond that point it would be a hard takeoff. I see no moral reason for
slowing this down while people are dying.
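
As a toy numerical sketch of that feedback loop, assume each generation of
minds is 1.5 times smarter than the last and designs its successor in time
inversely proportional to its own intelligence (both constants are arbitrary
illustrations, not figures from this exchange):

# Toy model of the positive feedback loop: smarter minds design still smarter
# minds, and do so faster. The 1.5x gain per generation and the 10-year
# baseline design time are assumptions chosen only for illustration.

def hard_takeoff(generations=30, gain_per_generation=1.5, base_design_years=10.0):
    intelligence = 1.0    # human-equivalent baseline
    elapsed_years = 0.0
    for gen in range(1, generations + 1):
        design_time = base_design_years / intelligence   # smarter minds work faster
        elapsed_years += design_time
        intelligence *= gain_per_generation
        print(f"gen {gen:2d}: intelligence {intelligence:9.1f}x, "
              f"elapsed {elapsed_years:5.2f} yr (+{design_time:.5f})")

if __name__ == "__main__":
    hard_takeoff()

Under these assumptions the elapsed calendar time converges to a finite bound
(the geometric series sums to about 30 years) while intelligence grows without
limit, so nearly all of the growth is packed into the final sliver of the
timeline; changing the constants moves the bound but not the shape of the curve.
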
I would say that the Singularity "wants" to be hard; the difficulty of
keeping it soft would increase asymptotically the farther you got, and I see
very little point in trying.
--
Eliezer S. Yudkowsky                                http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence