From: Russell Wallace (russell.wallace@gmail.com)
Date: Tue Feb 07 2006 - 23:49:24 MST
On 2/8/06, Olie L <neomorphy@hotmail.com> wrote:
>
> Furthermore, the longer it takes to develop an AI that can improve AI (~~
> Seed AI), the more likely it is to create a faster take-off. Which is more
> likely to create a "bad" situation.
Though one could argue that the more time goes by without this occurring,
the higher the subjective probability estimate should become that I'm right
about hard takeoff being impossible.
> > Russell Wallace wrote:
> > > I don't think the Singularity is inevitable.
>
> Key word: Inevitable.
>
> Very very different from "highly likely"
Well, I'm not sure the Singularity is highly likely either. I can't put a
number on it; for all I know, the probability that we'll reach it could be
either above or below 50%.
> > 1. De facto world government forms, with the result that progress goes
> > the way of the Qeng Ho fleets. ...
>
> We'll put this under "regulation", then, shall we?
Yes; it wouldn't necessarily have to be a single polity like China after the
Ming Dynasty or Japan under the Tokugawa Shogunate; separate over-regulation
in a sufficiently large proportion of the countries with an advanced
industrial base could have the same effect.
> > 2. Continuing population crash renders progress unsustainable. (Continued
> > progress from a technology base as complex as today's requires very large
> > populations to be economically feasible.)
>
> This could be categorised more generally as a contributing factor to severe
> economic recession.
Yes. A modern chip factory, for example, costs several billion dollars, and
the cost rises with each generation of semiconductors; this sort of
development is only sustainable with the markets that a large, thriving
economy can provide.
> Similarly (4) - "total catastrophe" - doesn't have to be anything like an
> existential threat. Sufficient economic recession will impede technological
> development, particularly AI development.
Yep. A catastrophe such as a major nuclear war or another Dinosaur Killer
wouldn't wipe us out completely, but it could make recovery slow enough that
other factors, such as the next Ice Age, come into play, resulting in a
downward spiral from which we never emerge.
> 7) Engineering challenges on AGI - a variant on (5) - unforeseen limit
>
> I can't say, and I don't know that anyone else can reasonably deny it with
> sufficient knowledge: there may be impediments that slow the development of
> AGI by many, many decades. By that stage, other forms of technological
> development may be advanced enough that the "rapid takeoff" element of AGI
> won't have the same disjunctive impact that it would in the next century.
Well, I don't think hard takeoff is possible, so I think (7) definitely
applies. I don't see that as a problem, though; a slow-takeoff Singularity
could work fine.
- Russell