RE: supergoal stability

From: Ben Goertzel (ben@goertzel.org)
Date: Fri May 03 2002 - 19:39:55 MDT


> > > The Singularity is not a complex nonlinear dynamical
> > > process - it is alive
> >
> > I don't see how you can say the Singularity is "alive." How are you
> > defining life? Normally it is defined in terms of metabolism and
> > reproduction.
>
> "Life is anything designed primarily by evolution, plus all beings above a
> certain level of intelligence." This is my old definition, which I still
> like today because it manages to include cows, viruses, mules, humans, and
> AIs.

OK, but by this definition, is the Singularity alive? I don't see why it
would be.

A Singularity is neither designed by evolution nor intelligent.

> > Eliezer, "a Friendly AI reasoning about morality is A NONLINEAR DYNAMIC
> > SYSTEM" is not intended by me as a METAPHOR.
> >
> > It is actually a precise mathematical statement, using commonly defined
> > mathematical terms.
>
> But you were using it as a metaphor rather than as a mathematical
> statement. You pointed out the butterfly effect in weather systems, which
> is a behavior for one kind of nonlinear dynamic system, and drew a
> correspondence to a Friendly AI modifying its own goals. In other words,
> you cited a prototypical example of a nonlinear dynamic system and argued
> that a Friendly AI, which is a *very special* kind of nonlinear dynamic
> system, will inherit the stereotypical qualities attributed to
> prototypical
> examples of nonlinear dynamic systems: UNPREDICTABILITY,
> UNCONTROLLABILITY,
> NONINTENTIONALITY. That's a metaphor if ever I saw one. Humans are
> nonlinear dynamic systems, mathematically speaking, but an extremely
> noncentral case of nonlinear dynamic systems, one in which it is useful to
> attribute intentionality to the system's final state because the system
> *does in fact* have intentions.

Yeah, of course, weather systems are an accurate metaphor only for very
limited aspects of mind process.

However, they are not quite as bad a metaphor as you imply, in my view.

Like weather systems, human minds are hard to predict (except in a very
rough statistical way) and hard to control (though not impossible), and
they display a complex assemblage of strange attractors and complex
transients.
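
(To make "hard to predict except statistically" concrete, here is a
minimal sketch in Python, using the textbook logistic map rather than
anything specific to minds or weather. Two trajectories whose initial
conditions differ by one part in a million decorrelate completely within
a few dozen iterations; that is the kind of sensitivity I have in mind.)

# Sensitive dependence on initial conditions in the logistic map,
# x_{n+1} = r * x_n * (1 - x_n), in its chaotic regime (r = 4).
def logistic_trajectory(x0, r=4.0, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.400000)
b = logistic_trajectory(0.400001)   # initial condition differs by 1e-6

for n in (0, 10, 20, 30, 40, 50):
    print("step %2d: |a - b| = %.6f" % (n, abs(a[n] - b[n])))
# The gap grows from 1e-6 to order 1, so long-range point prediction
# fails, even though the statistics of the attractor stay well-defined.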

Of course, there are very many differences between human minds and weather
systems as well.

And it's quite reasonable to maintain that a digital mind will be *less*
chaotically unpredictable than the human mind, though I'm not so sure of
this myself.

-- ben


