From: Ben Goertzel (email@example.com)
Date: Sun Jan 28 2001 - 11:18:49 MST
I think you sell evolution short... it's a much more general and powerful process than you give it credit for.

I tend to buy into those theories of the origins of the universe, like John Wheeler's, which argue that in the early universe, physical law itself evolved...

Furthermore, I believe that complex high-level cognition is ITSELF an evolutionary process.
This is a technical point at the fore of our AI development within Webmind Inc. at this exact moment. We have some nice higher-order inference going on, but the key issue is cognition control: how, in practice, do you control the direction of inference when it is applied to complex tasks like learning schemata for acting, language processing, and so on? The heuristics found in the standard AI literature just don't work in practice. Evolution, I submit, is the only effective method for general cognition control: evolution forms complex logical relations, and then higher-order inference verifies their utility or lack thereof. I don't have a mathematical proof that this is the only effective way to do things -- so I could be wrong. But no one has ever found another way in their theoretical or practical AI work, and there is much evidence that the human brain is itself evolutionary (see Edelman's work on Neural Darwinism, and earlier work by neuroscientists like Vernon Mountcastle and Szentágothai).
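The loop described above -- evolution proposing candidate structures, a verifier culling them by measured utility -- can be sketched as a minimal genetic algorithm. This is a toy illustration only, not Webmind's actual code: the bitstring target stands in for a "complex logical relation," and the `verify` function stands in for higher-order inference; all names and parameters are made up.

```python
import random

random.seed(0)

TARGET = [1] * 20  # toy "useful relation": the all-ones bitstring


def verify(candidate):
    """Stand-in for higher-order inference: score a candidate's utility."""
    return sum(1 for a, b in zip(candidate, TARGET) if a == b)


def mutate(candidate, rate=0.05):
    # Flip each bit with small probability.
    return [bit ^ 1 if random.random() < rate else bit for bit in candidate]


def crossover(a, b):
    # Single-point crossover of two parents.
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]


def evolve(pop_size=40, generations=100):
    # Evolution proposes candidate structures at random...
    population = [[random.randint(0, 1) for _ in range(len(TARGET))]
                  for _ in range(pop_size)]
    for _ in range(generations):
        # ...and the verifier ranks them; only the useful half survives.
        population.sort(key=verify, reverse=True)
        parents = population[:pop_size // 2]
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        population = parents + children
    return max(population, key=verify)


best = evolve()
```

The point of the sketch is the division of labor: the evolutionary operators (`mutate`, `crossover`) know nothing about the task, while all task knowledge lives in the verifier -- which is what lets the same control loop steer very different kinds of inference.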
> -----Original Message-----
> From: firstname.lastname@example.org [mailto:email@example.com]On Behalf
> Of Eliezer S. Yudkowsky
> Sent: Sunday, January 28, 2001 1:07 PM
> To: SL4
> Subject: Beyond evolution
> Evolution is the simplest way for a system to evolve greater complexity in
> the absence of intelligence. Not the best way - the simplest way. The
> first way hit upon. Evolution prevails, not because it's better, or best,
> or morally right in any way whatsoever, but because it's there and there
> are no forces acting against it.
> Luke: "Is the Dark side stronger?"
> Yoda: "No! no...quicker, easier, more seductive..."
> Evolution is not the brilliant idea of solving the problem more
> effectively by subtracting intelligence; evolution is the result of adding
> the *constraint* that the problem must be solved in a way that does not
> invoke intelligence.
> I'm steeped in the antiteleological precautions of evolutionary
> psychology. You're spending your time with the computer science version
> of evolution - one in which "evolutionary programming" is a brilliant way
> to overcome some of the inherent limits of human intelligence using
> processes that don't invoke general cognition. (Of course, the processes
> *can't* invoke general cognition because nobody's programmed that yet.)
> Evolutionary programming is not superior to a seed AI with a modality,
> 2 GHz transistors, and patience; but EP can reach spaces inaccessible to a
> human, with no modality, who operates in 200 Hz time and is easily bored
> and is especially bored by simple things; a human who instinctively tries
> to use all those parallel neurons on each design attempt, and has neither
> the patience nor the time to use 10^14 synapses to test lots of simple
> local optimizations.
> Evolution is not something I like. Evolution is something that *is*.
> Moreover, it's a something-that-is that I think humanity (and our new
> friends) should move away from. I think evolution is something we should
> grow out of as we grow up. Evolution is not the best way, or even a good
> way, it's simply the first way. In casting aside evolution, we will lose
> nothing, gain everything, because there is nothing whatsoever that
> evolution can do that can't be done by a sufficiently powerful general
> intelligence. Humans are not "sufficiently powerful", but a seed AI is.
> Evolution is what happens in the *total absence* of morality or
> intelligence. I'm not talking about the evolution of moral and
> intelligent beings, which has been known to happen; I mean that morality
> and intelligence have no influence on the systemic structure of evolution
> itself. Evolution is the way things are because it's the first way that
> unintelligent reality hits upon. Evolution, like death, like pain, like
> the constant struggle to survive, is a part of default-state reality that
> humanity shall CONQUER as we attain our place in the Universe. I believe
> in the *triumph* of altruistic general intelligence over evolution as part
> of the Singularity.
> I think that's the underlying reason why we disagree about the ability of
> evolution to affect the Singularity: you see evolution as a force that's
> strong and necessary; I see it as a force that's pretty weak compared to
> intelligence, and was never all that attractive to begin with.
> > Because all my experimentation with genetic algorithms shows that,
> > for evolutionary processes, initial conditions are fairly irrelevant.
> > The system evolves fit things that live in large basins of attraction,
> > no matter where you start them.
> Well, yeah, sure... in the total absence of moral intelligence. What else
> would you expect?
> -- -- -- -- --
> Eliezer S. Yudkowsky http://intelligence.org/
> Research Fellow, Singularity Institute for Artificial Intelligence
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:35 MDT