From: Eugen Leitl (firstname.lastname@example.org)
Date: Sat Jun 15 2002 - 05:05:28 MDT
On 14 Jun 2002, James Rogers wrote:
> GP is good for finding optimal solutions within well-defined
> boundaries and metrics though, at least in terms of being tractable
> for practical purposes. AFAIK, it fares much poorer at unconstrained
> discovery type problems.
I agree with this assessment, but current limitations in GP don't mean the
method has a built-in ceiling. (It has produced us, and there is still no
limit in sight to what it will produce via us as proxies.)
There's a built-in assumption that mutation and selection are all there is
to evolutionary algorithms. However, evolutionary algorithms are a
metamethod which merely uses the above for bootstrapping. Higher-order
methods are considerably more complicated, and still largely unknown (GP
relates to evolutionary algorithms as ANNs relate to biological neurons).
If you want to use evolutionary algorithms for nontrivial results, you
need a lot of molecular hardware, an efficient mapping of your coding onto
that hardware, and lightning-fast fitness evaluation (so you need a good
general physics simulator, or at least the outlines of one for
bootstrapping), plus lots of patience and a green thumb for picking the
right starter vectors and tweaking the fitness selection.
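To make the bootstrap loop above concrete, here is a minimal sketch of the
mutate/evaluate/select cycle in Python. Everything here is illustrative: the
toy OneMax fitness function (count the ones in a bit vector) stands in for
the physics simulator, the random initial population stands in for the
starter vectors, and truncation selection stands in for whatever fitness
selection you end up tweaking.

```python
import random

def fitness(genome):
    # Toy stand-in for the fitness evaluation: count of ones (OneMax).
    # A real run would call out to a fast physics simulator here.
    return sum(genome)

def mutate(genome, rate=0.02):
    # Flip each bit independently with probability `rate`.
    return [b ^ (random.random() < rate) for b in genome]

def evolve(pop_size=50, genome_len=64, generations=200):
    # "Starter vectors": a random initial population.
    pop = [[random.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:pop_size // 2]  # truncation selection
        # Refill the population with mutated copies of survivors.
        pop = survivors + [mutate(random.choice(survivors))
                           for _ in survivors]
    return max(pop, key=fitness)

random.seed(0)
best = evolve()
print(fitness(best))  # converges toward genome_len
```

The point of the sketch is how little of the hard part it contains: the loop
itself is trivial, and all the difficulty lives in the fitness function and
the genome-to-hardware mapping, exactly as argued above.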
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:39 MDT