From: Mike Dougherty (msd001@gmail.com)
Date: Fri Feb 24 2006 - 20:42:44 MST
Please explain.
N nodes in a binary tree are traversed very differently from the same N
nodes in a simple linked list. If the nodes contain binary-tree links as
well as a straight-path list of links, a new traversal method is
introduced. If these nodes are neurons, and the operation of the brain is
capable of creating and modifying dendrites/axons, it would seem that the
opportunity to increase complexity is immense.
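
Roughly what I mean, as a toy Python sketch (the Node class and the
particular wiring here are my own made-up illustration, nothing more): the
same handful of nodes can be walked in tree order or in list order, and the
two orders need not agree.

class Node:
    def __init__(self, value):
        self.value = value
        self.left = None    # binary-tree link
        self.right = None   # binary-tree link
        self.next = None    # straight-path list link

def in_order(node):
    # Tree-style traversal: left subtree, node, right subtree.
    if node is None:
        return []
    return in_order(node.left) + [node.value] + in_order(node.right)

def list_order(node):
    # List-style traversal: just follow the 'next' links.
    out = []
    while node is not None:
        out.append(node.value)
        node = node.next
    return out

a, b, c = Node("a"), Node("b"), Node("c")
b.left, b.right = a, c    # as a tree rooted at b: in-order visit is a, b, c
c.next, a.next = a, b     # as a list starting at c: visit is c, a, b

print(in_order(b))        # ['a', 'b', 'c']
print(list_order(c))      # ['c', 'a', 'b']
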
If each node visit stores a copy of every previous traversal, such that each
node contains the history of every traversal that has touched it, the
"direction" of the next new connection can be computed. If that computation
requires analyzing the state of every other non-current node, the
computational cost increases with every iteration. That does not seem
desirable if "we" want to get a result OUT of this system.
If the software recursion is based on a computronium substrate, then perhaps
the increasing information density is compensated by an increase in
bandwidth due to the increased interconnectivity. I'm thinking of a Menger
Sponge <http://en.wikipedia.org/wiki/Menger_sponge>, where surface area
corresponds to information density and volume corresponds to retrieval
latency. Maybe that isn't relevant to anything, but it's an interesting
mental picture :)
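
For the curious, the numbers behind that picture (unit Menger sponge,
standard formulas -- the little script is just my own arithmetic aid): each
iteration keeps 20 of the 27 sub-cubes, so the volume heads toward zero
while the surface area grows without bound.

def menger(n):
    cubes = 20 ** n                                   # sub-cubes kept at step n
    volume = (20.0 / 27.0) ** n                       # total volume -> 0
    surface = 2 * (20.0 / 9.0) ** n + 4 * (8.0 / 9.0) ** n   # surface -> infinity
    return cubes, volume, surface

for n in range(5):
    cubes, volume, surface = menger(n)
    print("step %d: cubes=%d  volume=%.3f  surface area=%.1f"
          % (n, cubes, volume, surface))
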
On 2/22/06, Ben Goertzel <ben@goertzel.org> wrote:
>
> Eliezer,
>
> What I suggested is that it is impossible for a program/computer
> combination to recursively self-improve its hardware and software in
> such a way that it can both
>
> a) increase dramatically the algorithmic information of its software
>