From: James Rogers (jamesr@best.com)
Date: Sat Jan 26 2002 - 20:31:34 MST
On 1/26/02 5:13 PM, "Ben Goertzel" <ben@goertzel.org> wrote:
> Coincidentally,
> last night I was doing some simple calculations regarding the time
> complexity of learning computer programs and I obtained some results that
> sort of point in this direction. Of course, it's all very heuristic and
> inconclusive, but it's food for thought...
...
> As I said, this little narrow calculation is in no way conclusive about
> anything general. It's not even directly relevant to my own AI work, since
> both Webmind and Novamente use evolutionary methods only to create *small*
> program trees, which are then pieced together using other techniques
> (association-finding, probabilistic inference). But I share it because it's a
> concrete example of a general *type* of result that we've been vaguely talking
> about on this list for a while: Results showing that after a brain/mind
> reaches a certain *very large* size, learning and adapting it becomes
> significantly more difficult.
...
> What this indicates is that there *may* be tricky mathematical issues that
> arise when one tries to make programs vastly exceeding the human brain in
> intelligence. The human brain has about 10^12 neurons, which is interesting
> in light of this calculation....
While I don't have anything definitive to add here, I've noticed the same
pattern when doing back-of-the-envelope calculations for parameter
optimization and design scalability. My calculations covered different
aspects than the one mentioned above, mostly for my own amusement and
curiosity, but I too have noticed that some cognitive processes start to
get computationally very expensive as things scale into what I would
estimate to be human-level and above. Basically, they are exponential
functions that are essentially linear for small cases but really hit the
ramp when things get "large", meaning n > 10^9 (+/- three orders of
magnitude, depending on what I'm looking at). If my numbers mean anything
(and they probably don't), they would suggest that the human brain sits in
a sweet spot in terms of bang for the evolutionary buck.
I was going to mention it eventually, but I don't have a solid enough basis
to start throwing around assertions. After all, saying that there might be
fundamental difficulties in getting really smart AI on this list is akin to
saying "fuck" in your Sunday school class. ;-) But since Ben mentioned it,
I'll throw in a "me too". BTW, I always heard that there were ~10^10
neurons in the human brain (or perhaps that is just in the parts that have
some cognitive utility).
That said, I *think* there are some solid tricks for getting around these
problems (at least for the cases I was working with), but they are nothing
I would ever expect to evolve in biological wetware.
Cheers,
-James Rogers
jamesr@best.com