RE: AI hardware was 'Singularity Realism'

From: Keith Henson (hkhenson@rogers.com)
Date: Sat Mar 06 2004 - 21:00:18 MST


At 03:22 PM 06/03/04 -0800, you wrote:
>Keith,
>
>I agree with Ben's comments on hardware capability. I personally do not see
>hardware as a show-stopper (even at this stage, and certainly not within
>5-10 years) -- however, I assume that some specialized designs will be
>needed (eg. FPGA).
>
> >....But I don't think anyone right now has a clear idea of what it would
>take
>to build an AI by other means than a simulation of a human brain.
>
>I think that Ben & I (and some others) can make a reasonable case that we do
>have a fairly clear idea: http://adaptiveai.com/research/index.htm

You have the right words in there:

    "The most direct path to solving these long-standing problems is to
conceptually identify the fundamental characteristics common to all
high-level intelligence, and to engineer systems with this basic
functionality, in a manner that capitalizes on human and technological
strength."

But after a quick read, I can't see where you actually have a handle on
"fundamental characteristics common to all high-level intelligence."

For example, under "Cost and Difficulty" near the bottom you have this
statement:

      "Having said this, I do believe that very substantial resources will
be required to scale up the system to human-level storage and processing
capacity."

This is partly wrong and (I think) partly right. Human information storage
is *abysmal.* Cog-Sci and information theory people who have looked at it
in dozens of studies come to the dismal conclusion that accessible human
memories are formed at 6-8 bits per second. Over a lifetime it is
something like 140 Mbytes. [I was wrong too, see below.]

(It is probably a decade since I had a disk that small. Going through old
disks recently I found stuff I had written over ten years ago that I had no
memory of having done *at all.*)

Here is a pointer. Now that I've used the net to look this up, I remember
Ralph talking about it at a party.

http://www.merkle.com/humanMemory.html

      "Because experiments by many different experimenters were summarized
and analyzed, the results of the analysis are fairly robust; they are
insensitive to fine details or specific conditions of one or another
experiment. Finally, the amount remembered was divided by the time allotted
to memorization to determine the number of bits remembered per second.

      "The remarkable result of this work was that human beings remembered
very nearly two bits per second under all the experimental conditions.
Visual, verbal, musical, or whatever--two bits per second. Continued over a
lifetime, this rate of memorization would produce somewhat over 10^9 bits,
or a few hundred megabytes."

Human memory almost certainly has the bit rate and capacity it does because
that was optimal for our Pleistocene ancestors. (Less than 2 bits per
second might not have been enough to remember your way back to camp.)

Computers with tens to hundreds of Gbytes of disk are way, way ahead of people.
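Merkle's back-of-the-envelope figure is easy to check. A minimal sketch in Python, assuming (my assumptions, not necessarily Merkle's exact ones) a 70-year lifetime and memorization at 2 bits per second around the clock:

```python
# Rough check of Merkle's lifetime human-memory estimate.
# Assumed: 70-year lifetime, 2 bits/second continuous memorization.
SECONDS_PER_YEAR = 365.25 * 24 * 3600   # ~3.16e7 seconds

lifetime_seconds = 70 * SECONDS_PER_YEAR
bits = 2 * lifetime_seconds             # total bits memorized
megabytes = bits / 8 / 1e6              # convert bits -> megabytes

print(f"{bits:.2e} bits, about {megabytes:.0f} MB")
```

This gives roughly 4.4 x 10^9 bits, about 550 MB, consistent with Merkle's "somewhat over 10^9 bits, or a few hundred megabytes."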

[Going into meta mode, I vaguely remembered that people absorb data at a
few bits per second. My memory, as I wrote it down at first, was 6-8 bits
per second. Wanting to cite this (and not being completely certain) I put
"human memory" and "bits per second" in Google and Ralph Merkle's paper
giving the correct number was the first link.]

People sometimes *seem* to remember a lot more. But that's almost
certainly the massive processor power filling in the scenes with "stock
footage" or making it up out of whole cloth. Here is a pointer, you can
find more with "confabulation" in Google.

Confabulation - The New England Skeptical Society's Encyclopedia:
      "Confabulation is the filling in of gaps in memory to make a coherent
story. ... Confabulation often occurs during hypnosis. ..."
http://www.theness.com/encyc/confabulation-encyc.html

>Directly engineering human-level AI cognitive ability will require
>substantially less computing power than reverse-engineering the brain or
>uploading.

This *looks* like a valid assumption. Ralph even supports it at the end of
his paper:

      ". . . his estimate of memory capacity suggests that the capabilities
of the human brain are more approachable than we had thought. While this
might come as a blow to our egos, it suggests that we could build a device
with the skills and abilities of a human being with little more hardware
than we now have--if only we knew the correct way to organize that hardware."

But one thing that 30 years of studying evolution has finally beaten into
me is that whatever evolved system you are looking at is probably much
more elegant than you first think. I think Ralph was way off about the
processing power a human brain burns, and that Calvin is probably right
about cortical columns that "sing" a thought across the brain surface,
the importance of sequencing, and millisecond-scale evolution.

We probably get away with relatively little information storage because we
can take a tiny vague memory and fill it in with lots of processing.

Keith Henson

PS. Maybe AI isn't even a desirable goal. Can you imagine a world full of
know-it-all smart asses even worse than me? :-)



This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:46 MDT