Re: PAPER: Levels of Organization in General Intelligence

From: Eliezer S. Yudkowsky (
Date: Mon Apr 08 2002 - 09:59:19 MDT

Ben Houston wrote:
> I've seen some truly amazing things done in the computational
> pharmacology field dealing with cheap, but massive parallelization.
> Basically, a lot of short cuts are available in the parallelization of
> an algorithm once you've solidified it. In other words, making a
> problem-solving algorithm parallel is difficult and costly in the general
> case, but in a specific case it can be quite cheap. The field of computational
> pharmacology is working with special purpose multi-teraflop machines
> that cost less than $1,000,000 US for a year or so now.

If you can run exactly the same algorithm on each of a billion pieces of
data with no interaction between instances of the algorithm, naturally
parallelization is easy.
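A minimal sketch of this "embarrassingly parallel" case, with a hypothetical `score_compound` standing in for whatever per-datum computation is actually being run:

```python
from concurrent.futures import ThreadPoolExecutor

def score_compound(compound):
    # Hypothetical stand-in for one expensive, self-contained
    # computation (e.g. scoring a single candidate molecule).
    return sum(ord(c) for c in compound) % 97

def score_all(compounds):
    # The identical algorithm runs on each piece of data with no
    # interaction between instances, so the work splits cleanly
    # across however many workers are available. (CPU-bound work
    # would use processes; threads keep the sketch simple.)
    with ThreadPoolExecutor() as pool:
        return list(pool.map(score_compound, compounds))
```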

> >>>>>>>>>>>>>>>>>>>>>>>
> Even if software parallelism were well-supported, AI developers will
> still need to spend time explicitly thinking on how to parallelize
> cognitive processes - human cognition may be massively parallel on the
> lower levels, but the overall flow of cognition is still serial.
> <<<<<<<<<<<<<<<<<<<<<<<
> Cognition, in my opinion, is quite parallel at all levels. There are,
> in my understanding, only a few bottlenecks in the brain that force
> things to become serial. An obvious example would be the serial nature
> of linguistic output.

One serial bottleneck is enough to render overall consciousness serial. The
human brain, having been rendered serial, is adapted as a whole to serial
deliberation, no matter how many subprocesses are massively parallel.
Sequiturs may run massively parallel searches to find thoughts, but it's
only one thought that wins.
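The shape of that claim can be sketched directly: many searches proceed in parallel below, but a single selection step at the top makes the overall flow serial. Both `score` and the shard layout here are toy assumptions, not a model of the actual mechanism:

```python
from concurrent.futures import ThreadPoolExecutor

def score(cue, thought):
    # Toy similarity metric: count of shared characters
    # (illustrative only).
    return len(set(cue) & set(thought))

def search_shard(cue, memory_shard):
    # One parallel subprocess: search its own shard of memory for
    # the candidate thought best matching the cue.
    best = max(memory_shard, key=lambda t: score(cue, t))
    return best, score(cue, best)

def think(cue, memory_shards):
    # Massively parallel search below...
    with ThreadPoolExecutor() as pool:
        candidates = list(pool.map(lambda s: search_shard(cue, s),
                                   memory_shards))
    # ...but one serial bottleneck at the top: only one thought wins.
    winner, _ = max(candidates, key=lambda c: c[1])
    return winner
```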

> >>>>>>>>>>>>>>>>>>>>>>>
> We know it is possible to evolve a general intelligence that runs on a
> hundred trillion synapses with characteristic limiting speeds of
> approximately 200 spikes per second.
> <<<<<<<<<<<<<<<<<<<<<<<
> 200 spikes/sec is probably the median for the brain. Some neurons I've
> studied in my courses have upper limits around 1000 spikes/sec.

Hence "approximately".

> Neglecting the sensory and motor systems, I believe that in the CNS 'S'
> would be upwards of at least 5 as a result of the DAG-like arrangements
> of the signal processing pathways -- ignoring backwards, regulatory
> projections.

'S' is measured with respect to signal propagation speed measured in clock
ticks, not the characteristic number of links. Maybe I should clarify this
in the text. Or just check out Anders Sandberg's original paper.

> >>>>>>>>>>>>>>>>>>>>>>>>
> Memory association may or may not use a "compare" operation (brute force
> or otherwise) of current imagery against all stored memories, but it
> seems likely that the brain uses a massively parallel algorithm at one
> point or another of its operation; memory association is simply a
> plausible candidate.
> <<<<<<<<<<<<<<<<<<<<<<<<
> It seems plausible that the brain uses a resonance-like compare
> function. Basically, a match may be recognized when a neural assembly
> finds its group-firing greatly facilitated as a result of the
> presented/remembered stimulus. Sort of like how a glass will vibrate
> when exposed to its natural resonance frequency.

It's easy to postulate "resonance", and in fact, I actually did. But you
have to explain the specific similarity metric before postulating
"resonance" as "a compare/similarity/clustering operation of some kind,
implemented on a neural substrate using feedforward and feedback
connections, and synaptic computing in those huge dendritic trees to
establish long-term potentiation with the memory's proper cues" really says
anything more than "a compare operation implemented in the same hardware
devoted to storing the memories"; everyone knows what neural hardware looks
like.
> >>>>>>>>>>>>>>>>>>>>>>>>
> The human brain's most fundamental limit is its speed. Anything that
> happens in less than a second perforce must use less than 200 sequential
> operations, however massively parallelized.
> <<<<<<<<<<<<<<<<<<<<<<<<
> Although the simple firing of neurons represents a lot of the
> information that the brain is processing, probably just as much
> information is represented in the dynamic molecular mechanisms of each
> cell. Cells constantly change their gene expression on the order of
> minutes. On the order of seconds, in any one neuron there are probably
> dozens of interacting molecular signaling cascades that are changing the
> neuron's electrophysiological behavior.

That is correct, but the serial limiting speed is established by the fastest
process within the range of timescales.
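The limit is a one-line calculation; the 200 Hz figure is the approximate characteristic spiking speed quoted above:

```python
def max_sequential_ops(task_seconds, fastest_hz):
    # Serial depth is bounded by the fastest repeatable step within
    # the relevant range of timescales: a process running at
    # fastest_hz contributes at most task_seconds * fastest_hz
    # sequential operations, however massively parallelized the rest
    # of the system is.
    return int(task_seconds * fastest_hz)
```

Slower processes, such as gene-expression changes on the order of minutes, contribute essentially zero sequential steps within a one-second task, which is why they do not raise the serial limit.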

> Actually, there is quite a collection of papers in PubMed discussing the
> evidence that the corticothalamic feedback projections play a role in
> image contrast control.

Why do you need ten times as many reciprocal feedback connections as
feedforward connections to do contrast control? I'm not saying that there
are no proposed explanations for the massive feedback connections or that
there are no known functions that require the neuroanatomical backlink, just
that the *massiveness* of the feedback connections has no *standard*
explanation, and the greater computational complexity of feature controller
structure relative to feature detector structure may have something to do
with it.

> It is an accepted fact that working memory, both verbal and spatial, is
> maintained by mutual stimulation between the lateral prefrontal cortex
> and certain posterior association areas:

The existence of working memory is an accepted fact. That working memory is
depictive is an absolutely established fact that is still not entirely
accepted in some GOFAI circles. Whether depictive mental imagery is
governed by the concepts that appear in our internal narrative is another
fight entirely.

> I'm not sure if you mentioned it but did you know that 'verbs' seem to
> be stored in a different brain region than 'nouns'? And that 'noun'
> storage in the brain seems to be organized in a categorized spatial
> manner? Neat stuff eh?

Yes, I know. I'm afraid I didn't have room to mention, in the section on
coevolution of thought and language, how cognitive selection pressures for
different treatment of verbs and nouns based on the different perceptual
structure of verbs and nouns could be responsible for the emergent existence
of Chomskian deep grammar before its evolutionary fixation - as Terrence
Deacon points out, the evolutionary emergence of Chomskian grammar from
strictly linguistic selection pressures is a puzzle because Chomskian
grammar has so many different surface forms; you would expect purely
linguistic selection pressures to fix the computationally simpler surface
structures first. But now I'm saying things that don't make any sense
unless you've read Deacon's "The Symbolic Species" as well as DGI, so I'd
better shut up.

> In your section on "thought" why don't you mention the cognitive
> psychology construct of "working memory"? You seem to describe its two
> part structure perfectly: (1) phonological loop and (2) visuospatial
> sketchpad.

There are a thousand things that DGI does not mention, so don't feel
slighted just because some of your favorite things were left out... As it
happens, though, the phonological loop in working memory is just the
prefrontal refreshment of workspace in the auditory cortex - the internal
narrative may manifest in the same workspace but the phonological loop does
not in itself implement an internal narrative, otherwise tape recorders
would be sentient.

-- -- -- -- --
Eliezer S. Yudkowsky
Research Fellow, Singularity Institute for Artificial Intelligence

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:38 MDT