From: Matt Mahoney (matmahoney@yahoo.com)
Date: Mon Nov 23 2009 - 09:20:47 MST
When I say that the brain has 10^9 bits of memory, I mean its Kolmogorov complexity. There are 2^(10^9) possible brains that can be distinguished by their behavior (it happens to take 10^15 synapses to implement that, however). So the Kolmogorov complexity of the desired outputs for any training set also has to be at least 10^9 bits, or else some brains could not be told apart.
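To make the counting argument concrete, here is a minimal Python sketch (the 3-input boolean functions stand in for brains; all sizes here are just for illustration):

import math
from itertools import product

inputs = list(product([0, 1], repeat=3))   # 8 possible stimuli
num_functions = 2 ** len(inputs)           # 256 boolean "brains" distinguishable only by behavior
bits_needed = math.log2(num_functions)     # minimum bits of desired outputs to single one out

print(num_functions, bits_needed)          # 256 8.0
# Same argument at scale: 2^(10^9) distinguishable brains need at least 10^9 bits of outputs.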
If your goal is to produce *a* brain (say, to pass the Turing test), and not a copy of some particular brain, then I suppose you could get by with less.
-- Matt Mahoney, matmahoney@yahoo.com
________________________________
From: Luke <wlgriffiths@gmail.com>
To: sl4@sl4.org
Sent: Mon, November 23, 2009 9:47:38 AM
Subject: Re: [sl4] The Jaguar Supercomputer
Thanks for the sources, Matt. I see now what you mean w/ regard to the information capacity being smaller than I might have anticipated.
However, you said: "If the information capacity of the brain is 10^9 bits, then you need at least 10^9 bits of compressed training data to characterize it."
This seems like a very reasonable assumption, but only because it's a simple statement. If we were talking about hard drives I'd be convinced.
Consider this: given a specific behavior set (i.e. desired stimulus-response pairings), are there not many different sets of weights which could all accomplish that behavior? If this is the case, then it would stand to reason that something less than the entire informational capacity of the system would be necessary to "sufficiently" characterize it, i.e. to characterize the stimulus-response pairings.
That last statement assumes that we're talking about the "informational capacity" of all of the synapses. But it sounds like the measurements behind these 10**9 and 10**15 figures were taken from the edge anyway, so they may actually be the numbers we're looking for. In other words, what we need to specify is the interfaces, not the nuts and bolts that make those interfaces work.
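Here is a toy illustration of the many-weights point (a Python/numpy sketch with arbitrary sizes): permuting the hidden units of a small network gives a genuinely different weight set with exactly the same stimulus-response behavior.

import numpy as np

rng = np.random.default_rng(0)
W1 = rng.standard_normal((4, 3))   # hidden-by-input weights
w2 = rng.standard_normal(4)        # output weights

perm = [2, 0, 3, 1]                # relabel the hidden units
W1p, w2p = W1[perm], w2[perm]      # a different set of weights

def net(W, w, x):
    return np.tanh(W @ x) @ w

x = rng.standard_normal(3)         # an arbitrary stimulus
print(np.allclose(net(W1, w2, x), net(W1p, w2p, x)))   # True: identical behavior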
- Luke
On Mon, Nov 23, 2009 at 9:07 AM, Matt Mahoney <matmahoney@yahoo.com> wrote:
> Where does this 10**9 (10**15) come from again? Is that the full storage capacity of the brain?
>
>
>As John Clark noted, there are 10^15 synapses. In a Hopfield net (a vast simplification of the brain), associative memory recall degrades once storage exceeds about 0.15 bits per synapse, or 0.3 bits per free parameter, since a Hopfield net's weights are symmetric [1].
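>(A minimal Hebbian Hopfield sketch in Python/numpy, with illustrative sizes rather than anything taken from [1]: store a few random patterns with the outer-product rule, then recall one from a corrupted cue.)
>
>import numpy as np
>
>rng = np.random.default_rng(1)
>N = 100                                       # neurons (and N^2 synapses)
>patterns = rng.choice([-1, 1], size=(10, N))  # stay below the ~0.14*N capacity
>
>W = sum(np.outer(p, p) for p in patterns) / N
>np.fill_diagonal(W, 0)                        # symmetric weights, no self-connections
>
>def recall(cue, steps=20):
>    s = cue.copy()
>    for _ in range(steps):
>        s = np.sign(W @ s)
>        s[s == 0] = 1
>    return s
>
>cue = patterns[0].copy()
>cue[rng.choice(N, 10, replace=False)] *= -1   # corrupt 10% of the bits
>print(np.mean(recall(cue) == patterns[0]))    # ~1.0 while below capacity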
>
>
>10^9 bits comes from Landauer's study of human long term memory. It is based on recall tests of spoken and written words, pictures, numbers, and music [2].
>
>
>I can't explain all of this discrepancy. Some possibilities:
>
>
>- Redundancy for fault tolerance.
>- Landauer's experiments don't account for low level perceptual learning. A lot of neurons and synapses are used in the visual cortex to detect movement, lines, edges, simple shapes, etc. The brain does this in parallel even though a computer could use a much smaller sliding window of filter coefficients.
>- Lots of neurons are needed to get a smooth response from on-off signals. For example, each muscle fiber can only be on or off, so you need thousands of neurons, each controlling one fiber, to get a smooth movement. Likewise for perception: you can distinguish a shift from musical C to C-sharp even though sensory cells in the cochlea have a broad response spanning a couple of octaves (see the sketch below).
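>(A toy sketch of that last point in Python/numpy, with arbitrary numbers: each binary unit fires when the drive exceeds its own threshold, and the population average rises smoothly.)
>
>import numpy as np
>
>rng = np.random.default_rng(2)
>thresholds = rng.uniform(0, 1, size=1000)     # one threshold per on-off unit
>
>def population_response(drive):
>    return np.mean(drive > thresholds)        # fraction of units switched on
>
>for drive in (0.1, 0.25, 0.5, 0.75, 0.9):
>    print(drive, population_response(drive))  # climbs smoothly with the drive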
>
>
>> what do you think of the conjecture that one could ("sufficiently") characterize a neural network by a set of input/output pairs that would be far smaller than the total storage capacity of the network?
>
>
>If the information capacity of the brain is 10^9 bits, then you need at least 10^9 bits of compressed training data to characterize it.
>
>1. Hopfield, J. J. (1982), "Neural networks and physical systems with emergent collective computational abilities", Proceedings of the National Academy of Sciences 79, 2554-2558.
>
>
>2. Landauer, T. K. (1986), "How much do people remember? Some estimates of the quantity of learned information in long-term memory", Cognitive Science 10, 477-493. http://csjarchive.cogsci.rpi.edu/1986v10/i04/p0477p0493/MAIN.PDF
>
>
>
>-- Matt Mahoney, matmahoney@yahoo.com
>
________________________________
>From: Luke <wlgriffiths@gmail.com>
>To: sl4@sl4.org
>Sent: Sun, November 22, 2009 2:23:08 PM
>Subject: Re: [sl4] The Jaguar Supercomputer
>
>
>@Mu In Taiwan:
>
>
>re: 1) When I said "hogwash", I was referring to the statement "IBM simulated a cat cortex". I wasn't referring to you, or them, or anyone else who might have said it. I was referring to the statement itself. I recognized your uncertainty because you used the word "asserts", which marks the statement as coming from a third party. You continue to have my respect.
>
>
>re: 2) What I described would definitely be a particular brain. What about being a "useful human" might not be captured if you were able to capture the behavior of that particular brain?
>
>
>
>
>@Matt Mahoney: Where does this 10**9 (10**15) come from again? Is that the full storage capacity of the brain? Something like num_synapses * log2(num_possible_weights_per_synapse)? If it is, what do you think of the conjecture that one could ("sufficiently") characterize a neural network by a set of input/output pairs that would be far smaller than the total storage capacity of the network?
>
>
>Thanks
> - Luke
>
>
>On Sat, Nov 21, 2009 at 4:28 PM, Mu In Taiwan <mu.in.taiwan@gmail.com> wrote:
>
>Luke,
>>
>>
>>1) Viewpoints/arguments I describe are not necessarily the same as viewpoints I hold.
>>
>>
>>
>>2) The problem of training a human brain to be a useful human is different from the problem of training an artificial neural net to work like a human brain (in general, or like a particular brain).
>>
>>
>>One may take 16 years, the other, 1 minute or a thousand years.
>>