From: Matt Mahoney (matmahoney@yahoo.com)
Date: Mon Nov 23 2009 - 07:07:39 MST
> Where does this 10**9 (10**15) come from again? Is that the full storage capacity of the brain?
As John Clark noted, there are 10^15 synapses. In a Hopfield net (a vast simplification of the brain), associative memory recall starts to fail once you store more than about 0.15 bits per synapse, or 0.3 bits per free parameter, because a Hopfield net's weight matrix is symmetric [1].
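A quick numpy sketch of where that figure comes from (my illustration, not from [1]; the sizes are arbitrary). Store ~0.14*N random patterns in an N-neuron Hebbian net and probe recall; the stored information works out to P*N bits over N^2 weights, i.e. ~0.14 bits per synapse:

# Minimal Hopfield net sketch (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
N = 200                     # neurons -> N*N symmetric weights
P = int(0.14 * N)           # classic capacity limit: ~0.14*N patterns

patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian storage: sum of outer products, symmetric, no self-connections.
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0)

def recall(x, steps=20):
    # Synchronous sign updates until a fixed point (or step limit).
    for _ in range(steps):
        y = np.sign(W @ x)
        y[y == 0] = 1
        if np.array_equal(y, x):
            break
        x = y
    return x

# Probe each stored pattern from a 10%-corrupted copy.
ok = sum(np.array_equal(recall(p * np.where(rng.random(N) < 0.1, -1, 1)), p)
         for p in patterns)
print(f"recalled {ok}/{P} patterns")
print(f"~{P * N / (N * N):.2f} bits stored per synapse")

Push P past that limit and recall collapses, which is the sense in which capacity tops out around 0.15 bits per synapse.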
10^9 bits comes from Landauer's study of human long term memory. It is based on recall tests of spoken and written words, pictures, numbers, and music [2].
Even at 0.15 bits per synapse, 10^15 synapses would hold about 10^14 bits, some five orders of magnitude more than Landauer's estimate. I can't explain all of this discrepancy. Some possibilities:
- Redundancy for fault tolerance.
- Landauer's experiments don't account for low level perceptual learning. A lot of neurons and synapses are used in the visual cortex to detect movement, lines, edges, simple shapes, etc. The brain does this in parallel, even though a computer could use a much smaller sliding window of filter coefficients (see the sketch after this list).
- Lots of neurons are needed to get a smooth response from on-off signals. For example, each muscle fiber can only be on or off, so you need thousands of neurons, each controlling one fiber, to get a smooth movement. Likewise for perception: you can distinguish a shift from musical C to C-sharp even though sensory cells in the cochlea have a broad response spanning a couple of octaves.
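To illustrate the sliding-window point (a sketch of my own, not a model of cortex): a computer can reuse one tiny set of filter coefficients at every image location, where the visual cortex wires up a separate detector per location:

# Naive sliding-window edge detector: 9 shared coefficients.
import numpy as np

def convolve2d_valid(image, kernel):
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
    return out

sobel_x = np.array([[-1., 0., 1.],      # vertical-edge filter
                    [-2., 0., 2.],
                    [-1., 0., 1.]])

image = np.zeros((64, 64))
image[:, 32:] = 1.0                     # one vertical edge
edges = convolve2d_valid(image, sobel_x)

print("responses along the edge:", np.count_nonzero(edges))
print("shared coefficients:", sobel_x.size, "| locations covered:", edges.size)

Nine coefficients cover all ~3800 positions serially; the brain instead spends synapses on a copy of the detector at each position so it can respond in one parallel step.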
> what do you think of the conjecture that one could ("sufficiently") characterize a neural network by a set of input/output that would be far smaller than the total storage capacity of the network?
If the information capacity of the brain is 10^9 bits, then you need at least 10^9 bits of compressed training data to characterize it: any set of input/output pairs that pins the network down must, after compression, carry at least as much information as the network itself.
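The arithmetic is simple but worth making explicit (the bits-per-example figure below is an arbitrary placeholder, not a measurement):

# Lower bound on training data needed to pin down the network.
CAPACITY = 10**9            # Landauer's estimate, bits

# Assume each input/output example carries at most this much
# information about the network after compression (illustrative).
BITS_PER_EXAMPLE = 100

print(f"need >= {CAPACITY / BITS_PER_EXAMPLE:.0e} examples")
# No clever choice of examples beats the bound: a smaller set would
# describe the network in fewer bits than it contains.

So a smaller I/O characterization is only possible to the extent that the brain's real information content falls below 10^9 bits.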
1. Hopfield, J. J. (1982), "Neural networks and physical systems with emergent collective computational abilities", Proceedings of the National Academy of Sciences (79) 2554-2558.
2. Landauer, T. K. (1986), "How much do people remember? Some estimates of the quantity of learned information in long-term memory", Cognitive Science (10) 477-493. http://csjarchive.cogsci.rpi.edu/1986v10/i04/p0477p0493/MAIN.PDF
-- Matt Mahoney, matmahoney@yahoo.com
________________________________
From: Luke <wlgriffiths@gmail.com>
To: sl4@sl4.org
Sent: Sun, November 22, 2009 2:23:08 PM
Subject: Re: [sl4] The Jaguar Supercomputer
@Mu In Taiwan:
re: 1) When I said "hogwash", I was referring to the statement "IBM simulated a cat cortex". I wasn't referring to you, or them, or anyone else who might have said it. I was referring to the statement itself. I recognized your uncertainty because you used the word "asserts", which marks the statement as someone else's claim rather than your own. You continue to have my respect.
re: 2) What I described would definitely be a particular brain. What about being a "useful human" might not be captured if you were able to capture the behavior of that particular brain?
@Matt Mahoney: Where does this 10**9 (10**15) come from again? Is that the full storage capacity of the brain? Something like (num_synapses * num_possible_weights_per_synapse)? If it is, what do you think of the conjecture that one could ("sufficiently") characterize a neural network by a set of input/output that would be far smaller than the total storage capacity of the network?
Thanks
- Luke
On Sat, Nov 21, 2009 at 4:28 PM, Mu In Taiwan <mu.in.taiwan@gmail.com> wrote:
Luke,
>
>1) Viewpoints/arguments I describe are not necessarily the same as viewpoints I hold.
>
>2) The problem of training a human brain to be a useful human is different from the problem of training an artificial neural net to work like a human brain (in general, or a particular brain).
>
>One may take 16 years, the other, 1 minute or a thousand years.