From: Richard Loosemore (firstname.lastname@example.org)
Date: Wed Jan 25 2006 - 08:35:18 MST
But (and this is also directed at CyTG's comments) .....
1) If you want to talk only about simulating a brain, there are so many
imponderables still to be resolved (current neural net learning
algorithms are not adequate models of what goes on in a real synapse,
for example, and we believe there could be substantial computation
happening in the relative timing of pulse arrivals out there in the
dendritic tree) that the assumptions you've made about the computing
required at each neuron could be a few orders of magnitude too small.
The uncertainty is so huge that these calculations hardly even count as
ballpark estimates, sadly.
2) If you believe, as I do, that merely emulating the structure and
function of the brain is the wrong way to go (too hard to figure out
what the exact computation IS, down at that fine-grained level), then
you would say that all such calculations are a waste of time. I repeat
what I said earlier: if we had all the computational resources to
emulate a brain *today* it might still take another 35 years or 95 years
to get enough information to be able to do it.
Instead, take a long hard look at cognitive science and AGI research.
Here, we try to understand the functional architecture of minds. We
have a good chance of understanding *what* the functional units actually
should do to make a mind work. Those functional units might be
implemented in a brain with ten thousand neurons each, but when I
understand what the unit is doing, I might find that I can get away with
a code implementation that takes just a small amount of computing,
perhaps with no floating point calculations whatsoever! Faced with that
possibility, we might find that we are only a couple of years away from
being able to figure out the mechanism, and that we already have enough
power in a roomful of server blades.
Now *that* is the exciting part, as far as I am concerned.
Rok Sibanc wrote:
> My own calculations.
> Brain: 10^11 neurons * 10^3 conn per neuron * 10^2 Hz = 10^16 connection
> updates per sec
> Desktop computer (3 gflops, 3 flops per connection update assumed): 10^9
> connection updates per sec
> 10^7 factor.
> 10^7 = 2^n (assuming Moore's law)
> n ~ 23 doublings; 23 * 1.5 years ~ 35 years for a normal computer to reach
> the computing speed of the brain.
> Looking at the top500.org <http://top500.org> list, the first computer (Blue
> Gene, ~300,000 gflops = 3*10^14 flops) is only about 100x slower than the brain.
> A 100x increase in computer power is achievable in about 10 years
> (Moore's law again).
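Rok's back-of-envelope arithmetic above can be checked with a short script. All
the constants here are his stated assumptions (neuron count, connections per
neuron, firing rate, flops per connection update, and a 1.5-year Moore's-law
doubling period), not measurements:

```python
import math

# Rok's assumptions, as quoted above.
brain_updates = 1e11 * 1e3 * 1e2       # neurons * conns/neuron * Hz = 1e16 updates/s
desktop_flops = 3e9                    # a "3 gflops" desktop
desktop_updates = desktop_flops / 3    # 3 flops per connection update -> 1e9 updates/s

gap = brain_updates / desktop_updates  # ~1e7
doublings = math.log2(gap)             # ~23.3 doublings to close the gap
years = doublings * 1.5                # Moore's-law doubling every 1.5 years

print(f"gap = {gap:.0e}, doublings = {doublings:.1f}, years = {years:.0f}")
# -> gap = 1e+07, doublings = 23.3, years = 35

# Blue Gene at ~3e14 flops, same 3 flops/update assumption:
bluegene_updates = 3e14 / 3            # ~1e14 updates/s
print(f"Blue Gene is {brain_updates / bluegene_updates:.0f}x slower")   # -> 100x
print(f"years to close 100x: {math.log2(100) * 1.5:.0f}")               # -> 10
```

The numbers come out as stated: ~35 years for a desktop and ~10 years from
Blue Gene, given those assumptions.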
> On 1/25/06, *Russell Wallace* <email@example.com
> <mailto:firstname.lastname@example.org>> wrote:
> On 1/24/06, *Rok Sibanc* <email@example.com
> <mailto:firstname.lastname@example.org>> wrote:
> 1 million neurons/second... because every neuron has to do a
> multiplication of every incoming input and a summation of all the
> weighted inputs, and finally a sigmoid function, which is expressed
> with a Taylor polynomial.
> Well, CyTG compared the 1 million with (a very conservative estimate
> of) 1E14 for the brain, which suggests he was referring to updates
> of individual synapses. If he meant 1 million neurons of 1000
> synapses each, then that would make sense, but the comparison is
> then not valid, and the computer comes out looking better than had
> been suggested.
> - Russell
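The per-connection cost Rok describes (one multiply and one add per incoming
connection, plus a sigmoid amortized over the whole neuron) can be sketched as
follows. The weights and inputs are made up for illustration, and `math.exp`
stands in for the Taylor-polynomial approximation he mentions:

```python
import math

def neuron_output(inputs, weights):
    """One neuron update as described in the quoted message:
    multiply each incoming input by its weight, sum the weighted
    inputs, then squash with a sigmoid. That is ~2 flops per
    connection plus a small fixed cost for the sigmoid, roughly
    consistent with the assumed ~3 flops per connection update."""
    total = sum(x * w for x, w in zip(inputs, weights))  # multiply + sum
    return 1.0 / (1.0 + math.exp(-total))                # sigmoid

# Hypothetical neuron with 1000 incoming connections, matching the
# 10^3 connections-per-neuron figure in the calculation above.
inputs = [0.001] * 1000
weights = [1.0] * 1000
print(neuron_output(inputs, weights))  # sigmoid(1.0) ~ 0.731
```

This also makes Russell's point concrete: one call here is one *neuron* update
covering 1000 *synapse* updates, so the two units differ by a factor of 10^3
and must not be compared directly.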
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:55 MDT