From: Dani Eder (firstname.lastname@example.org)
Date: Mon Jun 25 2001 - 12:38:23 MDT
> Ray Kurzweil estimates 20,000 TF/s for human
> brain power, based on a nominal neural-firing speed
> of 200 instructions per second for each connection
> (100 TF/s x 200). This is 200 times higher
> than Hans Moravec's figure, which doesn't include
> neural-firing speed.
My own estimate is 3,000 TF/s, derived by assuming that
a high-fidelity neural model is required to produce an
AI. Neuron signals are basically the same once an
action potential is generated. The fact that
a synapse fires represents one bit of information if
the time of arrival isn't relevant, and about 3 bits if it
is. On the receiving end, the next neuron interprets
the synapse as inhibitory or excitatory, with a strength
that might be a byte of data to represent. The
contribution to the "should I fire" calculation for
each synapse is then
(synapse strength)(sum of recent received pulses)
which is a flop or two. Times 100 Hz average firing
rate times 10k synapses/neuron times 10^11 neurons =
100,000 Teraflops. With code optimization (i.e. a
neural sim isn't the most efficient way to do an AI)
and the fact
that some of the human brain is used to run the body
and process sensory data that an AI will likely
have pre-processed, I reduce this to 3,000 TF/s.
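The arithmetic above can be sketched as a quick back-of-envelope check. The inputs are the assumptions from the text (not measurements); the ~33x reduction factor is simply what is needed to get from 100,000 TF/s down to the stated 3,000 TF/s.

```python
# Back-of-envelope brain-compute estimate, using the figures above.
flops_per_event = 1    # "a flop or two" per synapse contribution; take 1
rate_hz = 100          # average neuron firing rate
synapses = 1e4         # synapses per neuron
neurons = 1e11         # neurons in the human brain

raw_tf = flops_per_event * rate_hz * synapses * neurons / 1e12
print(raw_tf)          # 100000.0 teraflops

# Assumed ~33x savings from code optimization plus pre-processed
# sensory/body I/O, bringing the estimate to roughly 3,000 TF/s.
optimized_tf = raw_tf / 33
print(round(optimized_tf))
```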
If you want to put it as a range, I would say it is
unlikely that a human-level AI will require cpu power
outside the range of 100 to 100,000 Teraflops.
Remember that a range of 1000-fold is 10-20 years
of Moore's Law operation.
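The 10-20 year figure follows from counting doublings: a 1000-fold gap is about ten doublings, and the commonly quoted Moore's Law doubling time runs roughly 12 to 24 months (an assumption, not a law of nature).

```python
import math

# Years to close a 1000x compute gap under Moore's Law.
doublings = math.log2(1000)      # ~9.97 doublings for a 1000x increase
print(doublings * 1.0)           # ~10 years at a 12-month doubling time
print(doublings * 2.0)           # ~20 years at a 24-month doubling time
```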
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:36 MDT