From: Keith Henson (email@example.com)
Date: Fri Jun 02 2006 - 07:18:01 MDT
[Reposted from another list with permission. Nothing new, but an
indication that the local topics are being discussed elsewhere -- Keith Henson]
>Date: 31 May 2006 10:35:31 -0800
>From: dan miller <firstname.lastname@example.org>
>Subject: Re: Moore's Law and AI (Real or Artificial Intelligence): was
>( the following is offered as a stimulus for discussion and debate; I'm not
>claiming it's scientifically rigorous )
>I think it's possible to put forward a somewhat reasonable estimate of
>computing power necessary to roughly equal human-level intelligence. If we
>look at a typical insect, which has on the order of 20,000 - 200,000 neurons
>(I know, not all neurons are created equal, but this is
>back-of-the-envelope) -- we can ask ourselves, how does this setup compare,
>in terms of "intelligence", to a silicon-based machine that has similar
>computational capability?
>[caution: arm-waving begins]
>I conjecture that a typical DARPA Grand Challenge vehicle represents a
>similar level of complexity in terms of its ability to sense, react, and
>(to a degree) plan its behavior within its environment. Clearly there are
>many differences, but I'm pretty sure it's within an order of magnitude one
>way or the other.
>The Grand Challenge vehicles were designed as one-off prototypes, so the
>technology used was not highly optimized for low cost, power consumption,
>etc. CMU and Stanford both used about half a dozen powerful PCs each; but
>it's obvious to me that optimizations, including special-purpose chips or
>FPGAs, could reduce that requirement by at least an order of magnitude.
>So, conservatively, a present-day, 2+ GHz Pentium-class computer is capable
>of emulating the functional capabilities of something like an ant.
>So 2x10^9 instructions/s ~== 20K neurons; one neuron ~= 10^5 instructions/s.
>Humans have on the order of 10^11 neurons; 10^11 * 10^5 = 10^16
>instructions/s.
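> The arithmetic above can be sketched directly; the instruction rate and
> neuron counts below are the post's assumed round numbers, not measurements:

```python
# Back-of-the-envelope estimate, using the post's assumptions:
#   - a 2+ GHz Pentium-class PC delivers roughly 2e9 instructions/s
#   - that is taken as roughly matching an insect with ~2e4 neurons
#   - humans have on the order of 1e11 neurons

pc_instructions_per_sec = 2e9   # ~2 GHz machine, ~1 instruction per cycle
insect_neurons = 2e4            # low end of the 20,000-200,000 range
human_neurons = 1e11

# instructions per second attributed to a single neuron
per_neuron = pc_instructions_per_sec / insect_neurons

# scale up to a human-sized neuron count
human_estimate = per_neuron * human_neurons

print(f"per neuron:  {per_neuron:.0e} instructions/s")    # 1e+05
print(f"human brain: {human_estimate:.0e} instructions/s")  # 1e+16
```

> Note that 20K is the low end of the insect range; taking the high end
> (200K neurons) would cut the per-neuron figure, and hence the final
> estimate, by a factor of ten.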
>After sketching this out, I looked up Hans Moravec's estimate, which is
>10^14 instructions/s. I guess he's planning to write his neuron simulators
>in assembly language.
>My engineer's gut tells me this estimate is an upper limit, and that
>appropriate special-purpose hardware would make the right sort of
>computational horsepower attainable at reasonable cost within 10 to 15
>years.
>It's interesting to note that if the typical guesses are correct, Google is
>just about at this level of computational ability.
>None of this is meant to suggest that the architecture isn't more important
>than the gate count; but it's nice to have some likely upper bounds on what
>kind of performance you might need to get to serious AI.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:56 MDT