From: Charles D Hixson (email@example.com)
Date: Tue Apr 04 2006 - 16:23:35 MDT
On Tuesday 04 April 2006 01:22 pm, Russell Wallace wrote:
> On 4/4/06, Philip Goetz <firstname.lastname@example.org> wrote:
> > Yes - what do you mean "it will turn out somewhat higher once you take
> > in all the nuances, it always does"? I don't understand that
> > statement at all. It seems to me that you can't say "it always does"
> > when there is only one phenomenon, not a class of them, under study.
> What I mean is that I was looking at the 200 Hz figure and deciding whether
> to call it 10^2 or 10^3 for the sake of an order of magnitude estimate, and
> remembering that whenever we investigate the complexity of some aspect of
> the brain or biology in general, 99% of the time it turns out higher rather
> than lower than we thought (and serious attempts to simulate biological
> neurons rather than artificial "neural nets" last I heard were using very
> large amounts of computing power, more than synapse count * 200 Hz, though
> I forget why) so 10^3 struck me as more plausible.
I'm sure you're right for an initial estimate. After things get working and
start being understood, however, expect to be able to pare at least an order
of magnitude off the estimate for the initial requirement.
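The order-of-magnitude arithmetic being debated can be sketched in a few
lines. Note the synapse count below is an assumed round number commonly
cited for the human brain, not a figure from this thread:

```python
# Back-of-envelope estimate of "synapse count * 200 Hz".
# Assumption: ~10^14 synapses (a commonly cited rough figure).
synapse_count = 1e14   # assumed, not measured here
rate_hz = 200          # the 200 Hz figure quoted in the thread

ops_per_second = synapse_count * rate_hz
print(f"{ops_per_second:.0e} synaptic events/second")  # 2e+16
```

Whether you then round 200 Hz to 10^2 or 10^3 shifts the final figure by a
factor of five either way, which is exactly the judgment call under
discussion.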
How much computing power you need is partially determined by how efficient
your models are. If, e.g., you use a neural model of a clock, you end up
with a rather inefficient clock and an excessive number of neural
connections. For other features, however, e.g., possibly pattern matching,
our current estimates may be right on the money, or
even a bit low. Still, some neural connections are pretty clearly time-delay
loops, and those can be implemented far more efficiently than by neural
simulation. This suggests that other features, once understood, can likewise
be simulated without replicating all the circuitry.
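To illustrate the time-delay point: a delay that a brain realizes with a
chain of neurons can be simulated as a plain ring buffer, costing O(1) work
per tick instead of one integration step per neuron per tick. This is only a
sketch of the idea; the 50-tick delay is an arbitrary example value:

```python
from collections import deque

# A time-delay loop modeled as a ring buffer instead of a neuron chain.
# DELAY_TICKS is an arbitrary example; a neural implementation of the
# same delay would need roughly that many simulated neurons.
DELAY_TICKS = 50
line = deque([0.0] * DELAY_TICKS, maxlen=DELAY_TICKS)

def step(delay_line, new_input):
    """Push one sample in; return the sample from DELAY_TICKS ago."""
    delayed = delay_line[0]          # oldest sample in the buffer
    delay_line.append(new_input)     # maxlen drops the oldest automatically
    return delayed

# Feed an impulse through: it re-emerges exactly DELAY_TICKS later.
outputs = [step(line, 1.0 if t == 0 else 0.0)
           for t in range(DELAY_TICKS + 1)]
print(outputs.index(1.0))  # 50
```

The buffer does per tick what the neuron chain does with tens of simulated
cells, which is the sense in which understanding a circuit's function lets
you drop most of its circuitry.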
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:56 MDT