From: Phil Goetz (philgoetz@yahoo.com)
Date: Fri Sep 09 2005 - 11:31:50 MDT
--- Michael Wilson <mwdestinystar@yahoo.co.uk> wrote:
> Correct. When engineers design systems, they make extensive use of
> state relationships that act as compression functions, mapping a
> large range of input states onto a smaller range of output states
> (strictly, they enforce specific sharp or near sharp set constraints
> on the state of the cause/input and the state of the effect/output).
...
> fuzzy sets that aren't quite disjoint. But because we know that
> on_states is a very small subset of above_threshold, and that
> off_states is a small sharp subset of below_threshold, and that
> these two /are/ disjoint, we can string indefinite numbers of logic
> gates together and reliably predict the final state of any
> cycle-free network. Synapses are similar but less reliable.
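(A quick illustration of what Michael is describing, as a Python sketch with
made-up voltage numbers: as long as every gate squeezes anything above
threshold into a narrow guaranteed-"on" band and anything below into a
narrow guaranteed-"off" band, the two output sets stay disjoint, so you can
chain as many gates as you like and still predict the final state.)

    import random

    # Made-up voltage ranges, purely for illustration.
    ON_BAND   = (4.5, 5.0)   # narrow set of guaranteed "on" output states
    OFF_BAND  = (0.0, 0.5)   # narrow set of guaranteed "off" output states
    THRESHOLD = 2.5

    def inverter(v_in):
        # Maps the huge range "anything above threshold" into the small
        # OFF band, and "anything below threshold" into the small ON band.
        if v_in > THRESHOLD:
            return random.uniform(*OFF_BAND)
        return random.uniform(*ON_BAND)

    # Chain a thousand of them; because the output sets are disjoint,
    # the final state is fully determined by the parity of the chain,
    # no matter how sloppy the initial voltage was.
    v = 3.1                      # some vaguely "high" input
    for _ in range(1000):
        v = inverter(v)
    print(v)                     # always lands in the ON band (1000 is even)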
A point for the people who believe that neurons are sloppy evolved
things that we can easily improve on:
Making this mapping onto output states consistent, clear, and
error-free requires a lot of engineering overkill. A neuron
operates at a few millivolts, and its output, as a result, is
"wrong" about 10% of the time. A silicon chip operates with
about a 10-volt difference between its two states, in order to
be almost error-free. We humans have gone from a reliability
of .9 to 1-10^-14 or whatever - about a 10% improvement - in
exchange for an increase in power requirements of 4 orders of
magnitude.
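(The same trade, spelled out with those round numbers taken at face value:)

    neuron_reliability  = 0.9          # output "wrong" about 10% of the time
    silicon_reliability = 1 - 1e-14    # essentially never wrong

    # Improvement in raw per-operation success probability:
    print(silicon_reliability - neuron_reliability)   # ~0.1, i.e. "about 10%"

    # Price paid for it, per the estimate above:
    power_increase = 1e4               # ~4 orders of magnitude more power
    print(power_increase)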
Because of the error rate of neurons, you need 1 or 2 orders
of magnitude more of them for reliable computation. But nature
still comes out 2 orders of magnitude ahead in terms of power
dissipation per gate. And that's what really counts.
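(Rough check on that, assuming a 10% per-unit error rate and independent
errors: replicate each unreliable unit and take a majority vote, a crude
version of von Neumann's old trick. Around a hundred copies already pushes
the error rate far below 10^-14, so two orders of magnitude of redundancy
against a four-order-of-magnitude power handicap still leaves the neuron
about 100x ahead per reliable gate.)

    from math import comb

    def majority_error(n, p):
        # Probability that a strict majority of n independent units,
        # each wrong with probability p, are wrong at once.
        return sum(comb(n, k) * p**k * (1 - p)**(n - k)
                   for k in range(n // 2 + 1, n + 1))

    p = 0.10                     # assumed per-neuron error rate
    for n in (11, 101):
        print(n, majority_error(n, p))
    # n=11  -> ~3e-4
    # n=101 -> ~1e-24, i.e. well past silicon-level reliability

    # Bookkeeping with the round numbers above: silicon pays ~10^4 in
    # power per gate, redundancy costs the neuron ~10^2 in gate count,
    # so nature still ends up ~10^2 ahead in power per reliable gate.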
- Phil