From: Eliezer S. Yudkowsky (email@example.com)
Date: Tue Dec 02 2003 - 19:28:11 MST
Eugen Leitl wrote:
> On Tue, Dec 02, 2003 at 01:46:42PM -0500, Eliezer S. Yudkowsky wrote:
>>Taken literally, this seems to me to contradict the second law of
>>thermodynamics.
> Spiking models do more computation than continuum models; noisy spiking
> systems do more still. There's something in spikes and noise that go
In an information-theoretic sense, this sounds like a "can't possibly be
right". A noisy spiking model is a special case of a general spiking
model with, say, Gaussian noise added as generated by some pseudo-random
algorithm. Anything you can do with a noisy spiking model, you can do with
a spiking model. There'll be more information in a spiking model than in a
"noisy" spiking model - all you're doing is adding pseudo-random noise and
then forgetting it, i.e., spreading out your probability density function.
Call me a Shannon-worshipper, but I just don't see how you can do more
computation with half a bit than a whole bit.
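To make the "special case" point concrete, here is a minimal sketch (my illustration, not from the original post): a "noisy" leaky integrate-and-fire neuron is reproduced exactly by a deterministic spiking model that simply generates the same pseudo-random noise internally from a fixed seed. The model, parameters, and seed are all arbitrary choices for illustration.

```python
import numpy as np

def lif_spikes(inputs, noise):
    """Toy leaky integrate-and-fire neuron; returns a binary spike train."""
    v, spikes = 0.0, []
    for drive, n in zip(inputs, noise):
        v = 0.9 * v + drive + n   # leak, input drive, and (pseudo-)noise term
        if v >= 1.0:
            spikes.append(1)
            v = 0.0               # reset membrane potential after a spike
        else:
            spikes.append(0)
    return spikes

inputs = [0.3] * 50

# "Noisy" model: Gaussian noise drawn from a seeded pseudo-random generator.
noisy = lif_spikes(inputs, np.random.default_rng(42).normal(0.0, 0.2, 50))

# Deterministic model: the identical pseudo-random stream is treated as just
# another internal component of the model, not as "extra" computation.
deterministic = lif_spikes(inputs, np.random.default_rng(42).normal(0.0, 0.2, 50))

assert noisy == deterministic  # identical spike trains, bit for bit
```

The spike trains agree exactly, which is the Shannon point above: once the noise is pseudo-random, the "noisy" model computes nothing the deterministic model cannot.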
Now, sadly, it may be the case that noisy spiking models do better in
practice than certain *particular* spiking models hitherto employed, which
do so poorly that their performance can be improved by injecting entropy;
but this reflects a pathetic tendency to design AI algorithms that are
actually worse than random guessing, rather than a violation of the
second law of thermodynamics.
-- Eliezer S. Yudkowsky http://intelligence.org/ Research Fellow, Singularity Institute for Artificial Intelligence
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:43 MDT