Re: Edge.org: Jaron Lanier

From: Eugen Leitl (eugen@leitl.org)
Date: Wed Dec 03 2003 - 02:42:22 MST


On Tue, Dec 02, 2003 at 09:28:11PM -0500, Eliezer S. Yudkowsky wrote:

> In an information-theoretic sense, this sounds like a "can't possibly be
> right". A noisy spiking model is a special case of a general spiking

This is possible. I might be mistaken, but I don't have the time
to search for references. I don't think the work I have in mind is the
liquid state machine approach, or
http://www.istia.univ-angers.fr/~chapeau/papers/europl96.pdf

If anybody here has some time to query a search engine for relevant
papers, I'd appreciate a list of your findings.

> model with, say, gaussian noise added as generated by some pseudo-random
> algorithm. Anything you can do with a noisy spiking model you can do with

A noisy spiking model which directly uses hardware noise uses fewer resources
than a deterministic spiking model with algorithm-generated noise.
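For concreteness, here is my own toy sketch (not from any of the papers
above): a leaky integrate-and-fire neuron whose Gaussian noise term is
supplied by a software PRNG. All names and parameters are made up for
illustration. The point is that every noise sample below has to be
computed and stored by the algorithm, whereas an analog implementation
would get an equivalent term for free from the physics of the substrate.

    # Toy leaky integrate-and-fire neuron with additive Gaussian noise.
    # The noise source here is a deterministic PRNG standing in for
    # "algorithm-generated noise"; in noisy hardware the same term would
    # come from thermal/shot noise in the substrate at zero extra cost.
    import numpy as np

    def lif_run(noise, steps=1000, dt=1e-3, tau=0.02, v_th=1.0, i_in=1.2):
        # Integrate dV = (-V + I + noise) * dt/tau and count threshold crossings.
        v, spikes = 0.0, 0
        for t in range(steps):
            v += (-v + i_in + noise[t]) * dt / tau
            if v >= v_th:        # threshold crossing -> spike, reset
                spikes += 1
                v = 0.0
        return spikes

    rng = np.random.default_rng(0)                  # deterministic PRNG
    prng_noise = rng.normal(0.0, 0.5, size=1000)    # "algorithm-generated" noise
    print(lif_run(prng_noise))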

Hardware structures can directly embody algorithms. This can be a tremendous
advantage if it's the right kind of algorithm in a resource-constrained
(hardware, energy) situation. We're already approaching the noise floor even
in conventional photolithographic semiconductor substrates, never mind
molecular components, which are terribly noisy.

Software isn't free. It takes switches, or a CPU, memory, and context-switch
overhead plus a time slice. We already have hardware RNGs in some mainstream
CPUs because entropy is hard to come by in a deterministic system.
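As a userland illustration (my sketch, not a claim about any particular
CPU): os.urandom() draws from the kernel entropy pool, which an on-die
hardware RNG can feed, while random.gauss() runs a purely deterministic
Mersenne Twister in software, costing cycles and state.

    # The two entropy paths, as seen from userland.
    import os
    import random
    import struct

    def hw_uniform():
        # One uniform float in [0,1) built from 8 bytes of OS/hardware entropy.
        (u,) = struct.unpack("<Q", os.urandom(8))
        return u / 2**64

    random.seed(42)                  # PRNG: reproducible, fully deterministic
    print(random.gauss(0.0, 1.0))    # software-generated Gaussian noise
    print(hw_uniform())              # entropy pulled from outside the algorithm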

> a spiking model. There'll be more information in a spiking model than a
> "noisy" spiking model - all you're doing is adding pseudo-random noise and
> then forgetting it, i.e., spreading out your probability density function.
> Call me a Shannon-worshipper, but I just don't see how you can do more
> computation with half a bit than a whole bit.

You might be right. I don't have time to dig it out.
 
> Now, sadly, it may be the case that noisy spiking models do better in
> practice than certain *particular* spiking models hitherto employed, which
> do so poorly that their performance can be improved by injecting entropy;
> but this reflects a pathetic tendency to design AI algorithms that are
> actually worse than random guessing, rather than a violation of the second
> law of thermodynamics.

You seem to be a determinism-worshipper, to the exclusion of
stochastically-driven algorithms (which don't have anything to do with
guessing). Which is interesting, because you yourself are stochastic at the
implementation layer.

I'm not saying it's always the best way to do things, but we currently
don't know anything better for the AI problem domain. See the sketch below
for what I mean by stochastically-driven.
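Simulated annealing is my example here, not one raised in the thread: the
random proposals below are not blind guessing, they are what lets the
search escape local minima that a deterministic greedy descent would be
stuck in.

    # Simulated annealing on a rugged 1-D function: randomness used
    # constructively, not as a fallback for a bad deterministic algorithm.
    import math
    import random

    def anneal(f, x, steps=10000, t0=1.0):
        best_x, best = x, f(x)
        for k in range(steps):
            t = t0 * (1 - k / steps) + 1e-9        # cooling schedule
            x_new = x + random.gauss(0.0, 0.5)     # stochastic proposal
            d = f(x_new) - f(x)
            # Accept downhill moves always, uphill moves with prob e^(-d/t).
            if d < 0 or random.random() < math.exp(-d / t):
                x = x_new
                if f(x) < best:
                    best_x, best = x, f(x)
        return best_x, best

    # Test function with many local minima.
    print(anneal(lambda x: x * x + 3 * math.sin(5 * x), x=4.0))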

I wish I had more time for this thread, but email once again takes hours
of a day which I don't have.

-- Eugen* Leitl leitl
______________________________________________________________
ICBM: 48.07078, 11.61144 http://www.leitl.org
8B29F6BE: 099D 78BA 2FD3 B014 B08A 7779 75B0 2443 8B29 F6BE
http://moleculardevices.org http://nanomachines.net




