From: Eliezer S. Yudkowsky (firstname.lastname@example.org)
Date: Wed Apr 11 2001 - 14:30:42 MDT
Somehow, I can't help but imagine this debate as conducted among AIs:
Aiden: When do you think that bioneural networks will finally reach
parity with us?
Ailerin: Despite the so-called "Darwin's Law" about the increase in RAM
and bandwidth, I don't think it'll ever happen. Everyone ignores serial
speed and reliability.
Aiden: Maybe you could make up for that with enough RAM and bandwidth. I
think that's what most of us are hoping, anyway.
Ailerin: Yeah, but look at the numbers. Even the Minims [The smallest
True Citizens --Eliezer] use around ten thousand one-gigahertz processors
operating on a hundred terabytes of RAM. I know the bioneural folks are
big fans of caching, but even with a hundred billion neurons, can you
cache enough to make up for a million-to-one difference in processing
speed? And neurons are a lot messier than transistors, so I'm not sure
you could really get a hundred terabytes of information into a hundred
billion neurons. I mean, that's assuming one kilobyte per neuron.
Aiden: Actually, it's more like a hundred trillion synapses for memory
storage. The transcitizenist community usually uses that figure instead
of the one dealing with individual neurons.
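The figures the two are trading can be sanity-checked with a quick back-of-the-envelope calculation (all numbers taken from the dialogue itself):

```python
# Back-of-the-envelope check of the figures in the dialogue.
neurons = 1e11      # ~a hundred billion neurons
synapses = 1e14     # ~a hundred trillion synapses
ram_bytes = 1e14    # a hundred terabytes of RAM

# Ailerin's framing: the Minims' RAM spread evenly over neurons.
bytes_per_neuron = ram_bytes / neurons
print(bytes_per_neuron)   # 1000.0 -> one kilobyte per neuron

# Aiden's framing: one byte per synapse gives the same total.
bytes_per_synapse = ram_bytes / synapses
print(bytes_per_synapse)  # 1.0 -> one byte per synapse

# The serial-speed gap: 1 GHz processors vs. ~1 kHz neuron firing rates.
print(1e9 / 1e3)          # 1000000.0 -> the million-to-one difference
```

So the "one kilobyte per neuron" and "one byte per synapse" framings describe the same hundred terabytes; the argument is really about which unit does the storing.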
Ailerin: Well, I think they're dreaming. Just because it might take a
byte or even a kilobyte to simulate a synapse doesn't mean you can
reliably store an entire byte of information there.
Aiden: I think the idea is that you spread a single byte of information
over a lot of different synapses, and then you can do this with a lot of
different bytes, so the overall network has stochastic error-correction
without losing information density.
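Aiden's scheme can be illustrated with a toy repetition code (a deliberate oversimplification; real distributed representations are far subtler): each bit is smeared across many unreliable "synapses" and recovered by majority vote, so no single noisy unit has to hold a whole byte reliably.

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

def store(bits, copies=101):
    """Smear each bit across `copies` unreliable storage units."""
    return [[b for _ in range(copies)] for b in bits]

def corrupt(units, flip_prob=0.2):
    """Each unit independently flips with probability flip_prob."""
    return [[b ^ (random.random() < flip_prob) for b in group]
            for group in units]

def recall(units):
    """Recover each bit by majority vote over its copies."""
    return [int(sum(group) > len(group) / 2) for group in units]

message = [1, 0, 1, 1, 0, 0, 1, 0]
noisy = corrupt(store(message))
print(recall(noisy) == message)  # True (overwhelmingly likely)
```

Even with a fifth of the units corrupted, the majority vote recovers the message almost surely; the price is a hundredfold redundancy, which is the information-density trade-off Ailerin is skeptical about.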
Ailerin: And all this is supposedly being programmed by a greedy local
optimization metaprogrammer? And this in a mind that has no ability to
improve vis own source code?
Aiden: Well, the "evolution" process is actually a bit more complicated
than that, but yes. Also, we like to use "he" to describe biocitizens.
Ailerin: Call me a silicon chauvinist, but I don't buy it.
-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence
This archive was generated by hypermail 2.1.5 : Thu May 23 2013 - 04:00:20 MDT