Re: [sl4] Re: goals of AI

From: Luke (
Date: Mon Nov 30 2009 - 14:38:14 MST

But what I really meant was that you can't assume a system is easy to
simulate merely because its interface has a low bandwidth. Sure, you can
model the release of a hormone very simply. But how much hormone, and when
is it released? To answer those questions you have to have a human body, or
a system with the same complexity.

To give a rough (but perfectly appropriate) analogy, think of windspeed at
the airport. They've got one monitoring point, so to model the wind all
you'd need is a direction and a speed, i.e. two numbers, right? Well, no.
You need to know how those two numbers vary with time. And that function's
horribly complex.

Or let's say you pick a random point in some fiber-optic cable somewhere.
There, all you'd need to "simulate" it would be a string of ones and zeros,
each time-indexed, right? No, because you have no way of knowing whether
the next bit will be a one or a zero without knowing the computer that's
connected to it. Or, if that computer's a router, then the computer
connected to that. And so on, until you're simulating the internet. But
even then that's not enough, because ... well, you see, it goes on and on.
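Both analogies, the wind record and the fiber tap, come down to the same
point: a low-bandwidth observable can be driven by a high-complexity
generator, so predicting the signal means simulating the generator. A toy
sketch of this (hypothetical code, using a chaotic logistic map as the
stand-in generator):

```python
# A one-bit "interface" driven by a chaotic generator (logistic map).
# The observable is low-bandwidth (one bit per step), but predicting the
# next bit requires knowing the generator's hidden internal state.

def logistic_bits(x0, n, r=3.99):
    """Yield n bits by thresholding a logistic-map trajectory."""
    x = x0
    bits = []
    for _ in range(n):
        x = r * x * (1 - x)          # hidden continuous state
        bits.append(1 if x >= 0.5 else 0)  # what the observer sees
    return bits

# Two nearly identical hidden states produce identical bits at first,
# then diverge completely: past bits alone don't determine future bits.
a = logistic_bits(0.400000, 50)
b = logistic_bits(0.400001, 50)
```

The first few bits of `a` and `b` agree, but the streams diverge well
within 50 steps, which is the sense in which "two numbers plus time" is
not a simple model.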

So sure, you could have your simulated brain and just throw random hormones
at it. But then your simulated Jim would be Jim-on-random-drugs.
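One way to see why random hormones change the simulation: in the real
system the hormone level is itself a function of the body's state, which
is coupled back to the brain, whereas random input cuts that loop. A toy
illustration (all names and dynamics here are hypothetical, not a model
of any real physiology):

```python
import random

def brain_step(state, hormone):
    # Toy "brain": next state is a decayed copy plus the hormone input.
    return 0.9 * state + hormone

def body_hormone(brain_state):
    # Toy "body": hormone release depends on the brain's own state,
    # closing the feedback loop.
    return 0.1 * max(0.0, 1.0 - brain_state)

# Closed loop: hormone input comes from the coupled body model.
s = 0.0
for _ in range(100):
    s = brain_step(s, body_hormone(s))

# Open loop: "Jim on random drugs" -- hormones drawn at random.
random.seed(0)
t = 0.0
for _ in range(100):
    t = brain_step(t, random.uniform(0.0, 0.2))
```

The closed-loop state settles to a fixed point, while the open-loop state
wanders with the noise: same "brain," different behavior, purely because
of where the hormone signal comes from.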

Finally, hormones aren't the body's only means of communicating with the
brain. To give a simple counterexample, we have neurons that measure the
stretch of the stomach lining and send neural (non-hormonal) signals about
hunger to the brain.

To take it further, we have all sorts of proprioceptive neurons that are in
constant unconscious communication with the brain. That's why you're more
apt to make good decisions after a nice massage: billions of muscle
fibers are each sending different signals than they otherwise would.

My thesis, in a nutshell: the body thinks.
 - Luke

On Mon, Nov 30, 2009 at 4:00 PM, Robin Lee Powell <> wrote:

> On Mon, Nov 30, 2009 at 12:56:07PM -0800, Robin Lee Powell wrote:
> > On Mon, Nov 30, 2009 at 03:49:40PM -0500, Luke wrote:
> > > >Hormones are just signals that have a very small
> > > >informational content and travel extraordinarily slowly, if
> > > >electronics can send information in gargantuan quantities at
> > > >the speed of light down a fiber optic cable I fail to
> > > >understand why hormone smoke signals would stump it.
> > >
> > > If hormones have a "very small informational content" then so
> > > do motor neuron impulses coming from the brain.
> >
> > I hate to defend JKC, but, umm, no. Simulating the presence of a
> > hormone requires "only" two things: which hormone, and how much.
> > Now, simulating the brain's *response* to the hormone means
> > touching each and every neuron that responds to that hormone. But
> > hormones themselves are just chemicals released into the blood
> > stream; the actual informational content is extremely limited.
> >
> > I suppose you could argue that to be full fidelity you need to
> > simulate the actual molecules being released and travelling around
> > in the blood stream; that's still pretty simple (it's just fluid
> > physics :D) compared to simulating a neuron at the same level of
> > fidelity (the molecular level).
> And now I realize that I misread what you said entirely; you
> specified motor neurons. I'd still say hormones are lower info
> density than that, but not enough to have bothered replying if I had
> read it properly the first time. Sorry.
> -Robin
> --
> They say: "The first AIs will be built by the military as weapons."
> And I'm thinking: "Does it even occur to you to try for something
> other than the default outcome?" See
> ***

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:05 MDT