RE: AI and survival instinct.

From: Ben Goertzel
Date: Wed Apr 03 2002 - 17:37:26 MST

> Anyway - I don't believe you believe that. Naturally all humans are
> basically the same and we (you and I) experience and 'feel' the same.
> That I consider an axiom :/

I don't think this is right.

I think males and females each have lots of experiences that those of
opposite sex can never have, for example.

And, to take another example, there are joys involved in improvising music
that many people will never know...

Nietzsche said, though, that if you look at a set of things through a
sufficiently coarse lens, they will look the same ;)

> > What it comes down to is, at times the human urge for sex seems rather
> > uncorrelated from the human urge for pleasure.... People will seek sex
> > sometimes, even when greater amounts of pleasure would likely
> be achieved in
> > other ways.
> But not as a result of being hardwired to seek sex. The actions needed
> to start having sex (including social contacts) are WAY too complex to be
> something hardwired.

I guess perhaps we are not using the word "hardwired" in the same way.

Of course, the details of sex-seeking behavior are not hard-wired. But the
*goal* of sex-seeking is (although this wiring can be modified too, it's not
*completely* rigid & unmodifiable).

> What I tried to point out is that when we feel something like
> "pleasure" or
> "happiness", what is actually happening is that the paths our neurons
> just took are strengthened, and it will be more likely that we will do the
> same again next time. That "pleasure" in itself is a very
> complex and multi-
> dimensional feedback wasn't relevant imho;

Well, it's relevant if "sexual pleasure" is hardwired as a particular
component of "pleasure", isn't it?
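To make the distinction concrete, here is a minimal sketch of the kind of reward-driven strengthening being discussed. All names and numbers (the actions, the reward components, the weight values) are illustrative assumptions, not a model anyone in the thread proposed; the point it shows is just that if one component of "pleasure" carries a built-in weight, the path leading to it can be reinforced more strongly even when its raw felt intensity is lower.

```python
# Hypothetical sketch: the "path just taken" is strengthened in
# proportion to reward, but one reward component ("sex") carries a
# hardwired weight that the learning rule itself never updates.

# Fixed, built-in weights on reward components -- the "wiring" under
# discussion. Values are illustrative assumptions only.
REWARD_WEIGHTS = {"food": 1.0, "music": 1.0, "sex": 2.0}

def total_reward(components):
    """Combine component rewards using the fixed, built-in weights."""
    return sum(REWARD_WEIGHTS[k] * v for k, v in components.items())

def reinforce(action_strengths, action, components, rate=0.1):
    """Strengthen the action (the 'path') just taken, scaled by reward."""
    action_strengths[action] += rate * total_reward(components)
    return action_strengths

strengths = {"seek_sex": 1.0, "play_music": 1.0}

# Raw felt pleasure is *lower* for sex here (0.5 vs 0.8), yet the
# hardwired weight makes sex-seeking end up more reinforced.
strengths = reinforce(strengths, "seek_sex", {"sex": 0.5})
strengths = reinforce(strengths, "play_music", {"music": 0.8})
```

After the two updates, seek_sex sits at 1.1 and play_music at 1.08: the details of the behavior are learned, but the goal's priority is baked into the weights.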

> Just a question... isn't it more logical to first write an AI
> that is as smart as a mouse or something stupid like that?

A lot of people are taking that approach. See for instance Rodney Brooks'
well-known work on subsumption robotics. I don't think it's a stupid
approach. I cosupervised a PhD student once, who built a neural net
simulation of a cockroach brain. But nor do I think it's an optimal approach.

The reason I don't think the approach is optimal is: Creating perceptual and
motor systems comparable to those of a mouse or even a cockroach is very
hard for engineers right now. Thus I think it's easier to create an AI
system whose percepts and actions occur in the Internet, a virtual domain.
This may produce preliminary versions that are less intelligent than humans,
but they will be very different from mice or cockroaches (and comparing their
intelligence to that of mice or cockroaches will be hard -- after all, it's
even hard to say whether a cockroach or Deep Blue is more intelligent).

ben g

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:38 MDT