Re: AI and survival instinct.

From: Carlo Wood (
Date: Wed Apr 03 2002 - 17:03:45 MST

On Wed, Apr 03, 2002 at 08:05:06AM -0700, Ben Goertzel wrote:
> > No, I think that humans are "hardwired" to have sex, but only in ONE
> > way: having sex gives us a tremendous amount of pleasure.
> It's interesting.... I believe that it feels that way to you, but not to
> me, sorry.

*confused*. What are you sorry for?
Anyway - I don't believe you believe that. Naturally all humans are
basically the same, and we (you and I) experience and 'feel' the same
things. That I consider an axiom :/

Instead you should have concluded that I left things out and (over)
simplified. I'll try to point out the essence of my remark.

> I believe that "pleasure" is a natural language term that covers a variety
> of different things.

That is an understatement.

> Sure, we can average all these things together to come up with a single
> "weighted-average pleasure indicator"...
> But in reality the weights in the average shift over time according to
> chemical fluctuations in the brain, induced by environmental & social &
> emotional factors, etc.
> Also we seek pleasure over many different time scales, and this causes a lot
> of confusion
> What it comes down to is, at times the human urge for sex seems rather
> uncorrelated from the human urge for pleasure.... People will seek sex
> sometimes, even when greater amounts of pleasure would likely be achieved in
> other ways.

But not as a result of being hardwired to seek sex. The actions needed
to start having sex (including the social contacts) are WAY too complex
to be something hardwired.

Our brain is a neural network; it learns by means of feedback.
The input consists of possible actions, memories and associations, the
likelihood that the assumptions being made are correct, and the evaluation
of the whole situation as registered by our perceptions. The output of the
neural network corresponds to our decision(s) to take, or not take, the
action presented on the input. Learning means that the thresholds that
cause certain neurons to fire (or not) are adjusted in such a way that
in a comparable situation (with the same input) we will either do the
same thing again, or not.

What I tried to point out is that when we feel something like "pleasure" or
"happiness", what is actually happening is that the paths our neurons just
took are strengthened, so it becomes more likely that we will do the same
thing again next time. That "pleasure" in itself is a very complex and
multi-dimensional feedback signal wasn't relevant imho; yet you seem to
have understood me as saying that "pleasure" is a single bit, or just a
simple one-dimensional 'value' - no way!
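The strengthening-by-feedback idea above could be sketched as a toy
reinforcement rule (a hypothetical minimal model for illustration, not a
claim about how real neurons work): each (situation, action) path has a
strength, positive "pleasure" feedback increases the strength of the path
just taken, and the next choice in the same situation favours the
strongest path.

```python
import random

class TinyFeedbackLearner:
    """Toy model: one strength value per (situation, action) path.
    Positive feedback ('pleasure') strengthens the path just taken,
    negative feedback weakens it. All names here are illustrative."""

    def __init__(self, actions):
        self.actions = actions
        self.strength = {}  # (situation, action) -> path strength

    def choose(self, situation):
        # Prefer the action whose path has been strengthened most.
        scored = [(self.strength.get((situation, a), 0.0), a)
                  for a in self.actions]
        best = max(s for s, _ in scored)
        # Break ties randomly, so untried paths can still be taken.
        return random.choice([a for s, a in scored if s == best])

    def feedback(self, situation, action, pleasure):
        # Strengthen (or weaken) the path that was just taken.
        key = (situation, action)
        self.strength[key] = self.strength.get(key, 0.0) + pleasure

learner = TinyFeedbackLearner(["approach", "avoid"])
for _ in range(10):
    act = learner.choose("saw friend")
    # Hardwired signal: being close to someone feels "good".
    learner.feedback("saw friend", act, 1.0 if act == "approach" else -1.0)

# After a few rounds of feedback, "approach" dominates.
print(learner.choose("saw friend"))
```

Note that nothing here is a "goal" in the explicit sense: the preference
for "approach" simply emerges from the feedback, which matches the point
that happiness is the feedback signal, not the goal itself.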

Imho, sex is the result of instinct: imagining being close to someone feels
"good" (that is hardwired). BECAUSE it feels good, our reasoning kicks in
and tries to figure out how to do that again.

> The point of all this is: I still think that humans have much more complex
> motivational hardwiring than can be simply explained by positing a
> "happiness" hardwired goal, with hardwired connections between other
> subgoals (e.g. sex) and the happiness goal.

Ok, I said that "becoming happy is the goal of life", but then I meant
"having happy moments is a reason to live".

But if we look at how precisely the mind works, then - in the context of
"goal driven" as used in an earlier post - happiness is not a 'goal' but a
state of mind (feeling) that corresponds to positive feedback to our
neurons. As a result our goals will automatically become those things
that will again cause "happiness", but the goal that you set yourself
has more of a relationship with the Real World than just having the feeling.

Example: "random" action: injecting drugs --> extreme pleasure.
Result: a goal is set, "inject drugs again", which together with
reasoning leads to: get more drugs... get money... steal... figure out
how to steal without getting caught, etc. The extreme pleasure itself
is NOT a hardwired goal, however; if it were, then we'd all try drugs
the first time we heard about them. We don't, because our neurons have
not yet been burned in: before the actual _experience_ of the happiness,
it is easy not to do it.
[The funny thing is that happiness *is* nothing more than a state of
mind in the most literal sense: there is absolutely no reason needed
to feel happy. As a matter of fact, I taught myself where the trigger
is to feel happy and use it to suppress oncoming anxiety attacks.
Being constantly in this state of mind is also called 'enlightenment';
see for example Buddhism. Happiness is triggered, however, in certain
hardwired cases, among which is the _relief_ of misery (discomfort,
pain, stress).]


Just a question... wouldn't it be more logical to first write an AI
that is as smart as a mouse, or something stupid like that?

Carlo Wood <>

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:38 MDT