RE: Volitional Morality and Action Judgement

From: Ben Goertzel (ben@goertzel.org)
Date: Mon May 24 2004 - 13:01:04 MDT


> Marc Geddes wrote:
> > More likely any general intelligence necessarily has
> > to have: a 'self', consciousness,

Well, I think that consciousness will naturally be part of any
intelligence.

As for self, I agree that a self-modifying AI, or an environmentally
embodied AI, needs to model itself with at least modest accuracy.  A
powerful non-self-modifying AI that didn't need to interact with any
environment MIGHT be possible to build, and if so, might not need a
self.  I'm not sure whether the latter is possible, but it doesn't seem
a priori impossible.

> > some degree of
> > observer centeredness, some non-altruistic aspects to
> > its morality, some input from the 'personal' level
> > into its morality, and helping the world would only be
> > a *secondary* consequence of its main goals.

These latter things don't seem to be intrinsic to general intelligence,
though they're going to be part of any AI that learns to think via
questing for survival in an environment.

-- Ben G
