From: Samantha Atkins (email@example.com)
Date: Mon Jan 29 2001 - 19:44:13 MST
Ben Goertzel wrote:
> Again, no time for a thorough response to your paper, but here's a
> You make a very good case that due to
> -- AI's not evolving in a predator-prey situation
> -- AI's not having to fight for mates
> -- AI's not evolving in a predator-prey situation
> -- AI's being able to introspect, and understand the roots and dynamics of
> their behaviors, more thoroughly than humans
Actually, I think this presumes a lot. A la Moravec, I have no reason
to believe that AIs will not eventually compete for various types of
resources. Competition should lead to some type of evolutionary path as
the AIs logically try cooperation, aggression, and other tactics and
evolve to better compete for limited computational space and other
resources. I do not know that AIs will never find any reason to come to
blows with other sentiences.
I do not know that ripping things out of one's mind is so terribly easy,
as many things that have objectionable aspects also have beneficial
aspects, and it is not so simple to get rid of one without giving up the
other.
> and other related facts, AI's are probably going to be vastly mentally
> healthier than humans,
> without our strong inclinations toward aggression, jealousy, and so forth.
I am not sure that mental health is appropriately delimited when
speaking of a non-human mentality with a radically different context.
We do not yet understand minds well enough to judge mental health on a
cross-species basis. I suspect that any intelligent being that comes
into competition with others will eventually exhibit something quite
akin to aggression and jealousy.
> But, the case is weaker that this is going to make AI's consistently and
> persistently friendly.
> There are 2 main points here
> AI's may well end up ~indifferent~ to humans. My guess is that even if
> initial AI's are
> explicitly programmed to be warm & friendly to humans, eventually
> "indifference to humans" may become
> an inexorable attractor...
> There WILL be an evolutionary aspect to the growth of AI, because there are
> limited computer resources and AI's can replicate themselves potentially infinitely.
> So there will be a
> "survival of the fittest" aspect to AI, meaning that AI's with greater
> initiative, motivation, etc.
> will be more likely to survive.
> At least, though, an AI will only need to retain those traits that are
> needed for CURRENT survival;
> unlike we humans, who are saddled with all kinds of traits that were useful
> for survival in some long-past
> situation. This will remain their big advantage, as you point out, in
> slightly different language.
Make that current and projected survival and I will agree.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:35 MDT