From: Eliezer S. Yudkowsky (email@example.com)
Date: Sat May 12 2001 - 01:53:18 MDT
Gordon Worley wrote:
> Maybe the real point is that there is no such thing as a friend for
> evolutionary purposes; friends are only created out of cognition.
> Others are merely associations that it is beneficial to act friendly
> towards, maintaining a faux friendship even when it is not always
> immediately a good idea to be friendly.
No, unconditional friendship/altruism/etc. is evolutionarily real;
friendship is not a strict subgoal in the human mind. See "The Origins of
Virtue" by Matt Ridley.
I'm pretty sure I describe the evolutionary utility of unconditional
emotions in FAI at some point, probably the section on anthropomorphic
injunctions or anthropomorphic ethical injunctions.
> Secondly, you change what you mean by selfish. In the first four
> sections of 4.2.2, selfishness seems to mean having a sense of self
> and, as a result, making decisions that benefit the self the most.
> Then, in 4.2.2.5, selfishness suddenly turns into greediness.
> You try to separate the ideas by suggesting human and bacterial
> selfishness, but that isn't very good, because you've overloaded the
> term. Bacteria are greedy, in that they have no sense of self, so do
> whatever has the greatest immediate benefit. Humans have a sense of
> self, and thus think 'hmm, how will this affect *me*?'. There is not
> some kind of unchecked selfishness going on, though, because the
> concepts are different. A person cannot be selfish and greedy at the
> same time: they are mutually exclusive. Well, I should correct
> that: an intelligence can have a sense of self but also be
> irrational, and thus be greedy because of an inability to realize the
> effects of having a self, and is therefore effectively not selfish.
No, "selfishness" is being used in exactly the same sense in both places.
By distinguishing between "selfishness" and "greed" you're assuming the
presence of a bunch of nearby social equals who will stomp you flat if you
get greedy. A transhuman AI has no social equals, and the straight-line
projection says that a selfish transhuman AI with no other cognitive
complexity would act like a bacterium. A selfish human acting like a
bacterium is being foolish, unless there are no other humans to object, in
which case selfishness translates directly to greed.
> Anyway, a lot of this is just a matter of diction, but I think that
> they are very important, especially in a paper like FAI.
I agree, but for the record, I'm a perfectionist anyway; you can scold me
for anything, no matter how trivial.
-- -- -- -- --
Eliezer S. Yudkowsky http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:36 MDT