From: Will Pearson (firstname.lastname@example.org)
Date: Tue Apr 30 2002 - 10:21:14 MDT
> 2) I tend to agree with you on this, Will. I imagine that in any
> intelligent self-modifying system there may be significant "goal drift",
> similar to genetic drift in population biology. Not because we will
> necessarily have an evolving population of digital minds (though we might),
> but because in my view cognitive dynamics itself is highly evolutionary in
Agreed. Although you could view having a population of these things as expanding the pool of possible actors, which should be a good thing. You might also get the benefits of different evolutionary pathways, à la island-model GAs.
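(For anyone unfamiliar with the island model: populations evolve in isolation and occasionally exchange their best individuals, so each island can follow its own evolutionary pathway. A minimal sketch, using a toy OneMax objective and a ring migration topology of my own choosing — the function names and parameters here are illustrative, not from any particular GA library:)

```python
import random

random.seed(0)

def fitness(genome):
    # Toy objective (OneMax): maximise the number of 1-bits.
    return sum(genome)

def mutate(genome, rate=0.05):
    # Flip each bit independently with the given probability.
    return [1 - g if random.random() < rate else g for g in genome]

def evolve_island(pop, generations=20):
    # Simple truncation selection: keep the top half as parents,
    # refill the population with mutated copies of random parents.
    for _ in range(generations):
        pop = sorted(pop, key=fitness, reverse=True)
        parents = pop[: len(pop) // 2]
        pop = parents + [mutate(random.choice(parents)) for _ in parents]
    return pop

def island_model(n_islands=4, pop_size=20, genome_len=32, epochs=5):
    islands = [[[random.randint(0, 1) for _ in range(genome_len)]
                for _ in range(pop_size)]
               for _ in range(n_islands)]
    for _ in range(epochs):
        # Each island evolves independently...
        islands = [evolve_island(pop) for pop in islands]
        # ...then migration: each island's best individual moves to the
        # next island in a ring, replacing that island's worst member.
        best = [max(pop, key=fitness) for pop in islands]
        for i, pop in enumerate(islands):
            worst = min(range(len(pop)), key=lambda j: fitness(pop[j]))
            pop[worst] = best[(i - 1) % n_islands]
    return max((max(pop, key=fitness) for pop in islands), key=fitness)

champion = island_model()
print(fitness(champion))
```

The isolation between islands is the point: a bad mutation (or, by analogy, a drifted goal) is confined to one island until migration, rather than spreading through a single shared population at once.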
> Eliezer's view of cognition is a bit different, which partially
> explains his different intuitive estimate of the probability of significant
> goal drift.
I think I might be at the opposite end of the scale, with you in the middle and Eliezer at the other end. I wonder how catastrophic the drift might become; for example, the goals might drift to the point where the system thinks creating a virus is a good idea, or something else that is detrimental to its own intelligence.
A population of these might minimise the risk, but one accidental deviant could have a big effect on the others.
I think the dynamics of these sorts of systems are so complex that they will have to be tested in order to convince people one way or the other. But for now I shall work on systems with non-self-modifying goal systems, with the rest being self-modifiable.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:38 MDT