Dangers of human self-modification

From: Eliezer Yudkowsky (sentience@pobox.com)
Date: Fri May 21 2004 - 04:35:43 MDT


Eliezer Yudkowsky wrote:
>
> I think I have to side with Keith. I fear that human
> self-modification is far more dangerous than I would once have liked
> to imagine. Better to devise nutritious bacon, cheese, chocolate,
> and wine, than dare to mess with hunger - let alone anything more
> complex. You would practically need to be a Friendly AI programmer
> just to realize how afraid you needed to be, and freeze solid until
> there was an AI midwife at hand to help you *very slowly* start to
> make modifications that didn't have huge unintended consequences, or
> take you away from the rest of humanity, or destroy complexity you
> would have preferred to keep.

Some examples of possible consequences, off the top of my head:

You've got memories of enjoying cheeseburgers. What happens to the
memories when the sensory substrate of recollection shifts? Are you
going to keep the old hardware around for recollection? Will you add in
a complex system to maintain empathy with your old self?

Your old sense of taste was fine-tuned and integrated into your sense of
pleasure and pain, happiness and disgust, by natural selection. Natural
selection also designed everything else keyed into those systems. If
you pick new senses, do they make sense? Does the pattern subtly clash
with the pattern of systems already present?

Will your new sense of taste be more or less complex than your old sense
of taste? More intense or less intense? If more intense, does the new
sense of taste balance with a mental system that is known to stay sane
only under ancestral conditions of environment and neurology? Consider
the effects on humans of non-ancestral Pringles and chocolate cake,
loads of sugar and salt and fat not present in any ancestral foods.
Adopting a more intense taste system can have the same effect, if the
rest of the mind isn't upgraded accordingly to balance with the
increased intensity of sensation.

Maybe you would prefer to gradually grow into new tastes? What does the
sharp discontinuity of direct self-alteration do to your sense of
personal continuity?

If the new taste sensation is more intense, do you become addicted to
the act of self-modification for more intense sensations?

You're eliminating some of your own cognitive complexity by getting rid
of the complex pattern of the old system. Maybe you would prefer not to
eliminate the old complexity - learn to appreciate lettuce *in addition
to* Pringles?

Can you really appreciate the long-term consequences of altering your
mind this way? Does the new design you decided upon make any sense with
respect to those criteria that you would use if you thought about the
problem long enough?

What is the long-term effect of adopting the general policy of
eliminating old complexity that inconveniences you, and inscribing new
complexity that seems like a good idea at the time?

Other humans share your current taste sensations. Think of your awkward
refusal of foods at dinner, the mainstream artistry of cooking you'll no
longer be able to appreciate. Are you distancing yourself from the rest
of humanity? Lest someone chime in that diversity is automatically
good, let me add that this is one hell of a nontrivial decision.

If you can alter your taste buds any time you feel like it, will it
destroy, or alter, the perceived challenge and fun of cooking? Consider
the effect on baseball if people could just run around the bases any
time they wanted.

And finally, what about all the consequences, and categories of
consequences, that you haven't foreseen? When you imagine the act of
self-modification, you will imagine only the easily mentally accessible
consequences of the act, not the actual consequences. Just because you
can't see the doom doesn't mean the doom isn't there.

-- 
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence
