Sadness on voluntary death (was: Singularity Memetics)

From: Eliezer S. Yudkowsky
Date: Wed Jan 30 2002 - 23:32:27 MST

Ben Goertzel wrote:
> > I'll be deeply saddened if a majority or even a substantial minority of
> > six billion people choose to voluntarily die,
> I'm not sure why you'd be deeply saddened by this, Eli. Your compassion is
> laudable, but I worry that it is accompanied by a certain lack of empathy
> and understanding for world-views very different from your own.

I think that perhaps we have different ideas of why empathy and
understanding are important. When I see a hurt, I empathize with the
hurt, and I want to heal the hurt. I don't say "Gee, what a nice way to
be hurt, I wish I were hurt like that". I don't believe in a cultural
relativism of pain. I respect your right to make your own choices, even
wrong choices. It doesn't make the choices any less wrong.

> It doesn't really harm the overall evolution of mind in the cosmos if this
> happens.

Really? Well, maybe not; my species is too young for me to know that sort
of thing. And I'm not going to make the inevitable comparisons to the
various bad guys of history and their excuses, because so far it sounds
like you're saying this out of genuine respect rather than callous
disregard. But I think we do lose something. And I think we still lose
something even if it's a death by choice. Every human's death diminishes
me; voluntary deaths less than involuntary deaths, but still.

> And is it *bad* for the people involved if it's their choice? I don't see
> why, necessarily.

Because they will die. Because they will hurt. Because they will
diminish with age instead of growing. Because their deaths will be
meaningless, and utterly unnecessary. Because they will have made the
wrong choice under their own moral premises, having refused the
intelligence needed to extrapolate those premises toward their true ends.

> A life lived towards death is different than a life lived towards
> immortality.
> Each type of life has its own aesthetic integrity, in my view.

We've been hurt for so long, and so unavoidably, that it's not surprising
that philosophies spring up that try to diminish the mental hurt by
providing plausible-sounding arguments for the "necessity" of various
gaping wounds in the world. Those arguments are completely bogus and
would never have been invented in a world without those wounds. And I see
your argument for the aesthetic integrity of death and pain as being one
of those bogus arguments.

Nobody, in a world without hatred, would invent hatred for aesthetic
reasons.

> My wife does not wish to live forever -- she realizes it may well be
> possible,
> but her choice will be to die "naturally." If I do manage to live
> essentially forever,
> as I hope and mostly believe will happen, then I'll sure as fuck miss her,
> but I won't feel sad for her.

My first reaction was "How can you possibly say this and say you love your
wife?" So I guess we really are speaking from fundamentally different
worldviews here.

If the movie "A.I." had one truthful lesson, it's that immortals shouldn't
fall in love with mortals. I hope that your wife chooses to let herself
listen to the transhuman side of the story, because I believe that
transhumanity is the *correct* choice under the moral premises of
virtually everyone, and that anyone who chooses to listen to the truth,
possibly as explained by an (unbiased, non-manipulative) transhuman, will
choose immortality.

> I think we should accept that there is a strong psychological and
> spiritual positive to this feeling of oneness with nature, and not be
> saddened by people whose lives are governed by this feeling, rather than by the
> alternate feeling (that you and I share) that the self is mostly the mind and
> greater individual intelligence is an extremely important goal.

I don't think we share the same feelings at all. My sadness on seeing a
voluntary death stems from a viewpoint that is closer to a "feeling of
oneness" than "individualism". I think the story of the universe gets a
little sadder whenever a mind ends, even voluntarily.

-- -- -- -- --
Eliezer S. Yudkowsky
Research Fellow, Singularity Institute for Artificial Intelligence

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:37 MDT