From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Fri Sep 28 2001 - 01:44:25 MDT
Jeff Bone wrote:
>
> > But life is unpleasant, sometimes very unpleasant. And life is not a
> > trade, where you agree to experience some unpleasant things in exchange
> > for the good parts. That is simply a nitwit philosophical idea that was
> > dreamed up to rationalize away the discomfort from living with gratuitous
> > unpleasantness that humanity was powerless to do anything about.
>
> Not defending or attacking that attitude, but: "humanity" as such is powerless
> to do anything at all, because it is an abstraction without any volition or
> capacity to act in / of itself.
In this case, "humanity" was not intended to refer to any higher-level
social phenomena emergent from individual humans, but was simply being
used as shorthand for "the set of all humans". Substitute "99%" for "all"
if you like.
> > Pain is
> > not necessary for growth, or to give life meaning, or to teach us
> > responsibility, or any other nitwit philosophical reason.
>
> Not disputing this point of view, but: prove it. This is an extraordinary
> claim, and extraordinary claims require extraordinary proof. It's contrary to
> most philosophical viewpoints, and someone who seeks to code what they believe
> as a core philosophy into a compulsory "operating system" for the world should
> be ready to defend said philosophy. I do not disagree with you, but I want to
> see your defense.
Well, let's get the Friendliness misapprehension out of the way first; I
am not proposing to compulsorily prohibit all pain. I am proposing that
pain must be voluntary, and that inflicting involuntary, unconsented-to
pain on other sentient beings should be prohibited (by direct
intervention). Under these local conditions, if most people come to agree with
the philosophical position that pain is unnecessary, then the emergent
result is that most pain will cease to exist.
The requirement to defend the uselessness of pain results from the need to
argue with people who say: "The Singularity will give people the ability
to choose to eliminate pain, and people will make that choice even if it
is unwise, and pain is necessary; QED the Singularity is bad."
I don't have time to argue everything, so I'll confine myself to making
the following observations:
1) Given what we know about cognitive dissonance, and also about what
lousy philosophers humans tend to be, it is quite plausible that all the
philosophy dealing with the necessity of pain is total bull. This
argument doesn't prove that it is bull, but it is sufficient to
demonstrate the invalidity of saying, or even offering as a supporting
argument, "A lot of people believe it, so there must be something to it."
2) Pain may sometimes play a causal role in a chain of events leading to
a positive result. Furthermore, albeit more rarely, it may be impossible
to plausibly or easily substitute for the pain something else that would
produce the same result. I am thinking particularly here of some emotional
changes that seem to verge on pure neurochemistry. However, if you're an
upload, you can replace any cognitive realizations that result from
biohuman-pain with improved intelligence and the ability to make the
realizations through pure abstract reasoning, and you can replace the
neurochemistry and hardwired emotional linkages by simple fiat.
3) Given (1) and (2), I assert that the ball is now in the other player's
court; they are required to give a specific example of a desirable
scenario involving pain, in which the need for the pain cannot be
eliminated by either improved intelligence or rewired emotions.
> > If you were
> > constructing a world from scratch it would not contain child abuse
>
> Again, not disputing you, but I suppose that depends on who you are, doesn't
> it? My point is not to defend or advance any particular position, but only to
> point out the number of as-yet ungrounded and unexplained assumptions and
> assertions, here.
The variable "you" may be taken to indicate "most people", with additional
weight given to "as rationality increases", "as intelligence increases",
and "philosophical systems that a human third party would likely wish me
to pay some attention to".
> > And similarly, the fun I have, regardless of how it measures up compared
> > to an "average" pre-Singularity life,
>
> You say "regardless" but --- interwoven throughout the argument, and previous
> ones --- are assumptions about the Singularity. As you have previously and
> adamantly asserted the unknowability of post-Singularity existence, I find it
> odd that you fall back on arguments that assume positive (or negative)
> qualitative differences between pre- and post-Singularity life. I would say
> "check your premises" but in this case I think first you should make them
> explicit. We may not even need or want to be individuals post-Singularity, so
> your argument may be as sensible as prokaryotic and eukaryotic cells arguing which form
> is "superior."
Unknowability is not the same as being unable to say "better or worse".
If we don't want to be individuals, and we aren't individuals, that's a
net improvement. This holds for a lot of "If we want X, and X ensues,
that's a net improvement". X can be very widely variable, or even
completely unknown, and still allow a net conclusion of greater
desirability; not an absolute conclusion, of course, but still the more
likely conclusion compared to the negative.
> > Some of the unpleasantness in my life is due to choices I made willingly,
> > but I will not forgive any entity that turns out to be responsible for
> > having made those choices necessary.
>
> So you are not responsible for everything that has ever happened to you?
No, I am responsible. Remember, I think that this world is real,
nonsimulated, and physical. This means that I am the most intelligent
entity in the system that produces my choices, and that my beliefs and
morals and other self-components operate naturally to produce my choices,
which means that I am "responsible" for my choices under the
common-to-virtually-everyone set of hardwired emotional processes that
assign the intuitive perception of moral responsibility.
> Respectfully but gravely, this sounds like the symptoms of a dangerous memetic
> infection called "victimhood." It is unfortunately epidemic in our society.
Well, since you have called for precision, I will note the precision of your use
of the phrase "sounds like". I would dispute it, but I can hardly argue
with your patently true statement that it sounds to you like victimhood.
I guess your perception of the emotional connotations of the phrasing I
used conflicts with my own perception, because to me "Some of the
unpleasantness in my life is due to choices I made willingly" does not
sound like a confession of victimhood. I had thought the phrasing I used
was unambiguous - I take responsibility for my own choices, but if it
turns out that some other entity is responsible for reducing my options
and for making unnecessary sacrifices falsely appear to be necessary, I
will be severely annoyed, and it seems very unlikely that entity's actions
could meet any moral standard I care to recognize.
-- -- -- -- --
Eliezer S. Yudkowsky http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence