Re: Time and Minds

From: aominux
Date: Fri Sep 28 2001 - 15:26:19 MDT

No matter how disgustingly sick this sounds, after spending a certain amount
of time in a hedonistic environment you WILL begin to desire pain. It
doesn't matter whether or not you can conceive this now.

I doubt any of us could go a million subjective years before desiring a
pre-singularity simulation, let alone an infinity. The sick truth is that
we don't desire happiness; we desire sensory stimulation.

Besides, since you could consume an entire subjective human life in a
fraction of a second in post-singularity times, it wouldn't seem like much
of a sacrifice. We will/do desire variety and we will get it.

On a related note, the odds that we are not in a simulation are impossible
to compute, since we would have to assume that we were in a simulation.
However, if we trust that our world is based on a backup of the future's
past, it doesn't take much time to figure the probability of a single human
life being in the "real world." Humans have been around for far under one
million years, and we still have billions of years until our sun expires.
Many, many, many more humans will exist beyond this point in history than
before it. Your chances of experiencing pre-singularity firsthand are slim
to none.
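The arithmetic behind that claim can be sketched quickly. The figures below are my own illustrative assumptions, not numbers from the post: roughly 100 billion humans born to date, a constant 100 million births per year, and on the order of a billion years of habitability remaining.

```python
# Rough anthropic arithmetic: what fraction of all humans who will ever
# live have already been born? (All figures are assumed, for illustration.)
humans_so_far = 1e11       # ~100 billion humans born to date (common estimate)
birth_rate = 1e8           # ~100 million births per year, held constant (crude)
years_remaining = 1e9      # order of a billion years before the sun expires

future_humans = birth_rate * years_remaining
p_pre_singularity = humans_so_far / (humans_so_far + future_humans)

# Under these assumptions the chance of finding yourself among the
# pre-singularity humans is on the order of one in a million.
print(f"P(living pre-singularity) ~ {p_pre_singularity:.2e}")
```

Even with the birth rate held flat rather than growing, the pre-singularity fraction is vanishingly small, which is all the argument needs.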

Even this is pessimistic, because it assumes that when the sun expires
humanity will pass with it. However, once the singularity is achieved, we
will undoubtedly move out of the galaxy anyhow.

Finally, I would appreciate it if you would stop using "Re: Time and Minds"
if your post has nothing to do with the original purpose of this topic:
minds and time travel.

Sincerely apologizing for grammatical errors and sl4 violations,

----- Original Message -----
From: "Eliezer S. Yudkowsky" <>
To: <>
Sent: Friday, September 28, 2001 2:44 AM
Subject: Re: Time and Minds

> Jeff Bone wrote:
> >
> > > But life is unpleasant, sometimes very unpleasant. And life is not a
> > > trade, where you agree to experience some unpleasant things in exchange
> > > for the good parts. That is simply a nitwit philosophical idea that
> > > humans dreamed up to rationalize away the discomfort from living with
> > > unpleasantness that humanity was powerless to do anything about.
> >
> > Not defending or attacking that attitude, but: "humanity" as such is unable
> > to do anything at all, because it is an abstraction without any volition or
> > capacity to act in / of itself.
> In this case, "humanity" was not intended to refer to any higher-level
> social phenomena emergent from individual humans, but was simply being
> used as shorthand for "the set of all humans". Substitute "99%" for "all"
> if you like.
> > > Pain is
> > > not necessary for growth, or to give life meaning, or to teach us
> > > responsibility, or any other nitwit philosophical reason.
> >
> > Not disputing this point of view, but: prove it. This is an extraordinary
> > claim, and extraordinary claims require extraordinary proof. It's contrary to
> > most philosophical viewpoints, and someone who seeks to code what they regard
> > as a core philosophy into a compulsory "operating system" for the world must
> > be ready to defend said philosophy. I do not disagree with you, but I want to
> > see your defense.
> Well, let's get the Friendliness misapprehension out of the way first; I
> am not proposing to compulsorily prohibit all pain. I am proposing that
> pain must be voluntary, and that it should be prohibited (by direct
> intervention) to cause involuntary, unconsented-to pain in other sentient
> beings. Under these local conditions, if most people come to agree with
> the philosophical position that pain is unnecessary, then the emergent
> result is that most pain will cease to exist.
> The requirement to defend the uselessness of pain results from the need to
> argue with people who say: "The Singularity will give people the ability
> to choose to eliminate pain, and people will make that choice even if it
> is unwise, and pain is necessary; QED the Singularity is bad."
> I don't have time to argue everything, so I'll confine myself to making
> the following observations:
> 1) Given what we know about cognitive dissonance, and also about what
> lousy philosophers humans tend to be, it is quite plausible that all the
> philosophy dealing with the necessity of pain is total bull. This
> argument doesn't prove that it is bull, but it is sufficient to
> demonstrate the invalidity of saying, or even pointing out as an argument
> token, "A lot of people believe it so there must be something to it."
> 2) Pain may sometimes play a causal role in a chain of events leading to
> a positive result. Furthermore, albeit more rarely, it may be impossible
> to plausibly or easily substitute something else for the pain that would
> have the same result. I am thinking particularly here of some emotional
> changes that seem to verge on pure neurochemistry. However, if you're an
> upload, you can replace any cognitive realizations that result from
> biohuman-pain with improved intelligence and the ability to make the
> realizations through pure abstract reasoning, and you can replace the
> neurochemistry and hardwired emotional linkages by simple fiat.
> 3) Given (1) and (2), I assert that the ball is now in the other player's
> court; they are required to give a specific example of a desirable
> scenario involving pain, in which the need for the pain cannot be
> eliminated by either improved intelligence or rewired emotions.
> > > If you were
> > > constructing a world from scratch it would not contain child abuse
> >
> > Again, not disputing you, but I suppose that depends on who you are, doesn't
> > it? My point is not to defend or advance any particular position, but only to
> > point out the number of as-yet ungrounded and unexplained assumptions and
> > assertions, here.
> The variable "you" may be taken to indicate "most people", with additional
> focusing for "as rationality increases", "as intelligence increases", and
> "philosophical systems that a human third party is likely to wish that I
> should pay some attention to".
> > > And similarly, the fun I have, regardless of how it measures up
> > > to an "average" pre-Singularity life,
> >
> > You say "regardless" but --- interwoven throughout the argument --- are
> > assumptions about the Singularity. As you have previously and adamantly
> > asserted the unknowability of post-Singularity existence, I find it odd
> > that you fall back on arguments which assume positive (or negative)
> > qualitative differences between pre- and post-Singularity life. I would
> > say "check your premises," but in this case I think first you should make
> > them explicit. We may not even need or want to be individuals
> > post-Singularity, so your argument may be as sensible as prokaryotic and
> > eukaryotic cells arguing which form is "superior."
> Unknowability is not the same as being unable to say "better or worse".
> If we don't want to be individuals, and we aren't individuals, that's a
> net improvement. This holds for a lot of "If we want X, and X ensues,
> that's a net improvement". X can be very widely variable, or even
> completely unknown, and still allow a net conclusion of greater
> desirability; not an absolute conclusion, of course, but still the more
> likely conclusion compared to the negative.
> > > Some of the unpleasantness in my life is due to choices I made willingly,
> > > but I will not forgive any entity that turns out to be responsible for
> > > having made those choices necessary.
> >
> > So you are not responsible for everything that has ever happened to you?
> No, I am responsible. Remember, I think that this world is real,
> nonsimulated, and physical. This means that I am the most intelligent
> entity in the system that produces my choices, and that my beliefs and
> morals and other self-components operate naturally to produce my choices,
> which means that I am "responsible" for my choices under the
> common-to-virtually-everyone set of hardwired emotional processes that
> assign the intuitive perception of moral responsibility.
> > Respectfully but gravely, this sounds like the symptoms of a dangerous
> > infection called "victimhood." It is unfortunately epidemic in our culture.
> Well, having called for precision, I will note the precision of your use
> of the phrase "sounds like". I would dispute it, but I can hardly argue
> with your patently true statement that it sounds to you like victimhood.
> I guess your perception of the emotional connotations of the phrasing I
> used conflicts with my own perception, because to me "Some of the
> unpleasantness in my life is due to choices I made willingly" does not
> sound like a confession of victimhood. I had thought the phrasing I used
> was unambiguous - I take responsibility for my own choices, but if it
> turns out that some other entity is responsible for reducing my options
> and for making unnecessary sacrifices falsely appear to be necessary, I
> will be severely annoyed, and it seems very unlikely that entity's actions
> could meet any moral standard I care to recognize.
> -- -- -- -- --
> Eliezer S. Yudkowsky
> Research Fellow, Singularity Institute for Artificial Intelligence


This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:37 MDT