From: Eliezer S. Yudkowsky (firstname.lastname@example.org)
Date: Wed Mar 14 2001 - 14:04:20 MST

Nick Bostrom wrote:
> Steve Nichols wrote:
> >But logically I feel Nick is *wrong* ... since he fails to include 4. That we
> >could be the ancestors leading the original lives on which any future
> >simulations are based!
> I show that 4 has negligible probability.

Ah, yes, but it doesn't have negligible moral weight. Suppose that
there's one original Eliezer and a billion imitators. The chance that I'm
the original is only one-billionth; however, the actions of that original
would carry a billion times as much weight. The original still
necessarily exists, and we wouldn't want that original to be misled by the
original Nick Bostrom, right? Ergo, the whole possibility may cancel
right out of the moral equation...

Even if it doesn't cancel out, this apparent world of the true Eliezer,
where the whole Singularity is at stake, weighs far more in my
calculations than whatever reasons a hypothetical exoself could have for
wanting to relive my life - even if the probability of being the original
is only one in a billion.

Right now, I am Eliezer, whether or not I started out that way. I have
Eliezer's memories and personality. And Eliezer doesn't give a damn about
the pathetic entertainments of citizens who are hanging back and reliving
other people's lives instead of joining the greater transhuman community.
If this is a simulation, then bringing other citizens to that realization
was probably the reason the post-Singularity Eliezer licensed out vis
human life in the first place, if ve did.

So as far as I'm concerned, the possibility that this is all a simulation
doesn't move me. Who gives a damn if my exoself winds up being unhappy?
I didn't force ver to license these memories. As far as I'm concerned,
the actions of the original Eliezer and my current actions at this moment
are isomorphic. I, in my capacity as the original Eliezer or a mental
isomorph of him, attach negligible moral weight to the happiness or
unhappiness of any future folk who want to rerun my life in simulation; if
they wanted to be happy, they should have modified the simulation.

Furthermore, if this is a simulation, it seems pretty likely that I'm
living out the exact and original life of Eliezer, as best it could be
reconstructed from his emails or even memories - which means that *this*
email, this *exact* line of reasoning, would have appeared back in the
original pre-Singularity world! Obviously, then, whatever thoughts I am
having right now appear in the original pre-Singularity moral equation,
where they vastly outweigh the consequences to any volunteer exoselves who
only survived in the first place because someone successfully brought off
the original Singularity.

Besides which, I still think this world is probably the real one.

-- -- -- -- --
Eliezer S. Yudkowsky http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:36 MDT