From: Robin Lee Powell (firstname.lastname@example.org)
Date: Tue Apr 01 2003 - 17:18:45 MST
On Mon, Mar 24, 2003 at 01:21:52AM -0800, Lee Corbin wrote:
> Robin posted his essay
> > http://www.digitalkingdom.org/~rlpowell/rants/simulation_errors.html
> Here are some of my reactions.
> > I simply think that an intelligence with the capacity to keep
> > the state of billions of other intelligences in its head (and
> > collate and manage those states and the states of those
> > intelligences' environments and so on and so on) would be many,
> > many times more complicated than all of the intelligences it is
> > managing combined.
> This is certainly a very good point, and must affect anyone's
> estimate of *when* such a simulation will be possible.
> The most difficult point of the paper, and an admittedly difficult
> point of Nick Bostrom's thesis, is attempting to gather some idea
> of the amount of compute power necessary. I believe that Nick
> said that one would have to simulate down to the level of atoms,
> and you are saying that it must go beyond that.
No. Nick said that the computing power for simulating things other
than human minds was irrelevant. Or at least, he thought it was
irrelevant enough to leave it out of his essay. *I* was the one who
raised the cost of simulating everything else.
> In just the same way, suppose that you *are* living in a
> simulation. Just on what are you basing your intuition concerning
> the accuracy of the simulation so far? Only on your memories, and
> they're obviously under the control of this hyper-intelligent
> being.
Ah, that's a sneaky one. Hadn't thought of that.
But again, how much effort/time on the part of that entity would be
required?
> From this and similar arguments it follows that the actual compute
> power necessary to render your current experience has been greatly
> overestimated.
No, what actually follows is that none of us have a clue.
> Moral Issues
> In a not unrelated way, the same thing applies to one's notions of
> exactly how much pain was caused by the Mongol invasion of China,
> because when it really comes down to it, you can testify about no
> one's pain but your own. Suppose that someone you love, however,
> is badly mangled in an automobile accident, and apparently
> undergoes great suffering. Note that unless at any particular moment you
> are actually communicating with this individual, most of the
> testimony as to the person's suffering has to do with the
> individual's memories of that suffering. Perhaps the only actual
> suffering that ever occurs happens only at the precise moments
> when you are engaging certain of this loved one's faculties, and
> that at other times, only the *memories* of suffering are being
> maintained.
> Therefore not only is this possibly a near pain-free emulation,
> but I'm surprised that theologians haven't used a form of the
> above argument to enable them to suppose that God did after all
> create the best of all possible worlds, vindicating Dr. Pangloss.
You're saying that as long as I didn't actually undergo having my
spine crushed, but only remember it as though I did, it's morally
OK for a being to have created a universe where I could remember
such a thing happening.
I find that completely absurd, sorry.
--
http://www.digitalkingdom.org/~rlpowell/ *** I'm a *male* Robin.
.i le pamoi velru'e zo'u crepu le plibu taxfu
.i le remoi velru'e zo'u mo .i le cimoi velru'e zo'u ba'e prali .uisai
http://www.lojban.org/ *** to sa'a cu'u lei pibyta'u cridrnoma toi
This archive was generated by hypermail 2.1.5 : Thu May 23 2013 - 04:00:39 MDT