RE: Reactions to Bostrom's Simulation Essay

From: Lee Corbin (lcorbin@tsoft.com)
Date: Mon Mar 24 2003 - 02:21:52 MST


Robin posted his essay

> http://www.digitalkingdom.org/~rlpowell/rants/simulation_errors.html

Here are some of my reactions.

> I simply think that an intelligence with the capacity to keep the
> state of billions of other intelligences in its head (and collate
> and manage those states and the states of those intelligences'
> environments and so on and so on) would be many, many times more
> complicated than all of the intelligences it is managing combined.

This is certainly a very good point, and must affect anyone's
estimate of *when* such a simulation will be possible.

The most difficult point of the essay, and an admittedly difficult
point of Nick Bostrom's thesis as well, is estimating the amount
of compute power necessary. I believe that Nick said that one
would have to simulate down to the level of atoms, and you are
saying that it must go even beyond that.

I don't think so at all, but rather than make a detailed argument,
I'll just present an analogy. To the human eye, the entire scene
before it appears to be laid out in uniform detail: out of the
corner of your eye, for example, it seems that the world's
information is being conveyed to you at the same density as at
the point you're staring at directly. But it's well known to
students of vision that this is an illusion: when a computer is
used to badly blur every point in one's visual field except the
very central portion one is fixating on at a given instant, one
is not conscious of any difference.
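
To make the analogy concrete, here is a minimal sketch of how a
simulator might exploit the same trick: full detail only within the
observer's focus of attention, coarse detail everywhere else. All
of the numbers in it (the grid size, the focus radius, the two
per-cell costs) are illustrative assumptions of mine, not anything
taken from Nick's paper or Robin's essay.

    import math

    # Assumed cost model: a cell simulated at full detail costs FULL
    # ops per tick; a coarsened cell costs only COARSE ops per tick.
    FULL, COARSE = 1_000_000, 10

    def step_cost(grid_size, focus, radius):
        """Ops for one tick when only cells within `radius` of the
        observer's focus are simulated at full detail."""
        fx, fy = focus
        total = 0
        for x in range(grid_size):
            for y in range(grid_size):
                attended = math.hypot(x - fx, y - fy) <= radius
                total += FULL if attended else COARSE
        return total

    uniform  = 1000 * 1000 * FULL               # full detail everywhere
    foveated = step_cost(1000, (500, 500), 20)  # full detail near focus
    print(f"savings factor: {uniform / foveated:,.0f}x")

On these made-up numbers the foveated tick comes out hundreds of
times cheaper, and the savings grow with the ratio of the whole
scene to the attended region, which is exactly what the
blurred-periphery experiments exploit.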

In just the same way, suppose that you *are* living in a simulation.
On just what are you basing your intuition concerning the accuracy of
the simulation so far? Only on your memories, and those are obviously
under the control of this hyper-intelligent program. From this and
similar arguments it follows that the actual compute power necessary
to render your current experience has been greatly overestimated.
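
For a sense of scale, here is a back-of-the-envelope comparison in
the spirit of the ballpark figures Nick's paper trades in (roughly
10^16-10^17 ops/sec to emulate one brain at the neural level).
Every constant below is a rough assumption for illustration, not a
measured quantity.

    # Rendering one observer's experience at the neural level versus
    # simulating that observer's body atom by atom.  All constants
    # are rough assumptions for illustration only.
    NEURAL_OPS_PER_SEC = 1e17   # upper ballpark for emulating one brain
    ATOMS_IN_BODY      = 7e27   # rough atom count of a human body
    OPS_PER_ATOM_STEP  = 10     # assumed cost per atom per timestep
    STEPS_PER_SEC      = 1e12   # assumed picosecond-scale timesteps

    atomic_ops = ATOMS_IN_BODY * OPS_PER_ATOM_STEP * STEPS_PER_SEC
    print(f"atomic vs. neural: {atomic_ops / NEURAL_OPS_PER_SEC:.0e}x")

On these crude assumptions, rendering experience at the only level
the observer can actually check is some twenty-odd orders of
magnitude cheaper than simulating the underlying atoms, which is
the sense in which the necessary compute power has been greatly
overestimated.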

Moral Issues
------------

In a not unrelated way, the same thing applies to one's notions of
exactly how much pain was caused by the Mongol invasion of China,
because when it really comes down to it, you can testify about no
one's pain but your own. Suppose, however, that someone you love
is badly mangled in an automobile accident, and apparently undergoes
great suffering. Note that except at the particular moments when you
are actually communicating with this individual, most of the testimony
as to the person's suffering has to do with the individual's memories
of that suffering. Perhaps the only actual suffering that ever occurs
happens at the precise moments when you are engaging certain of this
loved one's faculties, and at all other times only the *memories* of
suffering are being added.

(This is part of the larger point that heretofore in human history,
memory accumulation and experience have been inseparable. But in
the near future, as was depicted in the movie Total Recall, this
need no longer be the case. One can have memories of events
without actually undergoing the corresponding experiences, and one
can have experiences without accumulating memories! I won't digress
here on the difference between portrayals and emulations, but the
moral issues, of course, concern only emulations. I.e., we must
assume that a number of real human beings actually experienced
charging into China on small ponies.)

Therefore this* may be a nearly pain-free emulation; indeed, I'm
surprised that theologians haven't used a form of the above
argument to conclude that God did, after all, create the best of
all possible worlds, vindicating Dr. Pangloss.

Lee Corbin

*The tricky word "this" is a multiple-valued pointer, referring
both to simulations and to originals in the multiverse; here I
mean to pick out, perhaps with an outside observer's help, one
particular simulation.


