From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Thu Apr 24 2003 - 16:16:09 MDT
Thomas R Mazanec wrote:
>
> I have always seemed to know that if the universe is infinite, with an
> infinite number of stars, then there is an infinite number of planets
> duplicating the Earth to any desired degree of precision... down to an
> infinite number of "Tom Mazanecs" indistinguishable from myself on the
> atomic scale (I had some strange looks trying to point this out to
> people when I was in high school :-)). But I always thought that, while
> it is impossible to write fiction, it is possible to write fantasy.
> Thus, while an infinite number of dark worlds where Hitler won the war
> exist, there are none where Sauron won his war. Was I wrong? Is there
> really an Inuyasha and Kagome in some alternate Japan a googol
> megaparsecs away? Somewhere out there, I know Robin Hood is actually
> fighting the Sheriff of Nottingham, but is Sonic fighting Dr. Robotnik?
> I am not trying to be sarcastic; I am trying to understand what this
> theory means for what can (and therefore must) exist. An infinite
> number of Tom Sawyers and Sherlock Holmses of course, but an infinite
> number of Bugs Bunnys and Woody Woodpeckers? (not that such a thought
> would be emotionally unpleasant to me... if anything, just the
> opposite).
Suppose there exists an evolved sentient species much like our own - call
them the Corbinians - except that the social challenge of reciprocal
altruism and politics has driven their species in a direction different
from our own; rather than altruism, they have a species-universal drive to
honor contracts with any person with whom they are engaged in a
positive-sum game. They go through their Singularity and create an AI
that lies within *their* species' moral frame of reference, and the upshot
of this is - the hypothesis continues - that citizenship rights are
granted to all the members of that species, but *not* to simulations,
extraterrestrial species, et cetera - such nonpeople weren't in the
honor-bound group.
If so, the members of this species might then have the capability, the
mercilessness, and the emotional immaturity to create Bugs Bunnys and
Woody Woodpeckers as sentient, qualia-bearing individuals.
At an even higher level of improbability we can suppose random
fluctuations assembling SIs with any physically realizable desirability
computation - including morals stating: "Simulate [a world exactly like
this one], except that on April 28th 2003, Eliezer Yudkowsky acquires
magical powers." In that sense, Tegmark's Level IV may not add much over
Tegmark's Level I, except for the mathematical possibility of arbitrarily
lengthy or infinite computations if the computations of this universe are
bounded in space or time. In both cases it seems like the ultimate
probability-measure of an event would be given by its Kolmogorov
complexity - the simpler it is, the more often it will arise within the
hierarchy of simulators simulating simulators, no matter which Level IV
universe you happen to be in - if "which Level IV universe" is even a
coherent question, given that any Level IV universe exists both on its
own and as a simulation within infinitely many other Level IV universes.
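A toy numerical sketch of that weighting (everything in it is
hypothetical: Kolmogorov complexity is uncomputable, so zlib-compressed
length stands in as a crude upper bound on K, and the example histories
are made up):

    import os
    import zlib

    def crude_k_bits(description: bytes) -> int:
        # Crude stand-in for Kolmogorov complexity: the length in bits
        # of the zlib-compressed description, an upper bound on K up
        # to a constant.
        return 8 * len(zlib.compress(description, 9))

    lawful = b"physics" * 1000           # a highly regular history
    patched = lawful + os.urandom(1024)  # the same history plus 8192
                                         # bits of arbitrary patching

    # Relative measure 2^-K(lawful) / 2^-K(patched), expressed in bits:
    print(crude_k_bits(patched) - crude_k_bits(lawful), "bits of penalty")

Every arbitrary bit in a world's specification halves its measure, so
the patched history comes out exponentially rarer than the lawful one.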
We should thus strive to ensure that nice things are predicted with the
maximum probability by the simplest possible explanations. For example,
physics giving rise to biology giving rise to intelligence giving rise to
a Friendly AI is a much simpler Kolmogorov explanation than random
fluctuations assembling a Friendly AI - the latter requires much more
arbitrary initial information. Another way of looking at this is that in
a given universe, much more Friendly AI happens as the result of
altruistic species evolving, rather than Friendly AIs being assembled from
random fluctuations, and this is likely to hold throughout the whole of
Level IV Reality.
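In symbols, and purely as a back-of-envelope reading: if the
evolutionary route to a Friendly AI takes K1 bits to specify and the
random-fluctuation route takes K2 >> K1 bits, then under the 2^-K
weighting the evolutionary route carries roughly 2^(K2 - K1) times the
measure - with K1 and K2 standing for description lengths we cannot
actually compute.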
We can probably assume that the relative measures of events in Reality are
dominated by what we would think of as "naturally occurring" events.
Everything happens somewhere, but almost all of those things happen with
exponentially less measure than the possibility of winning the lottery,
and should accordingly count for less in our calculations.
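For calibration, taking illustrative jackpot odds of one in 150
million: that costs only log2(1.5 * 10^8), about 27 bits, so any event
whose specification demands even a few hundred arbitrary bits is
already exponentially rarer than winning the lottery.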
Let's divide the morally meaningful events into six tiers, each with
different measure:
In the first tier are the ancestral civilizations and the true citizens of
post-Singularity civilizations: either occupying the top tier of a Level
IV reality; occupying a strictly derived regularity in the top tier of a
Level IV reality; or occupying a universe which is being simulated in a
strictly hands-off fashion.
In the second tier are people trapped in SIs with alien moralities and
deprived of what we would regard as citizenship rights. It is possible
that for simple minds (roughly human-sized), the measure of
ancestral civilizations within Corbinian SIs outweighs the measure of
ancestral civilizations in the first tier, because each Corbinian Otaku
could conceivably play with an entire ancestral civilization. Also in
tier two are ancestral civilizations simulated by thermostat SIs for
purposes of investigation into, e.g., the origins of Friendly SIs. Again,
the measure of such tier two ancestral civilizations may conceivably
outweigh those in tier one, especially if some classes of thermostat SIs
have nothing better to do with their time than investigate remote
possibilities leading to their own extermination, such as Friendly SIs
hunting them down for the crime of simulating ancestral civilizations.
In the third tier are relatively exotic and unlikely things for Corbinians
to simulate, such as Kagome and Inuyasha. Why is this exotic? Not
because of the fantasy element; if you presume a dramatic storyteller as
worldbuilder, the fantasy element doesn't add much Kolmogorov complexity.
But Kagome and Inuyasha are psychologically human. Why are *Corbinians*
simulating *humans*? They could, of course, but most of the simulated
people created by Corbinian Otaku will probably be simulated Corbinians.
Thus, all dramatically coherent human fictions about humans occupy the
third tier, because they would require, not Corbinians simulating
Corbinian fictions, but Corbinians simulating human fictions. It would
require Corbinians who are otaku with respect to the anime developed by a
simulated human civilization.
In the fourth tier are weird places that can nonetheless happen relatively
naturally - i.e., worlds that don't have compressible *dramatic* structure
like Middle Earth, but that require relatively little additional
Kolmogorov complexity to specify. The fourth tier may have greater
measure than the third tier, but almost any humanly imagined form of
ritual magic is likely to occupy the third tier rather than the fourth
tier, since "ritual magic" is likely to compress most easily to (happen
most frequently as the result of) Corbinian Otaku rather than some actual
set of physical rules. In other words, if you find a portal to Narnia in
your closet, you should probably guess it's a Corbinian Otaku, rather than
a consiliently arising portal.
In the fifth tier are "natural plus incompressible" events that happen due
to a significant amount - say, a few million bits - of random fluctuations
or arbitrary initial information being added to a Corbinian civilization
or other SI. It is literally impossible for us to imagine such worlds
because our ability to imagine them places them in the third or fourth tier.
In the sixth tier are incompressible events where the initial conditions
would have to occur due entirely to random fluctuations - worlds with
zillions of bits of incompressible information. Note that tier six
includes instances of *all* kinds of worlds found in other tiers, but with
infinitesimal measure relative to those other tiers. Observers finding
themselves in strange places should not bet they live in tier six unless
there is no possible explanation from a lower tier (if that concept is
even comprehensible to observers finding themselves in a tier six world).
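For scale, under the same 2^-K weighting and reading "a few million
bits" as three million purely for illustration: a tier five world is
suppressed by a factor of about 2^(-3,000,000), roughly 10^(-900,000),
relative to its tier four neighbors; tier six's zillions of bits push
its measure unimaginably further below that.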
I would expect that most of the measure of meaningful events lives in
tiers one, two, and three. With any luck, AIs created by unmerciful
species will be rare, giving the fantasy worlds of tiers two and three
relatively little measure.
--
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence