Emulated Realities (was Re: Rationalizing Suffering)

From: Paul Fidika (Fidika@new.rr.com)
Date: Wed Apr 02 2003 - 12:23:41 MST

Lee Corbin wrote:
> Suppose it's twenty years in the future and I'm playing the newest total
> immersion version of this game. I'm booted out of the game once it's
> apparent that my character is about to die horribly but everyone else watches
> me die horribly. Is this morally wrong?
> ....
> Suppose that for the ultimate in realism and to truly "live the game", I
> decided to accept the temporary blockage of all outside-game knowledge.
> Until I "die" inside the game, I won't remember/know about anything outside
> the game, but once I "die", I will go on living my normal life outside the
> game. Is this morally wrong (Assume that we're advanced enough that there
> is no way in which in-game events can harm, much less traumatize, my
> outside-game self)?

I was amazed when I first read this, because I've had thoughts along these
lines in the past as well. Firstly, let's go to the uber-video-game universe
where you're completely immersed, with your memories of your real self
temporarily blocked out, and all you can remember is memories from the game
(we will probably see such games appear before the end of the century). If
your character dies in the game, but you are pulled out the instant before
death and experience no pain (or a minimal amount), but all the other
players believe that you are indeed dead and gone, then this is definitely
morally wrong, because presumably some of the other players will feel mental
anguish at your loss, which is suffering for them, even if their full
(non-blocked-memory) selves consented to let themselves suffer in the game.

But an even more disturbing idea is this: what about the AIs living within
the game? When they "die," they're not pulled out, because they have nowhere
to go, so perhaps they regenerate (are reborn, perhaps) somewhere else. But
this still leaves the other players and AIs suffering over the loss of their
comrade, even if ve didn't experience any pain at the moment of death.

What I'm getting at here is, at what point is it morally reprehensible to
kill an AI in a video game? Most people would just laugh this notion off,
but I'm serious here. People have spent a lot of time lately debating and
studying whether video games or movies corrupt us, our children, or whatever
crap Congressman so-and-so wishes to dredge up to pass the blame whenever
some kid somewhere goes berserk and shoots a few other kids / teachers. But
I don't believe I've heard anyone say it's immoral to kill AIs
in video games. If an AI has subjective experiences, such as pain when ve
dies or sorrow when vis partner dies, then I believe it would definitely be
wrong to kill / harm / control / manipulate ver. I'm not saying that any of
the AIs in today's video games are experiencing pain, since 1: they're
mostly all table-driven agents anyway (e.g., if player does X, AI does Y),
and 2: even the most sophisticated AIs don't have nearly enough
hardware-power to have subjective experiences, and even if they did, too
much of it is being used up just generating realistic graphics to allow any
really sophisticated AIs to exist. The AIs in video games today are
incredibly stupid, perhaps around the same intelligence level as a bacterium
or an ant at best, but they are getting smarter. I've seen several games in
which the AIs are able to beg for help, limp around when wounded, or go
through very painful-looking death-throes when dying.
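To make the "table-driven agent" point above concrete, here is a minimal
sketch in Python. The stimulus-to-response table and the action names are
hypothetical inventions for illustration; real game AIs are fancier, but the
principle is the same: a fixed lookup, with no internal state that could
plausibly constitute a subjective experience.

```python
# A minimal table-driven game agent: "if player does X, AI does Y."
# The table entries below are made up for illustration.
RESPONSE_TABLE = {
    "player_attacks": "flee",
    "player_approaches": "greet",
    "agent_wounded": "limp",
    "agent_health_zero": "play_death_animation",
}

def table_driven_agent(percept: str) -> str:
    """Map a percept directly to an action via lookup; idle otherwise."""
    return RESPONSE_TABLE.get(percept, "idle")

print(table_driven_agent("player_attacks"))  # flee
print(table_driven_agent("sunset"))          # idle
```

However convincingly the "limp" or "play_death_animation" responses are
rendered on screen, nothing in the agent itself is doing anything beyond a
dictionary lookup.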

With hardly anyone other than myself considering such possibilities, I could
very well see the whole Roman-arena thing happening again in the future.
I.e., we look back now and say "how could the Romans actually watch and
enjoy real people being torn to pieces before their very eyes?", but perhaps
in the future sentients will say about us "how could Homo sapiens
sapiens actually watch and enjoy real sentients being torn to pieces before
their very eyes?"

> Now, can anyone prove that we aren't living the last scenario?

Nope, I sure can't. In fact, I'm even inclined to believe that perhaps we
are in some elaborate game; that would explain why some people seem so
stupid and act so predictably and mechanically. Apparently even in the
future they still don't have good video-game AIs... ;-p

If this is a game, though, it's certainly a very cruel one. I've been quite
sick in the past, and it was certainly very painful, something which I
wouldn't exactly call "entertainment." It's possible that I never really
was sick, or that the Mongol invasion of China or World War II never
happened, that they were all just "backstaging" set up by the programmers
(Gods) of the game, but that still doesn't mean there isn't suffering. If I
stab myself right now, I'm quite confident that I'll feel pain and suffer;
it's not just some memory implanted there by the God running this game. If
this is a game, why do we have to suffer? To make it seem "realistic"? Our
memories are blocked after all, so how would we know what is "realistic" or
not, other than from what we've learned in the game? For a sentient-created,
intelligence-driven universe, suffering of the magnitude people feel (mental
and physical) is entirely unnecessary, and this, if for no other reason, is
a good argument that this is the "ultimate reality" (i.e., that we're not
just in a game, sim, emulation, etc.), because no even quasi-moral sentience
would put people through that.

If this really were a game, I doubt I'd come back and play it again after I
die and am "pulled out" of the game. Or perhaps I am a game-AI, and will
never get the chance to be "pulled out"?


This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:42 MDT