Re: Singularity Fun Theory (was: Ethical basics)

From: Mitch Howe
Date: Sat Jan 26 2002 - 02:16:32 MST

Eliezer Yudkowsky wrote:
>>Note that, by the same logic, it is possible to experience certain kinds
of fun in VR that might be thought impossible in a transhuman world; for
example, reliving episodes of (for the sake of argument) The X-Files in
which Scully (Mulder) gets to save the life of Mulder (Scully), even though
only the main character (you) is real and all other entities are simply
puppets of an assisting AI. The usual suggestion is to obliterate the
memories of it all being a simulation, but this begs the question of whether
"you" with your memories obliterated is the same entity for purposes of
informed consent - if Scully (you) is having an unpleasant moment, not
knowing it to be simulated, wouldn't the rules of individual volition take
over and bring her up out of the simulation?<<

I think this also begs the question of whether an obliterated memory would
make the experience any more entertaining anyway. I say this considering
that humans have for decades been quite entertained by video games that
simulate artificial environments -- and this despite having intact memories
and mental hardware many orders of magnitude more powerful than the system
running the game. Controlling a Mulder puppet may be, to a transhuman, the
equivalent of playing Pong... but hey, some people really enjoyed Pong.

One feature that almost universally enhances the fun factor of any kind of
game is an intelligent (even sentient) teammate or adversary. It is not
inconceivable that even a hyperintelligent, pandimensional being might enjoy
a game of chess if matched up against a comparable opponent.

I think we can trust that transhumans will be able to keep transhumans
entertained.

--Mitch Howe


This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:37 MDT