From: Durant Schoon (durant@ilm.com)
Date: Sat Mar 17 2001 - 20:18:22 MST
----------------------------------------------------------------------
This post is inspired by Samantha's and Eliezer's comments in the
"Beyond Evolution" thread, applied to the "How To Live In A
Simulation" thread.
I haven't finished reading the interim version of FAI, so if I'm
addressing something already covered there, please just let me know.
----------------------------------------------------------------------
"Sentients As Temporary Variables"
OR
"Memory Leaks in the Caverns of Computronium"
Is the possible future of Friendliness incompatible with the possible
future in which we are all resimulated?
The two scenarios of Friendliness and Resimulation might be
incompatible due to restrictions imposed by the Sysop. (The examples
in this posting assume one can "create an instance of a sentient" some
day in the future.) Up until recently, I had thought that these two
scenarios would both eventually take place, but now I'm wondering if
they are actually mutually exclusive.
Part I: Actions forbidden by the Sysop
Say I create two sentients, ABoy and HisDog, who are very good friends
and are emotionally attached in a positive relationship. Suppose I
create them with simulational complexity equivalent to a modern
ten-year-old human boy and a three-year-old dog.
This is my simulated world; however, the following are not allowed,
because these simulations are sufficiently complex and these sentients
have rights of their own (a toy sketch follows the list):
1) I cannot destroy either ABoy or HisDog (nor torture them, maim
them, etc.)
2) I cannot lie to them and completely manipulate their realities to
make them believe that the other has died horribly, or even that the
one hates the other and wants to be left alone for the rest of
eternity (which would cause a high degree of mental anguish for
either of them, though no one has actually been harmed). Presumably
they have the right to ask the Sysop and learn the truth.
3) I cannot separate them and prevent them from meeting up again of
their own free will, so that they suffer in the absence of the
other (to fulfill my own sadistic desires).
4) I am required to inform them that they were created in a simulation
and that they have rights enforced by the Sysop (virtual Miranda?).
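To make the flavor of these restrictions concrete, here is the toy
sketch promised above. It is purely illustrative Python: Sysop,
Sentient, and SysopViolation are names I invented for this post, not
anything from FAI. The point is only that rules 1-3 read like
operations a runtime could veto, and rule 4 like a flag set at
creation time:

    class SysopViolation(Exception):
        """Raised when a simulator requests a forbidden operation."""

    class Sentient:
        def __init__(self, name, complexity):
            self.name = name
            self.complexity = complexity
            # Rule 4: every sentient is read vis "virtual Miranda" rights.
            self.informed_of_rights = True

    class Sysop:
        def destroy(self, s):
            # Rule 1: destruction (or torture, maiming, etc.) is vetoed.
            raise SysopViolation(s.name + " has rights; cannot destroy")

        def deceive(self, s, falsehood):
            # Rule 2: reality-warping lies are vetoed; a sentient may
            # always ask the Sysop for the truth.
            raise SysopViolation("cannot lie to " + s.name)

        def separate(self, a, b):
            # Rule 3: forcibly keeping friends apart is vetoed.
            raise SysopViolation("cannot keep " + a.name + " and " +
                                 b.name + " apart")

    sysop = Sysop()
    aboy = Sentient("ABoy", complexity="ten-year-old human boy")
    hisdog = Sentient("HisDog", complexity="three-year-old dog")

    try:
        sysop.destroy(aboy)
    except SysopViolation as veto:
        print("vetoed:", veto)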
Memory Leaks: If every time I create a sentience I cannot later
destroy vim, since ve has rights, won't huge memory leaks result? (A
sketch after the aside below makes this concrete.) I assume there will
be a bill of sentient rights, so that each sentient created is given a
minimum of computational resources with which to be happy (i.e., you
cannot overcrowd them by placing them in a tiny environment with no
room (resources) to pursue happiness). All simulators will carry the
following label: "We Serve Only Free Range Sentients".
(Well, maybe this would be OK if sentients agreed to assist in
whatever endeavors they are needed for. For example, if we created
non-killable humans in the 1400s to build merchant ships for Europe,
they could eventually be trained to use Emacs 500 years later (well,
most of Emacs; 500 years might not be enough time :-))
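Here is the promised minimal sketch of the "memory leak" worry, under
one invented assumption: a MIN_RESOURCES quota that the bill of
sentient rights guarantees each sentient. Since no destroy/free
operation exists, every creation permanently commits resources, and
allocation eventually fails:

    MIN_RESOURCES = 1  # hypothetical "free range" quota, arbitrary units

    class Universe:
        def __init__(self, computronium):
            self.computronium = computronium
            self.sentients = []  # grows forever; there is no free()

        def create_sentient(self, name):
            # Each creation permanently commits the guaranteed minimum.
            if self.computronium < MIN_RESOURCES:
                raise MemoryError("no room left to pursue happiness")
            self.computronium -= MIN_RESOURCES
            self.sentients.append(name)

    u = Universe(computronium=3)
    try:
        for name in ["ABoy", "HisDog", "Third", "Fourth"]:
            u.create_sentient(name)
    except MemoryError as leak:
        print(leak)  # the fourth creation fails: the leak has won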
In fact, I wonder: would the Sysop allow me to give ABoy great wealth
(resources) and then take it all away (except for the minimum
requirements), for the sole purpose of making ABoy miserable?
This means that I cannot create any simulation which qualifies as
sentient without certain consequences and responsibilities. Of course,
maybe I could create a sentient being with preloaded memories that
would, in all likelihood, lead vim to commit suicide in two days
(after filling out my very important marketing questionnaire, of
course). Or can I actually do that? Maybe not...
Can I simulate the final days of my favorite suicidal pop star?
Part II: Is simulated suffering the same as real suffering?
TRUE STORY 1: My mother's mother died of cancer when my mom was 13
years old. My mother was given the responsibility of informing her
younger sister and brother. According to her, this whole ordeal was an
extremely onerous and psychologically damaging event in her life. My
mom experienced real suffering.
TRUE STORY 2: As an adult teacher of elementary school children, my
mother was given the task of informing a student of hers that his
father had died that morning. The child burst into tears and my mom
felt terrible. When the child's mother came to school, she too did not
know and my mom had to tell her. Again more tears. However, as it
turned out, the whole thing was a mistake (not my mom's). The person
who died had a similar name, but was not actually the father of the
student. Whoever had contacted the school had made an error. The
student, his mother and my mother had suffered but the cause of the
suffering was illusory. Of course, until they knew the truth, this
suffering felt as "real" as real suffering and is arguably
indistinguishable (while it lasted).
Part III: Incompatibility of Friendliness and Resimulation
If what we call reality is really a resimulation under the guidelines
of a future Friendliness, would the Sysop allow us all to be subjected
to the anguish and the horrors of modern times (if not experiencing
horror firsthand, then witnessing it very close up)? I have
memories of being eight years old and being severely and traumatically
burned (1st to 3rd degree over 20% of my body) when my apartment
building burned down (you wouldn't be able to tell by looking at me
today, though).
If Eliezer's future of Friendliness takes hold, can we conclude that
this reality cannot be a simulated one, for it would not be allowed
(according to 1-4 above and probably a lot more out of FAI) due to the
large sorrow factor of our existence?
Part IV: Possibly Compatible
I can provide some possible counter-examples; I just don't know if
it's easy to believe any of them.
How they might be compatible:
1) Simulated suffering does not count as suffering and is not
prohibited.
i.e., we could be simulations, so all the Tutsis who were brutally
bludgeoned to death by machete-wielding Hutus didn't "really"
experience actual violence.
2) The ones who are hurt are not actually sentient (just "background
characters") - This line of reasoning would be a terrible
justification for violence if it turns out to be wrong.
3) Implanting memories of suffering (and temporarily blocking all
other memories) is not prohibited.
i.e., we could all be future sentients, reliving "old times" and
temporarily blocking all our "other" memories, just to make this
feel more real. We might have agreed to do this for the
experience, just like going to a movie and forgetting our "real"
lives for an hour and a half.
4) Assuming that this reality is a historically accurate recreation of
the true, original reality that led to the Singularity, perhaps
the following rationale is valid: Since everyone lived through the
experience once and the end result was "positive", all the
"negative" experiences are justified as a matter of historical
relevance and for the greater good of perpetuating our reality (in
simulation).
5) It is allowable for sentients to experience suffering as long as
they are told the truth later and are given the option of reverting
to a former, non-psychologically-scarred version of verself.
Did I miss any?
-- Durant