Re: Ancestral simulations and happy puppeteers

From: Alden Jurling (nakomus@cnsp.com)
Date: Mon Dec 03 2001 - 23:43:07 MST


In the puppet show, the puppets have no personal identity or free will; they
are effectively an extension of the puppeteer. The immorality starts when you
exchange the puppets/sprites/scripts/animations for self-aware minds. If the
simulated minds are in fact *real* minds rather than the puppeteer
roleplaying, then they would be entitled to the same rights as any other
self-aware mind.
Most people would not object to a puppet show. Many people would and do
object to tormenting animals purely for entertainment. And nearly everyone
objects to slavery.

----- Original Message -----
From: Emil Gilliam <emil@emilgilliam.com>
To: <sl4@sysopmind.com>
Sent: Monday, December 03, 2001 10:18 PM
Subject: Ancestral simulations and happy puppeteers

>
> Eliezer claims:
>
> > A Friendly SI outcome does not allow for nonconsensual
> > simulations, and most ancestral simulations would presumably fall into
> > that category.
>
> I find it difficult to believe that nonconsensual simulations should
> not be allowed, ever ever ever.
>
> Suppose Bob is a puppeteer, known for his masterful manipulation of
> wooden marionettes representing human characters. In one sad skit we
> see a character to whom very unfortunate things happen, and we see
> its expressions of suffering. Bob feels *happy* doing this dramatic
> production. He himself does not feel the character's pain (except in
> the metaphorical sense). If all we saw was the puppet, we would
> conclude that its suffering is nonconsensual. Is Bob's action immoral?
>
> And of course, you see where this is going. Suppose we have six
> billion happy puppeteers controlling six billion "suffering" human
> puppets in a simulation, including some who have the line "I hope I'm
> not in a simulation; if I were, I'd want out of it now!" (Don't
> nominate this drama for a Pulitzer.) Is this immoral?
>
> The possible resolutions:
>
> 1.) What Bob does is immoral and should not be allowed. I truly fear
> for our freedom if a Friendly AI decides that a street puppeteer
> can't even put on an innocuous tragic puppet show (or, for
> that matter, that nobody can perform Shakespeare).
>
> 2.) What Bob does is okay, but what the six billion do isn't allowed. How
> many happy puppeteers would have to get together before their
> sorrowful simulation isn't allowed? On what rational basis could a
> Friendly AI pick such a number without it being somewhat arbitrary?
> There is no such basis.
>
> 3.) What Bob does is okay, and what six billion puppeteers do is
> (generally) okay. Then I think one of the following would be true:
>
> 3a.) All simulations of civilizations, including ancestral ones, are
> okay, or
>
> 3b.) There might be something special about ancestral simulations, as
> opposed to a simulation of a civilization that never historically
> existed. To do a true ancestral simulation would require getting
> information that is (as far as physics knows right now) irretrievably
> lost to history. Either this remains impossible forever, in which case
> it is a moot point, or, if post-Singularity we discover some physics
> that allows this information to be retrieved, the very act of
> retrieving it might involve doing something else that is definitely
> immoral. But note that the immorality would then lie in that
> (currently unknown) "something else", and not in the act of doing a
> simulation.
>
> - Emil Gilliam
