From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Mon Aug 13 2001 - 17:38:01 MDT
Durant Schoon wrote:
>
> From: Brian Atkins <brian@posthuman.com>
> >
> > I have patiently explained that if the Citizens tell the Sysop to go away,
> > it will. No different than a standard revolution, except it's a lot easier.
>
> This seems weird, kinda. If everyone said "Sysop, leave us. We want to be
> able to abuse our children and one another for ages to come" and the Sysop
> complies...I'm beginning to see how we could all be in a simulation again...
> as unlikely as this sounds...
I'm not sure I agree with the phraseology that Brian used, because I think
there are individual rights that are not subject to majority override. I
do think that if everyone just passionately hated the Sysop's guts, such
that the expected suffering of abused children would actually be
outweighed by the suffering and existential angst of Sysop-haters, even
taking into account the volitional/nonvolitional distinction, then the
Sysop would probably pack up and go. I just don't think this is very
plausible. It requires (a) that abuse be expected to be very uncommon,
(b) that the numbers abused be very few (tending toward individuals rather
than whole simulated planets), (c) that the expected abuse not be too
extreme, (d) that defense beat offense at the limit of technology, and (e)
that the number of people who can't stand Sysops be very large. The
possibility of even a single person winding up trapped in Christian Hell
easily outweighs the hatred of thousands or the annoyance of millions...
though it may not outweigh humanity's potential, if humanity's potential
is truly threatened.
There are also whole scenarios in which the Sysop packs up and leaves, or,
far more likely, never comes into existence in the first place, because it
turns out that there are valid reasons not to implement a Sysop Scenario. An
example would be the classic "A single entity is corruptible"
anthropoargument. All the scenarios I've heard are, IMO, rather far-fetched,
but that doesn't rule out the possibility that there's something I haven't heard yet.
-- -- -- -- --
Eliezer S. Yudkowsky http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence