Re: Beyond evolution

From: Samantha Atkins (samantha@objectent.com)
Date: Sun Feb 04 2001 - 21:03:14 MST


"Eliezer S. Yudkowsky" wrote:
>

> Advice - freely offered, freely rejected.

What does it mean to reject the advice of a Being that controls the
local material universe down to a very fine level and will not allow
disagreement that leads to actions it decides are possibly harmful to
the sentiences in its care? Where is the freedom? I see freedom to
disagree, but not to act fully on one's disagreement.

>
> Build another SI of equal intelligence - sure, as long as you build ver
> inside the Sysop.
>

What for? That would rather defeat the purpose of having more than one
local Entity of such power. A single entity is a single point of
failure of Friendliness, and a great danger.

> Build an Ultraweapon of Megadeath and Destruction so you can see how it
> works - sure, as long as there's a bit of Sysop somewhere inside the
> trigger making sure you don't point it at the Amish communities on Old
> Earth.

Building ultra-weapons is not at all the point or anything I intend.

>
> Build an Ultraweapon that you can aim anywhere, with no Sysopmatter
> (visible or not) anywhere near it - you might still be able to get away
> with this, as long as the Sysop can predict the future with total
> certainty and predict that you'll never abuse the Ultraweapon, regardless
> of any external influences you encounter. Probably no human, even Gandhi,
> is subject to this prediction, but an uploaded Gandhi turned transhuman
> might be.
>

Again, the Sysop arrogates all decisions and all wisdom to itself. Why
not upgrade its uploads toward their own ever-increasing wisdom? Why
have a Cosmic Mom for all time?

>
> Under absolutely none of these circumstances does the Sysop need to strike
> back at you. Ve just gives you an API error.
>

Err, that already assumes precisely my point. This being is effectively
God. You exist only within it and as it allows. Are you really willing
to take on the building of such a thing? Are you so convinced it is the
Only Answer?

> > What if ornery sentiences simply do not want to have to pass all
> > decisions through this Sysop, no matter how intelligent and benign it
> > may be? This is not an unexpected situation. What will the Sysop do in
> > those cases? What if some group of sentients decided that what the
> > Sysop considered an unacceptable risk was perfectly acceptable to them?
> > Why would the Sysop want to forbid all entities that disagreed from
> > going somewhere outside its territory? Can't stand the possibility of
> > competition or that something might not be under its metaphorical
> > thumb?
>
> For all I know, it's entirely okay to fork off and run under your own
> Sysop as long as that Sysop is also Friendly. (People who chime in about
> how this would dump us into a Darwinian regime may take this as an
> argument against Sysop splitting.) The static uploads may even form their
> own polises with different operating systems and rules, with the
> underlying Sysop merely acting to ensure that no citizen can be trapped
> inside a polis.
>

But by your earlier response, this second Sysop cannot be built except
totally within the first, so in no real sense is it independent. I am
also concerned by the phrase "static uploads". Do you mean by this that
uploads cannot grow indefinitely in capability? If so, then why on
Earth (or beyond it) would I or any other trans/post-human agree to
such a thing?

> > How is it good for humans, being just the type of ornery
> > independent creatures that we are, to have a benign Sysop rule over us?
>
> This brings up a point I keep on trying to make, which is that the Sysop
> is not a ruler; the Sysop is an operating system. The Sysop may not even
> have a public personality as such; our compounded "wishes about wishes"
> may form an independent operating system and API that differs from citizen
> to citizen, ranging from genie interfaces with a personality, to an Eganic
> "exoself", to transhumans that simply dispense with the appearance of an
> interface and integrate their abilities into themselves, like motor
> functions. The fact that there's a Sysop underneath it all changes
> nothing; it just means that your interface (a) can exhibit arbitrarily
> high levels of intelligence and (b) will return some kind of error if you
> try to harm another citizen.
>

Let's see. The Sysop is a superintelligence. Therefore it has its own
agenda and interests. It controls all aspects of material reality, and
all virtual ones that we have access to. That is a good deal more than
just an operating system. And what, precisely, constitutes harm of
another citizen in the Sysop's eyes? For entities in a VR playing with
designer universes of simulated beings, experienced from the inside, is
it really harm when those simulated beings maim and kill one another?
In other words, does the Sysop prevent real harm, or every appearance
of harm? What is and isn't real obviously needs answering as well.
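
To make the ambiguity concrete, here is a minimal sketch (Python, with
an entirely made-up Sysop interface; none of these names come from
anything you have specified) of what "returning an API error instead of
striking back" might look like. Notice that the whole dispute is hiding
inside one unimplemented predicate:

class CitizenshipError(Exception):
    """The 'API error' the Sysop returns instead of retaliating."""

class Sysop:
    def is_protected_citizen(self, being):
        # The entire question lives here: does a simulated being
        # inside a designer universe count as a citizen the Sysop
        # must protect, or only as an "appearance of harm"?
        raise NotImplementedError("what is and isn't real?")

    def execute(self, action):
        # The Sysop never strikes back; it simply declines to carry
        # out a request that would harm a protected citizen.
        # (action.targets, action.harms, action.run are hypothetical.)
        for target in action.targets:
            if self.is_protected_citizen(target) and action.harms(target):
                raise CitizenshipError("request would harm " + repr(target))
        return action.run()

Until you can write down is_protected_citizen, "ve just gives you an
API error" is not an answer; it is a restatement of the question.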

> Things might be different in the transhuman spaces - I can guess for
> static uploads only. And the above scenario may not be true for everyone,
> but it is certainly much more likely to be true for people who resent the
> "rule" of the Sysop.
>

Again, what does *static* mean in the above? And I can't really parse
what you had in mind with the rest of this paragraph.

> > It strongly goes against the grain of the species. How will the Sysop
> > deal with the likely mass revolt? What will "Friendliness" dictate?
> > Simply wait it out as it holds all the cards?
>
> Yep. Again, for static uploads, the Sysop won't *necessarily* be a
> dominant feature of reality, or even a noticeable one. For sysophobic
> statics, the complexity of the future would be embedded entirely in social
> interactions and so on.
>

If it is present at all, it will be noticeable, except to those who
purposely choose to design a local space in which they do not see it.

> > Will the Sysop be sure
> > this is actually being "Friendly" to the type of creatures we are?
>
> If it's what we say we want.
>
> > Are you sure?
>
> Of course not. You could be right and I could be wrong, in which case -
> if I've built well - the Sysop will do something else, or the seed AI will
> do something other than become Sysop.
>

OK. If the outcome is not the Sysop, what are some of the alternate
scenarios you could see occurring that would be desirable?

> >
> > What if I simply want an extended vacation from Sysop controlled space?
> > From what you have said, if I decide I want to extend that permanently
> > the Sysop will say no. Interesting. Do you honestly think humanity
> > will put up with this? Do you honestly think it is ok to effectively
> > force them to by leaving no alternative?
>
> Yes. I think that, if the annoyance resulting from pervasive forbiddance
> is a necessary subgoal of ruling out the space of possibilities in which
> citizenship rights are violated, then it's an acceptable tradeoff.
>

If the citizens have no choice, then there is no morality. There is
only that which works by the Sysop's rules and that which does not. In
such a universe I see little impetus for the citizens to evolve.

> Please note that in your scenario, people are not all free free free as a
> bird. In your scenario, you can take an extended vacation from Sysop
> space, manufacture a million helpless sentients, and then refuse to let
> *them* out of Samantha space. You can take actions that would make them
> *desperate* to leave Samantha space and they still won't be able to go,
> because the Sysop that would ensure those rights has gone away to give you
> a little personal space. I daresay that in terms of the total integral
> over all sentients and their emotions, the Samantha scenario involves many
> many more sentients feeling much more intense desire to escape control.
>

The Sysop is refusing to let me out of Sysop space. Truthfully, we have
no idea how various sentiences will react to being in Sysop space, no
matter how benign you think it is. And your hypothetical space where I
torture sentients is an utter straw man.

- samantha


