Re: A Sysop alternative

From: Gordon Worley (redbird@rbisland.cx)
Date: Mon Apr 09 2001 - 08:07:51 MDT


At 6:44 AM -0700 4/9/01, James Higgins wrote:
>Well, I can already see the 1st objection. Go back and look at
>Brian/Eliezer's arguments about not uploading a human first. If you
>need to have AIs & SIs to do this, how do you prevent them from just
>taking over? I'm a little less worried about this than some on the
>list, but it is an important question. The first AIs must have
>something like Friendliness; "more or less Friendly" just won't cut
>it.

I realized that this would be a problem. The best I can do is
suggest a sandbox to experiment in while we figure this out. Of
course, what if the sandbox has bugs? Answer: we may be screwed.

>Second, I can't personally imagine an implementation to enforce
>morals that wouldn't require intelligence. Making these decisions
>requires the ability to reason. A good example of this is some of
>the latest Airbus planes. They have "enforced morals", as such, in
>that they prevent the pilot from taking the plane past certain
>angles they have preset. Now, 99.999% of the time that's great, but
>what if you're about to hit a mountain? I'd rather be uncomfortable
>for a short time and live, than be comfortable right up to my grave.
>The plane can't reason, though, so in this scenario you'd be dead
>(there is no manual override, I understand, btw). This is the kind
>of thinking I want to avoid. One wrong moral, and it could
>artificially force the whole group into extinction.

This isn't really a good analogy. Real morals have nothing to do
with comfort, but with actual survival. You can be incredibly
uncomfortable without violating morals. This is also why I am wary
of actually enforcing the morals. This is mostly me trying to find
an alternative to a Sysop if we have to have some kind of rules. I
agree: if we get it wrong even a little bit, we're all screwed.
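
To make that failure mode concrete, here's a toy sketch in Python
(the limits, numbers, and names are all made up for illustration,
not Airbus's actual control laws) of a hard limit with no override
path:

# Toy "enforced morals": clamp the pilot's input unconditionally.
# Hypothetical limit values, not real Airbus control law figures.
PITCH_UP_LIMIT = 30.0     # degrees, preset at design time
PITCH_DOWN_LIMIT = -15.0

def command_pitch(requested_pitch: float) -> float:
    # There is no parameter for *why* the pilot is asking: a
    # mountain dead ahead and an overzealous stick pull look
    # identical here, and neither can override the clamp.
    return max(PITCH_DOWN_LIMIT, min(PITCH_UP_LIMIT, requested_pitch))

print(command_pitch(45.0))  # pilot needs 45 to clear terrain -> gets 30.0

The rule is obeyed perfectly, right into the mountain.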

>I would also say that Intelligence breaking the "natural morals" is
>probably a good thing, on the whole. Our natural morals are such
>that we should run around killing & eating animals, and screwing
>any female we can to propagate the species. Well over 90% of the
>population has probably never killed anything larger than a mouse;
>some would say this is a good thing. We have also created trade,
>computers, video games, written language, etc. These are not,
>strictly speaking, required by our morals. Morals really represent
>"common sense" most accurately. They are not, and should not be,
>hard and fast rules. Rather, they are guidelines.

This is a common misunderstanding, relating morals too closely to
changes in society. I don't have the time right now to explain this
(I'm sort of in a hurry but wanted to reply).

>Having a non-intelligent system that enforces morals would be
>terribly dangerous for all inside if one or more external SIs
>existed. In such a situation the free SIs could do anything they
>wished to the SIs in the moral reality, who would be restrained
>from protecting themselves. Without a Sysop around to help, they
>would be at the mercy of the external SIs. Not that I personally
>love the Sysop scenario, but having a non-intelligent system is
>even worse.

Well, this is why I don't want any kind of Sysopish thing at all.
So, I guess we agree on the anarchy scenario then?

>I personally think we would probably be just fine individually
>uploaded. Well, as long as a large group of mostly reasonable
>people uploaded at the same time. If SI has a natural attractor for
>friendliness, everything is good. If it does not, things continue
>*mostly* as they are now. Except that everyone is incredibly
>intelligent, no one is handicapped (unless they want to be) and
>everyone has incredible power over the physical universe. At first
>this sounds scary, but most people will want to do the right thing.
>So if 1% of the population is inherently unfriendly, 99% of the
>population would be able to gang up (as it were) and protect against
>that 1%. Maybe putting them in a Sysop controlled virtual reality
>for eternity if they are really bad (i.e., a very nice & cushy prison,
>where you can do anything except affect the "real" world).

I've written stuff to this effect on the list before, but I'm glad
to have someone out there in agreement. It makes it easier to team
up against Eliezer. ;-)

-- 
Gordon Worley
http://www.rbisland.cx/
mailto:redbird@rbisland.cx
PGP Fingerprint:  C462 FA84 B811 3501 9010  20D2 6EF3 77F7 BBD3 B003

