RE: Coercive Transhuman Memes and Exponential Cults

From: Ben Houston (ben@exocortex.org)
Date: Mon Jun 11 2001 - 23:44:43 MDT


>Re what I wrote a week or two ago: unless we are quite clear on what
>benevolent means, we must mean altruistic in the Friendly sense (I
>realize that CFAI uses benevolent in a few places).

BTW, purely altruistic behavior is unstable in an evolutionary sense
(a population of pure altruists is open to invasion by defectors), but
maybe that isn't really relevant to this discussion.

>To make a Unix analogy (because that's what I know best :-)), the
>Sysop is not root, but more like the permission system, only much
>more complex. It can advise against your accessing a file even
>though you have access to it, but in the end it can't stop you, even
>if accessing it is not in your interest, because your volition is
>what matters most: the Sysop may not violate it, and must respect
>it, so long as it doesn't violate anyone else's volition.

Sounds like Freud's concept of the "superego" (i.e. the moral
component of his theory of mind), but in a collective sense? Are you
sure that you want this vicarious selector implemented as an external
regulating super-system, or would you rather have it internalized in
each agent as a component of its own mind? It would be cheaper and
probably more robust to internalize it... but then how would it
differ from the already existing moral center of our brains (i.e.
parts of the prefrontal cortex)? Maybe your Sysop idea is just an
augmentation of our existing moral system...

What is a vicarious selector, you ask? Another concept from
cybernetics / systems theory...
  http://pespmc1.vub.ac.be/VICARSEL.html
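
To make the external-vs-internal question above concrete, here is a
toy contrast in Python. This is entirely my own framing -- the names
and the stand-in moral_check are invented for illustration, not taken
from CFAI or the cybernetics page:

  # Toy contrast: the same moral check placed in an external
  # super-system vs. inside each agent.  All names are invented.

  def moral_check(action):
      # stand-in for whatever the vicarious selector actually computes
      return not action.get("harms_others", False)

  class ExternalRegulator:
      """One super-system vets every agent's actions: a single point
      of control, and a single point of failure."""
      def approve(self, action):
          return moral_check(action)

  class Agent:
      """Each agent carries its own copy of the check, superego-style:
      cheaper and with no central bottleneck, but one corrupted agent
      escapes it entirely."""
      def act(self, action):
          return "done" if moral_check(action) else "refused"

  print(ExternalRegulator().approve({"harms_others": True}))  # False
  print(Agent().act({"harms_others": False}))                 # done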

Cheers,
-ben houston
http://www.exocortex.org/~ben

-----Original Message-----
From: owner-sl4@sysopmind.com [mailto:owner-sl4@sysopmind.com] On Behalf
Of Gordon Worley
Sent: Monday, June 11, 2001 11:59 PM
To: sl4@sysopmind.com
Subject: RE: Coercive Transhuman Memes and Exponential Cults

At 11:38 PM -0400 6/11/01, Ben Houston wrote:
>I am not sure that I understand it from that description, though. It
>seems to be suggesting that one create an overarching benevolent
>controller that looks out for the well-being of all those in its
>realm -- a benevolent god, if you will. If that is the case, I do
>not think such omnipotence is feasible, for the same reason that it
>is very difficult to predict the stock market with accuracy -- it is
>a chaotic recursive system composed of learning entities. I suggest
>that anything less than omnipotence will be ineffective in actually
>controlling the fates of all the individual actors -- there will
>always exist some that are being taken advantage of by others in
>either overt or subtle ways.

Re what I wrote a week or two ago: unless we are quite clear on what
benevolent means, we must mean altruistic in the Friendly sense (I
realize that CFAI uses benevolent in a few places).

Now, on what you've written: this is not really what the Sysop is.
In fact, I think "Sysop" may not be the best way to think of it. To
make a Unix analogy (because that's what I know best :-)), the Sysop
is not root, but more like the permission system, only much more
complex. It can advise against your accessing a file even though you
have access to it, but in the end it can't stop you, even if
accessing it is not in your interest, because your volition is what
matters most: the Sysop may not violate it, and must respect it, so
long as it doesn't violate anyone else's volition.
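
To make the analogy concrete, here's a toy sketch in Python. The
classes and fields are my own invention, just to show where the one
hard limit sits -- nothing like this appears in CFAI:

  class Action:
      def __init__(self, actor, harms_others=False, harms_self=False):
          self.actor = actor
          self.harms_others = harms_others  # violates another's volition?
          self.harms_self = harms_self      # against the actor's interest?

  class Sysop:
      """Permission-system Sysop: advises freely, but blocks only
      when someone else's volition would be violated."""
      def request(self, action):
          if action.harms_others:
              return "denied"         # the one thing it may stop
          if action.harms_self:
              print("advisory: this may not be in your interest")
          return "permitted"          # your own volition is respected

  sysop = Sysop()
  print(sysop.request(Action("you", harms_self=True)))    # warns; permitted
  print(sysop.request(Action("you", harms_others=True)))  # denied

A root-style Sysop, by contrast, would be free to deny anything it
judged bad for you; the point of the analogy is that it can't.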

On a side note, I think the Sysop scenario page needs some work.
Maybe it's already been improved in internal versions that Eliezer
may be working on, gearing up for 1.0, but I think it's vague and
there is a lot more to the idea than is mentioned on that page.
Having read numerous posts about the Sysop, I now know what it is and
isn't, but I think that if I had just read the page, I wouldn't like
it. Actually, I'm still skeptical of our ability to implement the
Sysop Scenario. A Transition Guide makes it more likely, but once we
run into other civilizations we may find out that Friendliness is the
exception to a norm of dystopian Singularities.

-- 
Gordon Worley
http://www.rbisland.cx/
mailto:redbird@rbisland.cx
PGP Fingerprint:  C462 FA84 B811 3501 9010  20D2 6EF3 77F7 BBD3 B003

