From: Brian Atkins (brian@posthuman.com)
Date: Fri Mar 23 2001 - 16:36:59 MST
"Christian L." wrote:
>
> > >
> > > Brian wrote:
> > > >Is it possible to have other scenarios where the sysop does not infect
> > > >all the mass in the solar system, while still ending all evil? I think
> >it
> > > >could be done through heavy surveillance, including both real and
> >virtual
> > > >realities.
>
> <snip>
>
> >Well I think in the after-Singularity era you have to turn such questions
> >around... you have to ask, if we can do something like end all evil, why not
> >do it?
>
> What is this Evil anyway?
>
> Pornography? Child pornography? Cheating on your wife/husband? Abortion?
> Euthanasia? Drug trafficking? Selling drugs to kids? Bullying? Name-calling?
> Racist remarks? Nazism? Communism? Capitalism? Fascism? Satanism?
> Discrimination on basis of race, gender or sexual preference? Child abuse?
> Rape? Date-rape? Having sex with a minor? Pedophilia? Necrophilia?
> Prostitution? Child labor? Screwing up the environment? Genetic engineering?
> Cloning?
>
> It should be clear that the word "evil" is a very sweeping generalization
> that has different meanings to different people (unless the people on this
> list have a definition of evil that is objective and free from
> contradiction, in which case I apologize). To say that "evil" is sure to
> disappear in the future is therefore a rather silly statement in my opinion.
> One ought to be careful about making such statements, in order not to lose
> one's objectivity. The danger is that the discussion will deteriorate into
> quasi-religious mantras:
>
It is pretty well understood by people who have been around this for a while
what we mean when we say "end all evil". Here are some selected quotes from
the past (I'm sure some of this would be reworded now that the whole
Friendliness paper has been written), specifically from a little explosive
debate that popped up on the FoRK mailing list last August. Make sure you
read the last little bit, since it's the most important.
No one is saying this is "for sure" how it will happen, since we can't know
that. This is just the most likely scenario we've been able to come up with
using our unenhanced minds. It is written to give other humans some kind of
idea of what we are working towards.
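Purely as an illustration (a toy sketch of my own; every name in it is
invented here, and nothing like it appears in any actual design document),
the core rule running through the quotes below - no action may affect
another entity without that entity's permission, and nothing may threaten
the Sysop itself - could be modeled like this in Python:

# Toy model only. "Entity", "Action", and "sysop_allows" are invented
# for this sketch; they are not from any real Sysop design.
from dataclasses import dataclass, field

@dataclass
class Entity:
    name: str
    # names of entities this one has explicitly allowed to affect it
    consents_to: set = field(default_factory=set)

@dataclass
class Action:
    actor: Entity
    target: Entity               # the entity the action would affect
    threatens_sysop: bool = False

def sysop_allows(action: Action) -> bool:
    """Permit an action iff (a) it does not threaten the Sysop and
    (b) it does not affect another entity without that entity's consent."""
    if action.threatens_sysop:
        return False             # no Sysop-threatening weapons, period
    if action.target is action.actor:
        return True              # anything you like, inside your own process
    return action.actor.name in action.target.consents_to

# Firing a gun at someone who didn't volunteer simply fails:
alice, bob = Entity("Alice"), Entity("Bob")
assert not sysop_allows(Action(actor=alice, target=bob))
bob.consents_to.add("Alice")
assert sysop_allows(Action(actor=alice, target=bob))

The point the quotes keep making is that everything else is supposed to
follow from a single check like that, not from a list of specific
prohibitions.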
-- "I'm not interested in implementing Eliezer's Rules or Brian's Rules or Jeff Bone's Rules; I'm interested in implementing The Rules. I don't think there's a whole lot of slack in the Sysop Instructions; not with respect to the basics, anyway. Does the Solar System get divided up evenly among the six billion inhabitants? Yes. Can you do whatever the heck you want with your piece, as long as it (a) doesn't harm anyone else without that being's permission and (b) doesn't let you construct a Sysop-threatening weapon? Yes. Can you harm, or even affect, another entity without vis permission? No. Do the Amish get to keep farming? Yes." -- "> And why can't I construct a Sysop-threatening weapon? That conflicts with my > libertarian values. "Right to Keep and Bear Arms." Fine, construct a private reality and you can blow stuff up all you like. I have a right to keep and bear arms in a modern society, because in an unguarded world I can get shot by someone I couldn't shoot first. In the Sysop world, nobody can shoot me without my permission... and the reason is that the Sysop won't let anyone construct a Sysop-threatening weapon. Anything you want to do - *anything* at all - that doesn't harm another human, you can do through the Sysop API. The Sysop is not corruptible and has no temptation to meddle; the API to external reality would be *invisible* unless you specifically concentrated on seeing it. Until you tried to fire your gun at some poor guy who didn't volunteer for it, when the gun would suddenly stop working. Isn't that how you *want* reality to work? I don't even need to put any of that explicitly in the Sysop Instructions. It follows logically from the goal of maximum individual freedom plus not letting anyone shoot me without my permission. > Perhaps more pertinently, given > that such a weapon would likely be bits, doesn't the latter infringe free speech? No. How could you possibly harm a Sysop with bits? Hurt its feelings? > Hey, > I never agreed to that before you elected yourself God. Yawn. > > Yes. Can you harm, or even affect, another > > entity without vis permission? No. > > That sounds reasonable. > > > Do the Amish get to keep farming? > > Yes. > > > > I'd have to rate all of that a big honkin' Duh. So what exactly is left > > to debate? > > Well, see above, clearly there are big gaping holes of consensus. As long as you keep thinking it in terms of a set of Gestapolike instructions, you will keep finding points of dispute. Think of it in terms of good and evil and dictatorial powers, all of which you claim to despise, and of course you'll despise the result. Think of it in terms of finding a set of underlying rules that guarantee *everyone's* autonomy, and there is a single, forced solution. Maybe *you* find it natural to assume that you would abuse your position as programmer to give yourself Godlike powers, and that you would abuse your Godlike powers to dictate everyone's private lives. *I* see no reason to invade the sanctity of your process, and have absolutely no interest in enforcing any sort of sexual or political or religious morality. I have no interest in sexual, political, or religious morality, period. And if I did try to invade your process, the Sysop wouldn't let me. And if I tried to build a Sysop that could be dictated to by individuals, I would be building a gun to point at my own head. All that matters is the underlying process permissions that ensure individual freedom. 
I'm in this to significantly reduce the amount of evil in the world; I think that will follow naturally from giving everyone absolute individual freedom. I am not *allowed* to try and dictate reality directly. (Who doesn't allow it? Me.) > But, thanks, you've > absolutely proven that you don't have enough objectivity, or even ability or willingness > to consider other points of view, for the God gig. Next! ;-) Yawn. > > Hand it over to the UN, though, and I *guarantee* they'll screw it up. > > Absolutely, unequivocally, no doubt. Fine. The UN isn't allowed to do it. The trained professionals aren't allowed to do it. Who's gonna do it? You?" -- "> > The scary thing about that is, who gets to define what constitutes "evil?" Someone has to do it. > One man's > definition of "universal good" is another man's "tyranny." Pick your hot > button, say, elective euthanasia. What if the "Creators of the Universe" > a.k.a. Brian and Eliezer (a) don't come down on your side of the issue, and > (b) don't believe in unlimited free will? Let's say, for whatever reason, you > just want out. You're in a lot of pain. Now you're trapped in a world where > life is eternal and you just aren't allowed to opt out. "To eliminate all INVOLUNTARY pain, death, coercion, and stupidity from the Universe." Any problems?" -- Brian Atkins Director, Singularity Institute for Artificial Intelligence http://www.intelligence.org/