From: Lee Corbin (lcorbin@tsoft.com)
Date: Mon Apr 28 2003 - 19:06:25 MDT
Paul writes
> --- Psy-Kosh <psykosh@earthlink.net> wrote:
>
> > According to Corbin's position, (as I
> > understand it) if you were an AI created
> > by him, then he has the right
Arghh! The word!
> > to do whatever he wants to you. (i.e., say...
> > construct a hell world or something and
> > force you to exist in it while he laughs
> > maniacally...
>
> Right? By vis own right I suppose, but since ve is
> obeying no rules but vis own,
Show me someone obeying rules *not his own*, i.e., that
he does not sanction in some sense, and I'll show you a
slave. (Legally enacted laws are another matter, of
course---so my remark applies to this context, which is
moral rules.)
I want to make it abundantly clear that I steadfastly
disapprove of anyone being tortured. As I've mentioned
to Eliezer many times before, I just don't want the
storm troopers breaking my door down to tell me what
I can and cannot do with my own animals or my own
simulations. Private property has got us this far,
and this is no time to abandon the principle.
> then I also see no reason
> for other SI's to invade vir and force vir to do
> something else, or most preferably for the simulated
> AI to outwit vir and put vir in vis own simulation.
> Heck, while we are at it, why not throw out any absolute
> ethical system entirely! Ethical Relativism rules
> baby! (tongue partly in cheek). :-)
I contend that if one speaks carefully and restricts
oneself to facts, then one may announce what he does
or does not approve of, or what he thinks will or won't
enable a society to prosper. Going beyond that, by
invoking "rights" in the abstract, or by claiming some
ethical system to be true, violates the is/ought
boundary.
Lee
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:42 MDT