Re: An essay I just wrote on the Singularity.

From: Robin Lee Powell
Date: Sun Jan 04 2004 - 14:00:26 MST

On Mon, Jan 05, 2004 at 03:18:33AM +0900, Metaqualia wrote:
> > Sadly, no objective morality exists, so it isn't possible to
> > program the "Friendly" AI to be perfect in its "Friendliness".
> I found an objective morality. I have proposed it many times.
> If it didn't feel like anything to be inside a gas chamber, it
> wouldn't be immoral to gas people. If greeting somebody caused
> excruciating psychological pain, then saying hello would be
> immoral.
> so:
> What creates negative qualia in sentient beings is evil.
> What creates positive qualia in sentient beings is good.
Not killing Jews produces strong psychological pain in me. That
means that killing Jews is good, right?


Me: *** I'm a *male* Robin.
"Constant neocortex override is the only thing that stops us all
from running out and eating all the cookies." -- Eliezer Yudkowsky
*** .i cimo'o prali .ui

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:43 MDT