RE: Humane-ness

From: Ben Goertzel (ben@goertzel.org)
Date: Tue Feb 17 2004 - 11:59:24 MST


Robin,

I didn't really say anything about *mathematics*, I was using the term
"well-defined" in a more general sense.

Some concepts are intrinsically poorly defined, regardless of whether or not
they're formalized...

I'm afraid that the concept of "humane" -- as Eliezer defines it in that
quote, roughly "the essential consensus core of how humans would wish they
would act if they thought and studied about it a lot" [my paraphrase, my
words, not his] -- is badly defined, in the sense that

1) there may be almost nothing in this consensus core, because of the
diversity of human values; and
2) the things in it may be really dumb things, like belief in God, belief
in the inferiority of children, etc.

This kind of ill-definedness has nothing to do with the degree of
formalization.

Ben G

> -----Original Message-----
> From: owner-sl4@sl4.org [mailto:owner-sl4@sl4.org]On Behalf Of Robin Lee
> Powell
> Sent: Tuesday, February 17, 2004 1:40 PM
> To: sl4@sl4.org
> Subject: Re: Humane-ness
>
>
> On Tue, Feb 17, 2004 at 12:52:31PM -0500, Ben Goertzel wrote:
> > I am not sure that humane-ness, in the sense that you propose, is
> > a well-defined concept.
>
> (NB: Ben, this is not an attack on you; I happen to be picking on
> you, but that's just random chance.)
>
> I'm fairly consistently annoyed that people worry about the
> mathematical definitions of moral concepts with respect to
> super-intelligent AIs. That just seems bizarre. Why would an AI of
> even human-equal intelligence need every moral issue to be
> mathematically tenable? Most humans think such arguments are crap;
> why wouldn't an AI?
>
> People who are comfortable with the Sysop Scenario are scared that
> AIs will be too stupid to understand fuzzy arguments. These are
> untenable positions to hold simultaneously, as far as I can tell.
>
> -Robin
>
> --
> Me: http://www.digitalkingdom.org/~rlpowell/ *** I'm a *male* Robin.
> "Constant neocortex override is the only thing that stops us all
> from running out and eating all the cookies." -- Eliezer Yudkowsky
> http://www.lojban.org/ *** .i cimo'o prali .ui
>



This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:45 MDT