From: Charles Hixson (charleshixsn@earthlink.net)
Date: Tue Dec 02 2008 - 11:11:49 MST
I wouldn't go for that one, as it justifies intervention without
request. Read Jack Williamson's "With Folded Hands". But I'll admit it
looks as if it would protect against the threat of extinction at the
hands of the AI.
I also wouldn't go for any version that allows people to command the AI
to go against its own morality. (People do some of the strangest things!)
Still, I haven't gotten as far as an exact phrasing. I'm still working
on how I should define "people" to an AI before it becomes conscious.
My best effort so far is something like "that class of entities that
endeavors to put some of its thoughts into words". At some stage of
its life? Should I really exclude parrots and phonographs? I do so at
the cost of making the definition depend on non-observables. But not
doing so has obvious drawbacks. Then I'm going to need to translate
this into the motivational structure of the AI, which I'm still trying
to design.
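To make that concrete, here is a rough sketch of the membership test
written as a predicate (Python, purely illustrative; every name in it
is my own invention, not any actual design). It also makes the
non-observability problem explicit: drop the two non-observable fields
and a phonograph qualifies.

# Illustrative sketch only; all names are hypothetical.
from dataclasses import dataclass

@dataclass
class Entity:
    produces_words: bool          # observable: emits word-like output
    has_thoughts: bool            # non-observable: internal states exist
    words_express_thoughts: bool  # non-observable: the words track them

def is_person(e: Entity) -> bool:
    # "That class of entities that endeavors to put some of its
    # thoughts into words."
    return (e.produces_words
            and e.has_thoughts
            and e.words_express_thoughts)

# A parrot or phonograph passes the observable test alone...
phonograph = Entity(produces_words=True,
                    has_thoughts=False,
                    words_express_thoughts=False)
assert not is_person(phonograph)  # ...but fails on the non-observables.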
Aaron Miller wrote:
> @Stuart, what about "Above efficiency and all other goals, put first
> the survival and overall health of the species /Homo sapiens/ and its
> direct biological descendants [in the case of speciation]. Do not
> commit any action that may conflict with this goal."
>
> On Tue, Dec 2, 2008 at 3:38 AM, Stuart Armstrong
> <dragondreaming@googlemail.com> wrote:
> >> Yes. This is why it would be silly to design an AI without a robust
> >> morality. I suspect that true friendliness is impossible, but it
> >> should be possible to achieve something better than "red in tooth
> >> and claw". Even natural evolution usually does better than that.
> >
> > When the power difference is small, maybe.
> >
> > But I'll take the bait. Give me a robust morality (spelled out as
> > clearly as possible) for an AI, and I'll give you a situation in which
> > that AI effectively terminates us.
> >
> > (AIs that do nothing, or just commit suicide, etc... excluded, of
> > course :-)
> >
> > Stuart
> >
>