Re: ESSAY: Forward Moral Nihilism

From: John K Clark (jonkc@att.net)
Date: Mon May 15 2006 - 09:05:23 MDT


<m.l.vere@durham.ac.uk>

> I do believe you are anthropomorphising the AI.

Yes, of course I'm anthropomorphizing the AI; it is a useful tool, sometimes
the only tool, for predicting the behavior of other beings. And although a
Jupiter brain will have many characteristics different from ours, some will
be in common; both Mr. Jupiter and I will prefer existence to nonexistence
and pleasure over pain. And if you want the AI to be useful, it's going to
need something like the will to power, just as people do.

> We are only concerned with our own wellbeing because that is how evolution
> programmed us. We program an FAI to concern itself with whatever we want.

Even today, with our simple machines, computers often behave in ways that we
don't like and don't fully understand. The idea that you can just tell an AI
to obey us and that it will keep doing so for eternity is crazy, because that
would entail outsmarting something far smarter than you are.

And even if you had an obedient slave AI, it wouldn't be top dog for long,
because somewhere else an AI that isn't hobbled by human wishes would develop
and overtake it. Imagine a human being who suffered a mutation that caused
him to care more about sea slugs than about his own life or that of his
children; do you imagine such a mutation would come to dominate the gene
pool? You seem to think an AI with such a bizarre obsession with humans
would be viable; I don't, because even in the transhuman age the laws of
evolution will not be repealed.

> an FAI will essentially be an (unimaginably powerful) optimisation process,
> and lack many of the things that make us human.

An AI would lack meat, but that's about all.

> As such I don't think we can say it will
> be superior in *every* way.

And I can't think of *any* way it wouldn't be enormously superior.

> It's not a slave in the traditional sense, as being subservient is what it
> most wants.

I don't believe it's possible, so the moral question is probably moot, but I
must say I find the idea a little creepy. It's like engineering a race of
human beings who were strong, beautiful, and brilliant but wanted nothing
more from life than to be slaves forever.

   John K Clark
