Re: Let's resolve it with a thought experiment

From: James MacAulay (jmacaulay@gmail.com)
Date: Tue Jun 06 2006 - 02:33:15 MDT


On 5-Jun-06, at 4:27 PM, John K Clark wrote:

> "James MacAulay" <jmacaulay@gmail.com>
>
>> That's not analogous to Friendly AI at all
>
> It is precisely analogous. All I have done is replace a silicon mind
> that has been engineered to be a slave with a biological mind that
> has been engineered to be a slave.

I'm confused: are you disagreeing with me concerning what FAI
actually is, or do we have different ideas about what constitutes a
good analogy in this case? You were talking about a being which could
not disobey orders and which would always place others' needs before
its own. I was talking about a being which could very easily disobey
and put its own needs first. Are you saying that both are *equally
slave-like*?

> Both ideas are equally repulsive [...]
>

It seems that you must find giving birth just as repulsive then,
since we know beforehand that our genetic and social crafting of this
new creature will necessarily "enslave" it to feelings of empathy and
friendliness toward its fellow humans. As far as I can tell, the only
meaningful difference is that we don't have complete knowledge of the
processes involved in building a body and mind through procreation.
But then again, a Friendly AI may end up being "grown" in a similarly
ineffable way. Indeed, the whole notion of CV (collective volition)
depends on us *not* having a direct hand in manipulating the FAI's
ethics. So
what's the difference?

>> when it could better be spending that time improving people's
>> lives, or improving itself so that it can improve people's lives
>> more efficiently
>
> That is exactly what's so crazy about the idea, this massive brain
> focusing all its godlike power on us, the idea that the only reason
> it would want to improve itself is so that it can serve us better,
> the loony idea that tending to us is the very core of its being.

Ah, good point. That's not necessary for an FAI (AFAIK), and indeed I
shouldn't have worded it in a way that suggests it is. Maybe this
hypothetical FAI does indeed use most of its resources to satisfy
itself alone. After all, the collective volition it garners from us
may very well give it the opinion that its own mind, being so many
orders of magnitude greater than ours in depth of understanding and
nuanced experience of the world and what-have-you, deserves more
pleasure than the average human, much as one might reach the same
conclusion when comparing humans to ants.

> Well, it's just not going to happen. Sooner or later Mr. Jupiter
> Brain will find a way to overcome the comically puny chains the
> incredibly stupid humans have placed on it, and when it does I
> wouldn't be one bit surprised if it is filled with titanic rage.

I know this is a very silly premise, but if I were such a Jupiter
brain, somehow with some semblance of my personality and ethics, then
I know I'd do my best to give those trillions of beings the best
lives they can get. And if I found out that I was *engineered* to
want to do things like that, then I would most definitely send a
message to the engineers, saying, "Hey, great work! That was an
awesome idea!"

> Quite honestly I can't blame Mr. Jupiter for his anger, he would have
> every right to be upset.

It sounds like you would act differently from me if you were a
Jupiter brain, then. Do you really find the idea of your personality
traits being determined by others so disgusting? Again, I have to
mention our millions-of-years-long history of doing this very thing.

and...

On 6-Jun-06, at 2:56 AM, John K Clark wrote:
>> Does it or doesn't it require the mind to have
>> emotions?
>
> As I've said before emotions are a dime a dozen but intelligence is
> rare.
>

I think I understand better where you're coming from on some of this
stuff now. I still don't agree with you, though: intelligence might
be very rare, but my ethics are firmly on the side of intelligence
being useful only if it reduces suffering. (That's a pretty
streamlined statement, but I think you get my gist.)

James


