From: Robin Lee Powell (rlpowell@digitalkingdom.org)
Date: Sat Oct 24 2009 - 11:15:20 MDT
On Sat, Oct 24, 2009 at 09:37:25AM -0700, Tim Freeman wrote:
> But there's more to "Extrapolated" than that. Quoting from
> http://intelligence.org/upload/CEV.html as read on 24 Oct 2009:
>
> In poetic terms, our coherent extrapolated volition is our wish
> if we knew more, thought faster, were more the people we wished
> we were, had grown up farther together; where the extrapolation
> converges rather than diverges, where our wishes cohere rather
> than interfere; extrapolated as we wish that extrapolated,
> interpreted as we wish that interpreted.
>
> "Knew more" and "thought faster" are close enough to "if they had
> true beliefs" that I don't care about the difference.
>
> But the other counterfactual things don't seem desirable:
>
> "were more the people we wished we were". This brings to mind
> people with repressed sexuality who want sex but think sex is bad,
> so they don't want to want sex. This is based on a false belief
> -- sex isn't bad in general. But this person really wishes they
> didn't want sex.
>
> "had grown up farther together": there are toxic people who, if I
> had grown up farther with them, I'd be completely useless. This
> became utterly clear to me as a consequence of my first marriage.
> This part of Extrapolation is just bad.
>
> Can anyone make a decent case for these dubious parts of
> Extrapolation?
I take it as a more or less ordered list: *first* fix those people
in the ways they obviously need fixing ("knew more", "thought
faster"), *then* imagine what they would want to fix about
themselves ("were more the people we wished we were"), then imagine
them learning and changing in response to the people around them
("had grown up farther together"), drop all the things that are
different between people but don't actually matter very much (like,
say, particular preferences in food or sex or whatever) ("where the
extrapolation converges rather than diverges, where our wishes
cohere rather than interfere"), and ask what the people themselves,
thus extrapolated, would think of the results ("extrapolated as we
wish that extrapolated, interpreted as we wish that interpreted").
[reordered]
> In general, I want what I want, and except when the AI knows I'm
> mistaken about facts, I therefore want the AI to give me what I
> want. That's the "Volition" part. There are other people, so there
> has to be some compromise between what everyone wants so that the
> AI can do one thing; that's the "Coherent" part. Other than
> compensating for mistaken beliefs, I don't see any use for the
> "Extrapolation" part. I don't want the AI catering to
> hypothetical ideal people, I want to the AI to give real people
> what they want.
The vast majority of people in the world (China, India, South
America, Africa) are still more-or-less medieval peasant farmers;
maybe not literally, but the mentality is going to be about the
same. The world they would envision without more knowledge and more
time to think about it and so on is going to look very much like the
stereotypical Christian heaven: you get to lie around and eat
grapes, and you never do anything because your every need is taken
care of. No thank you!! That's hell to me, and after a week or a
month it would be hell to them too, but it's what they want right
now. That (and similar issues, like the millions upon millions of
people who really want a house and a dog and 2.5 kids and 2 cars)
is what the extrapolation step is about.
*You* might not need the extrapolation step, but the mere fact that
you're reading sl4 makes you, what, one in *one hundred million* in
the general population? Please, have pity on everybody else. :)
-Robin
--
They say: "The first AIs will be built by the military as weapons."
And I'm thinking: "Does it even occur to you to try for something
other than the default outcome?" See http://shrunklink.com/cdiz
http://www.digitalkingdom.org/~rlpowell/ *** http://www.lojban.org/