From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Sat Jul 23 2005 - 13:08:56 MDT
Russell Wallace wrote:
> On 7/23/05, Robin Lee Powell <rlpowell@digitalkingdom.org> wrote:
>
>>"Power over others" will *NEVER* converge, because everyone has a
>>different someone(s) they want power over.
>
> You'd think so from calculating the abstract logic, wouldn't you? But
> that's empirically not how it works - "power over others" does in fact
> converge, when you look at what people vote for today. (I could come
> up with a story to explain why in terms of evolutionary psychology,
> but the empirical fact is that it converges.)
Human babies might have such simple minds that a purely individual
extrapolation of their volition would end with a superbaby pooping a billion
diapers per second. I, as I am right now, would not choose to take them from
their mothers and consign them to that destiny.
So you cannot presumptively ban "power over others" as a possible result of
CEV, not unless you want to, on your personal authority and responsibility,
wrench babies off what we conceive to be the track to humanity.
Russell proposed letting the AI programmers hardcode, for all future
civilization until the end of time, a definition of adulthood. I don't think
that's a very good solution.
On the other hand, it's also clear that modern-day humans enjoy meddling in
each other's lives a great deal, so that in a democracy it's easy to get a
vote on banning offensive speech. Libertarians argue that banning offensive
speech is wrong, and to this end propose many arguments: about the actual
consequences of banning offensive speech and the actual consequences of
offensive speech itself, about fair play in the realm of ideas, about
inalienable rights and the value of personal freedom, and about the
undesirable personal consequences of allowing such a framework to exist. I
think each of
these arguments is correct, in the sense that I would still so believe even if
I knew more and thought faster. If this is not true of other humans, being
the people that they are - if, being smarter, they would still prefer to
censor and be censored by the force majeure of superintelligence, and would
not allow any enterable realm of freedom to exist - then I'm not sure in what
sense freedom of speech could be said to be what humanity wants. At that
point I'd end up in a genuine moral dilemma - or rather the Last Judge would -
one that I still haven't thought of any morally acceptable way to resolve. I
do know that it would not be morally acceptable to me to impose, on my own
authority, freedoms on those who would *never* wish them.
In short, I think that the present fact that people vote for a lot of stupid
things is one of those things we would outgrow, if we could only grow up
without destroying ourselves.
--
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence