Re: Collective Volition, next take

From: Russell Wallace (russell.wallace@gmail.com)
Date: Sat Jul 23 2005 - 15:58:16 MDT


On 7/23/05, Eliezer S. Yudkowsky <sentience@pobox.com> wrote:
> I do not believe it to be true, as an empirical fact, that people with IQ 100
> and IQ 150 display the same distribution of personal moralities. For a start,
> they don't display the same distribution of theologies.

You're probably right there - people with IQ 100 tend, on average, to
have higher moral standards.

I'm going to sign off shortly - don't have the energy to debate the
topic any more tonight - but I'll wrap up with:

> How do you know this holds true for every one of the billion different
> extrapolation dynamics I might eventually decide to implement?

Obviously I don't have the list of a billion different dynamics you
might decide to implement, and it's not clear what the criteria for
the decision will be. What I can say is that _to the extent that the
system forces everyone to submit to the will of the Collective_ (as
was implied by the original paper, a position you seem to be partly
retreating from now - if you are retreating from that position, good),
you're proposing to create Hell. To the extent that it relies on a
"superhumane" entity to make the decisions, it's a mirage and the real
decision-making power will come from something else (most likely the
programmers, though maybe the phase of the moon for all I know; I'm
not privy to the details of SIAI's plans). To the extent that it does
something else, I'll wait to criticize until it's clearer what that
something else is going to be.

- Russell


