From: Keith Henson (firstname.lastname@example.org)
Date: Thu Nov 04 2004 - 07:16:17 MST
At 03:53 AM 04/11/04 -0500, you wrote:
>Keith Henson wrote:
>>Coming at this business from the EP side, I think Collective Volition
>>might not even be a good idea, though it might be that I don't understand
>>some special meaning beyond the obvious meaning of the words.
>You're missing the "more the people we wished we were" set of
>transformations within "knew more, thought faster, more the people we
>wished we were, had grown up farther together". If you give a lecture on
>the evolutionary psychology of privation, and the lecture is correct, and
>the listeners understand it fully, and they think that this is not who
>they want to be when they grow up, then it wouldn't appear in their
>collective *extrapolated* volition.
I really hope you are right about this, Eliezer. Unfortunately, machines
able to extrapolate people who "knew more, thought faster, more the people
we wished we were, had grown up farther together" might still make CV
decisions that would freak out you and me. How important to our
personalities is this ability for a
tribe's warriors to go berserk? How much does it underlie even our
unstressed, relatively rational behavior in good times?
It's not a question that's easy to answer, though you can get a hint of it
in functional MRI scans. My guess is that because war has been such a gene
filter at least since the big cats quit being serious predators 2.6 million
years ago, it is going to be extremely difficult to change and have
anything left we would consider human.
If you are going to try to keep humans out of war mode by keeping them in
"good times," then you just about have to mess with their interest in or
ability to reproduce, because while normal reproduction won't keep up with
nanotech, it will eventually produce a resource crisis. To me it is a dicey
matter. Collective volition might conclude that periodically killing each
other off with rocks to hold down the population is what people *want.*
>I'm not going to hook up a superintelligence to the decisions of
>Earthlings the way we are now. Geeze, do I look that suicidal?
No, but I am not certain that the decisions of improved Earthlings are
going to be an improvement. I find it scary that they might be "improved"
in ways that, viewed from a godlike remove, would *be* an improvement, but
you sure wouldn't want to live there.
As I say, I hope my misgivings about this are wrong and you are right.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:49 MDT