From: Eliezer S. Yudkowsky (firstname.lastname@example.org)
Date: Mon Nov 13 2000 - 21:49:31 MST
Mark Plus wrote:
> Zey advocates enhancing humans in a definitely Transhumanist way, but
> criticizes Moravec and Kurzweil for speculating that advanced AIs will
> subordinate humans. He points out that this scenario resembles the
> Neo-Luddite proposal to subordinate humans to other species or to "Gaia."
> Why do humans have to become subordinated to anything, he asks?
Well, technically speaking, it's because humans have social adaptations which
draw a direct line from "more power" to "dominance and coercion". Anyone who
doesn't have advanced degrees in evolutionary psychology AND futurism is going
to summon that mental and emotional imagery automatically, a hapless captive
of the built-in sequitur. Whether they believe the scenario or not, they're going to think
that the Extropians have something to prove; they will mentally frame the
issue in terms of whether the "inherent tendency towards dominance" is
balanced by "fuzzy feelings towards the less fortunate".
The idea that the connection between greater power/intelligence/ability and
social dominance exists only in our oddly perverted, evolved worldview - that
things after the Singularity can be totally different, and that this is in
fact the *default* scenario - well, like I said, you need to be deeply into
evolutionary psychology and futurism before that kind of thinking becomes
natural.
-- -- -- -- --
Eliezer S. Yudkowsky http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:35 MDT