From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Fri Mar 30 2001 - 13:31:54 MST
Arctic Fox wrote:
>
> A ship is so unlike a human we can safely use "she" without falsely
> implying a link between ships and people. What to call an AI has to be
considered much more carefully. Do we want an AI to be anthropomorphized?
This could give people a false impression of what an AI is.
The question I am now pondering is what the anthropomorphic implications
of using "she" would be, and whether there's anything too obnoxious that
will fall out of it. Likewise for giving AIs names like "Aileen" instead
of "Ailerin".
It would be depressing to see male critics starting to accuse the
femininely pronouned Sysop of well-meant maternal interference instead of
paternalistic dominance. Funny, but depressing. It would mean that the
aforesaid critics were not just wrong, but predictably and boringly
wrong. It would also mean that males (and females?) had been
unconsciously anthropomorphizing male stereotypes to minds in general,
even when "ve" or "it" were used, and thus that the whole
gender-neutrality effort had been doomed from the start.
I don't know what the effects would be on the female audience, if any.
> If we truly believe that Singularity will end all suffering, is it
> unethical to not select memes that will aid acceptance of Singularity? I
> suppose by this I mean the way of presenting facts, certainly not
> falsifying evidence or covering up facts.
What worries me is that if I thought humanity would still be
pre-Singularity in a hundred years, I would use gender-neutral pronouns
and damn the torpedoes, on the grounds that we'd have to get used to it
eventually. That won't happen. But if I decide not to use gender-neutral
pronouns, it will be a decision that directly depends on my belief that
I'm planning for this generation, and not someone's grandchildren. And
every time I make a decision like that, a little more "normal" burns away.
The fact that I spend my life dealing with long-term problems does give me
enough social slack to drop a fringe long-term problem in somebody else's
lap rather than taking care of it myself. But I'm still reluctant.
-- -- -- -- --
Eliezer S. Yudkowsky http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence