From: Eliezer S. Yudkowsky (firstname.lastname@example.org)
Date: Tue May 03 2005 - 10:29:38 MDT
Damien Broderick wrote:
> is an interesting review of my new sf novel GODPLAYERS. The reviewer is
> especially exercised by the fact that my posthuman characters are not
> immediately understandable -- indeed, beyond empathy -- by human standards:
> <the frustration level mounts as one waits in vain... for characters...
> to display any hint of a genuine inner life as they move randomly from
> scene to scene, world to world, reality to reality. Perhaps Vorpal
> homunculi do not possess inner lives, and Broderick's point is that
> these seeming superhumans, for all their power, are soulless automatons
> without a shred of humanity.... Surely there should be some character,
> somewhere in a novel, to which human readers can feel connected. ...As
> the sequence of events grows increasingly frenzied, with ever-greater
> reliance placed on what might be termed info-splatters, the lack of a
> deep humanistic substrate left this reader, at least, with no ground to
> stand on. >
> I'm torn in my response to this. On the one hand, it wouldn't make much
> sense to write about posthumans as if they were representations of the
> people down the road, or in the next room. On the other, I have tried to
> ground the fairly breakneck narrative within thematic structures and
> reverberations recognizable from myth, dream, and the traditions of
> science-fiction itself when it ventures upon the superhuman. Greg Egan
> met with this same objection, of course, and so, in various degrees, did
> John C. Wright and Charlie Stross. Maybe it's an artistic problem beyond
> solution -- for humans.
I agree with this objection to Greg Egan's, Charlie Stross's, even some of
Vernor Vinge's stuff, and yes, your own recent work. (John C. Wright did okay
in his first two books; I haven't finished the third.) Despite all Vingean
rules there is no good reason why transhumans, especially in a work of
fiction, should not have strong emotions the reader can empathize with. If
you need to explain why, tell your readers that Eliezer designed 'em or that
they're outgrowths of humans.
Rationality is not about emotionlessness, and neither is intelligence, whether
"intelligence" is interpreted as g-factor or as the combination of g-factor
and rationality that actually makes people powerful and effective.
Rationality is the art of attaining a map that reflects the territory, of
arriving at the correct answer on questions of simple fact.
One comes to me and claims: the Way opposes this emotion, this goal, this
morality. I should reply: Where is the false belief that I must hold, to
feel this emotion, pursue this goal, propound this morality? It is a subtle
question, and the claimant may reply in a fashion I do not expect. Being the
person that I am, it may be that I must possess a false belief in order to
feel that emotion, pursue that goal, propound that moral philosophy. If I
should feel fear and horror at the sight of an approaching wire I believe to
be electrified, and the wire is not electrified, then I shall relinquish my
fear and horror if I pursue the Way. If I should feel calm at the approach of
the wire, believing it not to be electrified, and the wire is electrified,
then the Way opposes my calm. If I believe that the members of the tribe
across the water are subhuman, and propound a moral philosophy which advocates
their casual killing, I may reconsider upon learning of our shared DNA and
shared neural architecture, learning they are not different from me as people.
And the experience may lead me to reconsider not only the conclusion but the
premise, and to wonder whether the hatred which leads to such mistakes is a
part of myself I wish to keep. I may come to a different conclusion in that
moral introspection, depending on whether I believe the hatred was placed in
me by Zeus, or know that the underlying emotion of hatred was inscribed in me
by natural selection. I wish to feel those emotions, pursue those goals,
propound those moralities, which, being myself, I would feel and pursue and
propound if I knew the correct answers.
That's rationality according to a rationalist. Spock is an idiot designed by
Hollywood scriptwriters who knew less than nothing of rationality. Spock
forms the template for most fictional transhumans. Small wonder that the
audience has nothing to empathize with, if the writer thinks that rationality
requires the rejection of empathizable emotion.
The Singularity is not an ironic commentary on the rate of change. Many
people in the futurist crowd - especially those who think that futurism is all
about being ever so avant-garde and hip and ironically detached - seem to have
picked up on the Singularity this way. I am speaking specifically here of
Cory Doctorow and Charlie Stross. So they write hip, avant-garde, and
ironically detached Singularity fiction, where the Singularity is interpreted
as an excuse to toss in various bits of technobabble, throw the main character
out of a job, and show a background social fabric in the process of
disintegration under too-rapid change. Ooh, avant-garde! Post-modern!
Artistic! Emotionally uninteresting! And they treat the Singularity the same
way in real life - a concept that makes them feel more detached, not more
involved. Compare to _Staring into the Singularity_. Artistic? No.
Avant-garde? No. Hip? No. Ironically detached? No. Well-written? No.
Passionate? Yes. _SitS_ is not intended as a work of fiction, but frankly,
it has a more interesting central plotline than most of what is billed as fiction.
When you wrote about _SitS_ in _The Spike_, you wrote, "Well, of course, one
smiles, recalling the exaggerated postures of adolescence." Being rather fond
of that youthful Eliezer, you attached no particular utility to hurting his
feelings; yet you found it necessary to insert *something* that would make
clear your emotional detachment. Why? Because you had to avoid, and
automatically avoided, a scenario in which your readers might think you cared
about something - which would be a terribly unhip flaw in a self-consciously
hip book about the future. In nonfiction you can plead the requirement of
believability, of making the reader go on treating your nonfiction as
nonfiction. But if you apply the same reasoning to your fiction, and your
readers then complain that your fiction lacks passion, well, jeebers man, what
else did you expect? How do you expect to make your readers feel passion if
you yourself regard passion as terribly unhip? Characters in fiction are
supposed to feel, feel intensely, to scream in the night when the plot calls
for it, and if you flinch back from that you have no story. The "thematic
structures and reverberations recognizable from myth, dream, and the
traditions of science-fiction itself when it ventures upon the superhuman"
cannot repair this flaw. No amount of thematic dressing and mythological
allusion can repair a plot that lacks drama or characters that lack emotion.
First you have to feel passion, and then you have to put it into your writing;
and if you have forgotten how to do this, or even just forgotten to do it in
this particular case, then all craftsmanship is for nought - just as passion
is futile without writing craftsmanship.
It does require a certain amount of creativity to write transhumans who end up
in enough trouble that they have something to feel about. You have to invent
new plots and break loose of the conventions of the transhuman genre, since
these conventions are wrong and destructive of story. The first and most
fundamental error is to think of your fictional transhumans as gods beyond all
troubles. The second fundamental error is to think that if you don't festoon
your characters with emotional incomprehensibility then the readers will not
know they are transhumans. These are the conventions of the Singularity genre
that were laid down by self-consciously hip authors; they are conventions
which are destructive of story. So do better. Place human characters in a
world where transhumans exist in some fashion that does *not* make the
transhumans cold and remote, nor hiply detached and postmodern, nor festooned
with decorative incomprehensibility. Or let your characters be transhumans
themselves, if you dare, and give them something to care passionately about.
John C. Wright managed this, and I therefore do not accept that it is impossible.
--
Eliezer S. Yudkowsky    http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:51 MDT