From: Eliezer S. Yudkowsky (email@example.com)
Date: Fri Jan 25 2002 - 21:37:39 MST
Ben Goertzel wrote:
> Love, my friends! LOVE! Lust! Rage!! Excitement! Wonder! passion!
> Yes, you can have the Singularity, and you can have THESE TOO! All for one
> low, low price...
Michael Anissimov replied:
> Amen, but in moderation. And don't forget about exploring alternate
> states of consciousness, that is valuable for the Singularity too.
> Lust I'm less keen about. I don't believe in rational love, love
> wastes too much time and is too much like a valueless drug. When
> being on the net incites more emotion than "real life", wouldn't you
> spend all your time on it, anyway?
Sigh. As always, Ben Goertzel thinks I'm a fuddy-duddy just because I
don't do sex, drugs, drinking, smoking, fighting, or gambling. But I did
once go out on the dance floor at a science-fiction convention, which
should permanently put an end to the rumor that Singularitarians have less
fun.
This may come as a surprise to you, Ben, but I've never been a fan of the
sour-grapes theory of asceticism. I think sex is a great thing. I just
don't want to find out from personal experience. Who here, if they'd
managed to go through their whole life up until this point without ever
tasting chocolate, would take a bite *now* and spend the rest of their
life trying to resist the temptation to eat chocolate? Well... probably
quite a few people, actually. The point I'm trying to make is that I have
to expend willpower to control my diet, but I don't have to expend
willpower to control my sex life because I decided early on not to have
one. This doesn't mean I think that sex is worthless or morally wrong. I
think sex is a great thing, for other people. If I ever had the
opportunity to help out with a friend's love life, I'd do it. Helping a
friend isn't likely to trigger a mental chain of events that ends by
sucking up all my Saturday nights until the end of time.
Am I at risk for alcohol addiction? Would seeing the world through a rosy
glow, for just a few hours, make the rest of my existence harder to bear?
I'll never know, because there's a very easy way to avoid finding out.
Love, of course, is a more interesting issue, as I happen to be a hopeless
romantic. As such, I believe - call me loony - that love is a cathedral
you build together, a rose that you grow and water together, *for its own
sake*. It is something that you expend time and effort *on*, because you
care about it, not an investment that pays back your expended time because
only one of you has to go on the weekly shopping trips. You put time,
mental energy, and caring into love, and what you get in return is love.
Sort of like children, albeit to a lesser extent. If you try to impose
the requirement that love serve as the most effective means to some other
end (i.e., the Singularity), the necessary conditions soon become so
exotic as to be entirely impossible.
Let me put it this way: I suppose that a sufficiently intelligent,
sufficiently altruistic female reader of SL4 might email me in response to
this message and say: "Hi, I'm an independently wealthy 25-year-old woman
who's totally devoted to the Singularity. I've decided that the most
efficient way for me to advance the Singularity is to act as your support
mechanism. I'm dyslexic, so you can't recruit me for the programming
project, or turn me into a Singularity writer [either of which would make
her time too valuable to be spent primarily on supporting me]. I
therefore order you to marry me." And I'd do it. Similarly, someone
might say: "Hi, I'm a billionaire. I want to fund your project but your
asceticism makes me nervous. I will therefore donate a million dollars to
SIAI, on the condition that you give me your word of honor that you will
start dating." And I'd do it. I'm a pretty easy guy to manipulate, as
long as it doesn't violate my ethics. But both cases seem rather
unlikely.
These impossible conditions aren't an artificial attempt on my part to
avoid intimacy or whatever. It's a naturally impossible set of
conditions, stemming from the fact that love is something you put time
*into*, not something that you get time *out* of.
-- -- -- -- --
Eliezer S. Yudkowsky http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:37 MDT