From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Wed Jan 23 2002 - 20:45:30 MST

Randall Randall wrote:
>
> On Wednesday 23 January 2002 22:14, Eliezer Yudkowsky wrote:
>
> > Sorry, not mine. I make this statement fully understanding the size of
> > the claim. But if you believe you can provide a counterexample - any
> > case in, say, the last year, where I acted from a non-altruistic
> > motivation - then please demonstrate it.
>
> Later, on the extropians list, Eliezer Yudkowsky wrote:
>
> > Personally, I just happen to own a copy of "Queen of Angels" with Greg
> > Bear's autograph on the page with the AI's awakening. Gee, I wonder how
> > that happened... say... wasn't Greg Bear at Extro 5?
>
> Could there have been some altruistic motivation for taking up your time and
> Greg's time autographing this? :)

Sure. Even a complete altruist still has some subgoals that intersect the
altruist's own interests. Mentally I label these CIPBs, short for
"context-insensitive personal benefits". As any altruist knows, you've got
to be careful around CIPBs. There are some basic emotions which create
CIPB goals as direct drives, and some political emotions which promote
CIPB goals created for other purposes. There are subtler problems, but
that's 90% of them right there. The goal is not actually to eliminate
all CIPBs, but to bring their intensities and content into correspondence
with the local subgoals that would naturally be produced by a cleanly
altruistic system - say, a Friendly AI - and to keep them from going any
further.

The simple but wrong description would be to say that I got Greg Bear to
autograph the book because it would give me a warm fuzzy feeling [note:
CIPB], and that warm fuzzy feeling would increase my level of mental
energy, which would in turn increase my Singularity productivity. This is
a valid chain of reasoning as long as it's not being given undue strength
by innate CIPB drives or by the political CIPB-favoring emotions. The
complex and correct version is that, rather than re-evaluating this class
of decision on each occasion, I decided to "validate" this particular CIPB
as an okay emotion most of the time, and rely on my category-recognition
system to catch any specific cases where it wasn't.
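
(If it helps to see the pattern spelled out, here's a toy sketch in Python
of the "validate a class of CIPB once, let the category-recognizer catch
the exceptions" idea. The names and thresholds are purely illustrative -
this is not a description of any actual system, mine or anyone else's.)

# Toy sketch only; all names and thresholds are made up for illustration.
VALIDATED_CIPB_CLASSES = {"low-cost fun"}  # classes cleared in advance

def exception_flagged(case):
    """Stand-in for the category-recognition check on one specific case."""
    return (case.get("hours_consumed", 0) > 2
            or case.get("celebrity_motive", False))

def full_reevaluation(case):
    """Stand-in for the expensive from-scratch evaluation of a CIPB."""
    return case.get("expected_net_singularity_benefit", 0) > 0

def allow_cipb(cipb_class, case):
    # Pre-validated classes skip the from-scratch evaluation, unless the
    # recognizer flags this particular instance as an exception.
    if cipb_class in VALIDATED_CIPB_CLASSES and not exception_flagged(case):
        return True
    return full_reevaluation(case)

# e.g. the autograph: low time cost, no celebrity angle -> goes through
print(allow_cipb("low-cost fun", {"hours_consumed": 0.1}))
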
The point is that I got the autograph to have fun, because I've learned
that I need a certain level of fun to stay productive. So I've validated
many, but not all, of the fun-producing emotions. But there are other
CIPBs that I have not validated because I don't expect a net benefit to
the Singularity. Getting Greg Bear to sign the book was a very low-risk
form of fun that did not consume a significant amount of time and may even
have minor direct Singularity benefits at some point or another.

Of course, when it comes to fun, I rather doubt that I'm at even a local
optimum - i.e., I am undoubtedly having either too much fun or not enough
fun - but I do seem to be at the best optimum I can reach by deliberate
tweaks of the system. The deeper point is that even if there are still
drives pushing on the CIPBs in one way or another, my long-term goals and
strategy are very firmly aligned on the Singularity. I can't totally
revise the fun-management system because the basic goal, mental energy, is
too tangled with the goals that manage the activities that renew mental
energy - there is no way to maintain mental energy through pure reasoning;
all you can do is optimize mental energy against effort and risk by
managing the existing goal system. When it comes to strategy, I have a lot
more leeway to clean things up. So the fun-management system for mental
energy isn't as clean as I'd like, but even so it amounts to a few drives
messing around with my local CIPB intensity levels for things like whether
or not I should listen to an MP3 or keep on working. If you're talking
about something like choosing one Singularity strategy over another
because it affects my chances of becoming a celebrity, then no frickin'
way.

And yes, Ben, the inside of my head really does look like this; yes, it
took a few years and a lot of practice to get there; and no, I haven't
been struck dead by lightning for daring to defy the tin gods of the
genome. You should try it sometime. It's fun.

[CCII]
-- -- -- -- --
Eliezer S. Yudkowsky http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence