Re: SIAI: Donate Today and Tomorrow

From: Mikko Rauhala (mjrauhal@cc.helsinki.fi)
Date: Fri Oct 22 2004 - 18:24:30 MDT


On Fri, 2004-10-22 at 13:28 -0400, Eliezer Yudkowsky wrote:
> I ask of SIAI's donors: Speak up, and hold your heads high without shame!

Well put. I thought I'd drop in my 2 cents, now that we have those here
in Finland and all. I don't know if all I'm about to say is good PR, but
it's true, and it might provoke some helpful thinking in some people.
Hopefully.

I have very little shame. I have rather little pride too, though I do
have a distinct knowledge of being "better" than most people in many
ways - intellectually, rationally and from an altruistic ethical
standpoint, for instance. But all this is unimportant.

I have been a regular SIAI donor since they first started soliciting
donations. Some time later I outed myself by adding SIAI to my
signature. I didn't really care about the (doubtful) prestige, but I
thought providing a link there might be useful. In the same spirit,
while I'd personally usually rather not draw attention to myself, I
later allowed the publication of my name, with amounts contributed, in
SIAI's newsletters, to perhaps provide inspiration to others.

I used to have some aspirations to doing actual work towards the
Singularity, but I've pretty much dropped those, mostly due to clinical
depression accompanied by certain issues with attention span and
motivation. Since then I've also come to the conclusion that I wouldn't
want people like me on an AI team anyway. To be more precise, we
misanthropes would introduce some unwanted risks into such a project.

Yes; though I may be fond of certain specific portions of humankind, as
a whole I rather dislike it. "Hate" might be too strong a word for my
general emotional intensity. "Loathe" or "detest" might be more accurate.
I certainly don't feel (note the choice of word) that humanity
"deserves" to live, let alone reach Singularity, and I very much suspect
I'd derive no small amount of pleasure from witnessing an observable end
of the world closing in. (No, my own survival isn't much of an issue.)
But all this is unimportant.

I am, by choice, an altruist - a choice originally made when my
misanthropic tendencies were more in the background, but a choice I'm
sticking with nevertheless. I also aspire to rationality, as far as is
possible for me. As an aspiring rationalist, I see the usual Existential
Risks looming near. I also see no other plausible defense scenario than
a successful, friendly Singularity - never mind the intrinsic
desirability of such a scenario even ignoring Existential Risks
altogether.

I find that SIAI's Friendliness approach is by far the best that I have
any means of knowing about. I don't think SIAI has a good chance of success -
rather the opposite - but that is irrelevant when facing an Existential
Risk. I therefore ignore any unhelpful feelings of generic animosity and
helplessness and contribute. I do this currently at a rate of $400 a
month, probably due for yet another increase at year's end. This
represents a significant portion of my net income as a sysadmin at a
government university.

So, what I guess I'm getting at is that, as far as saving the world
goes, if so few seem to be able to even try to do as much as a
Singularity-pessimistic and clinically depressed misanthrope living on a
midrange government salary, there's something wrong with this picture.

But hey, though the way there may be frustrating, it's all win/win for
me in the end. Cheers.

-- 
Mikko Rauhala   - mjr@iki.fi     - <URL:http://www.iki.fi/mjr/>
Transhumanist   - WTA member     - <URL:http://www.transhumanism.org/>
Singularitarian - SIAI supporter - <URL:http://www.intelligence.org/>



