JOIN: Aleksei Riikonen

From: Aleksei Riikonen (aariikon@cc.jyu.fi)
Date: Mon Dec 16 2002 - 14:23:09 MST


In accordance with (optional) list customs, I'm making my join post here.

I'm a lot like many others on this list. Intellectually inclined, and a
couple of standard deviations above the average IQ, sure, but probably
not very close to being among the brightest minds here. I've always liked
futurism and been interested in all sorts of sciences. Math, evolutionary
psychology, and analytic (and other forms of rational) philosophy top my
list, though. A preference to which many of you can relate, I'm sure.

Technically inclined I have not been, though. Think of some stereotypical
theoretical physicist, and there you have it. I actually switched from
majoring in theoretical physics to math, because the experimental work that
even the wanna-be theoretical physicists have to do during their first years
(at least at the University of Jyvaskyla, though I guess this is pretty
universal) somewhat turned me off.

A year or two ago I read Eliezer's "An Introduction to the Singularity",
after being directed to it by a friend (Mikko Rauhala, among the readership
here). That's when the future shock hit me. I was ecstatic.

Before being introduced to singularity concepts, I actually held the Luddite
view that _extremely_ forceful regulation of technological development would
be a good idea, as it seemed to me that humanity, as it is, is just screwing
things up with all the power we've bestowed on ourselves, and is bound to
screw up even worse if "progress" isn't very strictly regulated. I had
briefly toyed with the idea of creating altruistic AIs to solve this
problem, but with my considerably limited and regrettably uninformed
intelligence had concluded that such things would not hold great enough
promise for a long time to come.

Eliezer's texts, and the others to which my attention soon spread,
fortunately dissolved my Luddite views. A stereotypical singularitarian,
that's what I am now. :)

But to be truthful, I probably differ from many of you in my motivations. I
am not here primarily because I find all this stuff very cool (which I do),
but because of my strong ethical principles. I think many would feel that I
am a somewhat moralistic person, and despite my liking for technology, I
would not have a problem turning against it Unabomber-style if I honestly
thought that that would be the ethically correct decision (which it would be
in some theoretically possible worlds - fortunately we don't seem to be
situated in any of them).

Presently, it is my view that the ethically best course of action for me
would be to switch my major to computer science (or at least to get out of
math), and for the most part to proceed to amass wealth to be donated to
SIAI. I think I am better suited to the task of acquiring funds for SIAI
than to the task of becoming an FAI programmer or some such, as I find it
probable that several underemployed entities here and elsewhere would make
more talented FAI programmers than I would.

This is becoming a rather long mail, but I'm trying to make it interesting
enough to meet list standards. At least I myself would be interested in
this. :) And here I might also note that English isn't my native language,
which I can tell on reading this. Probably many of you can tell as well.

A few more words about my ethical side, as this is the
not-boringly-stereotypical part. I am presently fairly actively
involved in the movement for the rights of non-human sentients on our
planet. Many of you know this movement as the animal rights movement, and
were it not for the singularity, I would actually consider animal rights
activism an ethically very productive pastime even for a rational person
(my reasoning for this is not as simple as one might think, but it is
nonetheless irrelevant here, as we live in a world where it is a high
priority to pay maximal or near-maximal attention to singularity matters).

In the future, I'm hoping to shift a larger portion of my resources from AR
activism to singularity activism. One reason why I haven't yet done so (or
at least not very efficiently) is that I prefer to spend my time with people
who are ethically strong and quite non-self-centered, like myself (in some
ways, at least). And while the present "singularitarian elite" consists of
such entities, egotistical persons abound in technophile communities, as
they do almost everywhere else. And if I were to strive towards effectively
spreading singularitarianism, for example, this ethically somewhat weak
majority within the SL2 and SL1 audiences would make up most of the people
I would have to deal with.

With the above I do not mean that I hate self-centered people, or that I
would act aggressively or in some other irrational manner towards them. I
do, however, have a strong (and often masked) feeling of superiority towards
them, and prefer the company of people whom I consider to be emotionally and
ethically well-developed. I admit that this behaviour, as a form of elitism,
is irrational. Note also that I do not claim morality to be objective, or
my feeling of superiority to have non-subjective grounds.

I was born in the year 1980, in Finland, a Nordic country in Northern
Europe. I continue to reside here.

I haven't yet read all of the SL4 archives. As long as this statement
remains true, my enthusiasm to participate in the discussion here will
remain somewhat limited, as it should. And as long as I am not an active
poster, I probably will not subscribe to this list (except for the times
when I need to send the occasional post), as I find it more convenient to
skim through the recent posts in the web archives. I am mentioning this
because I think not everyone here may realise that a large portion of the
lurker population of SL4 might not even be subscribed. The readership may
be substantially larger than the subscriber population. (I hope to be
reprimanded for improper behaviour if this point has been raised on the
list before - remember to read the archives before posting, everyone! ;)

Anyway, in the future I might start a discussion regarding the present
non-human sentient population of our planet and the singularity. I mean, it
shouldn't be just the humans to whom a superintelligent FAI will offer the
opportunity to transcend, etc. Singularitarians often seem to miss this
point in their writing, even though it is a plain fact that many non-human
animals are more sentient than human infants, for example. (I again hope to
be reprimanded if this topic has been dealt with exhaustively.)

I'll leave you with a link to a Prometheus Society article regarding the
intellectually exceptionally gifted. People who have felt like "An
Outsider" because of their intellect might find this piece of writing
somewhat entertaining (though it gets boring around the middle, there's
better material in it too):

http://www.prometheussociety.org/articles/Outsiders.html

--
Aleksei Riikonen - aleksei@iki.fi
