Re: An essay I just wrote on the Singularity.

From: Samantha Atkins (samantha@objectent.com)
Date: Fri Jan 02 2004 - 02:35:32 MST


On Wed, 31 Dec 2003 09:40:08 -0800
Michael Anissimov <altima@yifan.net> wrote:

> Robin, this is an interesting and entertaining essay! Congratulations
> on getting the motivation to write down some of your ideas and reasoning
> regarding the world-shaking issue of how humanity ought to approach the
> Singularity. I disagree with the way you present/argue some things
> though, so here I go with all the comments:
>
> 1. Why do you call Singularitarianism your "new religion"? I know it's
> basically all in jest, but thousands of people have already
> misinterpreted the Singularity as a result of the "Singularity =
> Rapture" meme, and I don't think they need any more encouragement. I
> would personally prefer that Singularitarians have the reputation of
> being extremely atheistic and humanistic.

I don't see what belief in God or gods has to do with it, actually. Calling it "extremely atheistic" is as out of place as considering it a religion or an aspect of one. The term "humanistic" implies even less when a massively greater-than-human intelligence is partly what makes possible the transcending of much of what we now consider characteristic of "human".

>
> 2. Like Tommy McCabe, I too have a problem with the "FAI means being
> nice to humans" line. This gives a lot of people the mistaken
> impression that FAI is going to be anthropocentric, unfortunately.

Well, the first-level goal is that it be "nice" to humans, lest we commit a very intricate form of species suicide in giving birth to it. Friendliness is not necessarily limited to humans just because it includes "being nice to humans".

> Anyway, congratulations again on writing something. Politics is indeed
> largely irrelevant. This becomes clear around high SL2, as a matter of
> fact. At the very least, politics is something we cannot influence
> unless we pursue high-leverage goals, like devoting our lives to
> politics, or, far better yet, building a Friendly AI.
>

If the political events between now and takeoff result in an extremely oppressive government that shuts down the research and much of our communication, or if the world devolves into universal war, terror, and terror of terror, then politics is very much relevant. A certain amount of freedom and stability is required. Also, spinning what technology can do to solve current crises, given the vision and will, may well bring us more stability and more funding.

Some days it feels as if FAI can be a rationalization for not wanting to deal with the messy human stuff. It is the ultimate geek-out. We create the seed of that which is massively smarter than humanity and massively less fucked up, and it fixes everything to the degree it can be fixed, thus saving us the trouble. Sweet! But some days it doesn't smell quite right.

- samantha
