From: Samantha Atkins (email@example.com)
Date: Thu Jun 20 2002 - 22:04:00 MDT
Eliezer S. Yudkowsky wrote:
> is not about the merits of any one course of research, but a question of
> how we are to fulfill those moral obligations that devolve upon us when
> we become aware of the Singularity. I think that first we have to
> resolve the moral issue of "Can we speed up the Singularity? Do we have
> a moral obligation to try?" before considering the pros and cons of
> particular approaches. This is why I did my best to mention a broad
> spectrum of paths to the Singularity in my reply to Kurzweil,
> concentrating specifically on those research directions (brain-computer
> interfaces, integrative computational neurology) that Kurzweil has
> singled out as important.
The morality of the question is difficult to be fully clear about. It
is not at all clear whether an earlier Singularity is in fact a good
thing. It depends strongly on how the Singularity is shaped. If it
is shaped wrongly it could mean the deaths of all of us, and an SI born
within it may not survive long either.
On funding various types of research that would be useful to our fondest
goals, I am of two minds. In the marketplace, funding something at the
wrong time can simply waste one's resources to no great effect.
Kurzweil has a proven track record of funding new technology at a time
when it is accepted and generates adequate revenue. He has a track
record of innovation. I personally wish some of his innovations were
more accessible to other researchers and efforts, but I have to give him
credit.
> If Kurzweil chooses to become an activist on behalf of *any* technology
> that he believes is the most critical, the most neglected link on the
> path to Singularity, I will applaud the ethics and responsibility of
> that moral decision *before* raising any issues with his choice of
> technology. If Kurzweil decides to throw his full efforts behind
> implementing the Singularity, others will follow in his footsteps, with
> different choices of key technology, and whatever real critical links
> exist will be funded eventually.
I see him building awareness and momentum now and stepping back, for
now, from taking any technological lead. Perhaps he believes more can
be accomplished by him at this point in that capacity, and/or that
technology is proceeding apace in any case. I am a little worried to
see you or anyone casting moral aspersions on another if their actions
are not what you would have done in their place. I don't think that is
fair.
> What I fear is a world in which people
> hear about the Singularity and use it to rationalize whatever they were
> already doing. What I fear is a world in which Kurzweil, as the first
> presenter of the Singularity meme, inadvertently inoculates his audience
> with the idea that we can derive comfort from the Singularity concept
> without *doing* anything to achieve the actual Singularity. If this
> happens, spreading the word about the Singularity will do nothing to
> achieve the humanistic goal that is the Singularity itself, and the
> people who insist that the Singularity is a religion will be right.
Producing and forwarding the meme is no small contribution. If the meme
catches fire the technologists and investors will follow.
At this point it is not at all clear that "humanistic" is applicable to
a Singularity that might well spell the doom of all humans. And I don't
know what the Singularity-as-religion charge has to do with this
conversation one way or the other.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:39 MDT