From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Mon Sep 04 2000 - 23:04:03 MDT
Samantha Atkins wrote:
>
> Even granted that this Power is a much higher sentience, I still
> sometimes feel as if I am betraying humankind, betraying my own primary
> motives, in working to bring it about. How do the rest of you deal with
> this? What am I missing?
>
> I know that the Singularity is eventually inevitable for some
> intelligent species, and inevitable for us barring major disaster or
> some totally unforeseen bottleneck. But how can I be in a hurry to
> bring it about and still claim I work for the good of humanity?
As far as I can see, you've just answered your own question. It *is*
inevitable. This is a question that we, as a species, have to confront at
some point. The sooner we confront it, the less chance we have of being
wiped out by something else first, like nuclear war or military-grade
nanotechnology.
The moral argument for building a Power for its own sake only argues for
building a Power sometime within the next few million years. The reason I
want to do it tomorrow is, one, because there are 150K deaths every day we
wait, and two, because an earlier Singularity increases the chance that
humanity survives. I have no moral qualms about either of those goals, and
there doesn't seem to be a conflict of interest between them.
-- -- -- -- --
Eliezer S. Yudkowsky http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence