From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Fri Jul 27 2001 - 23:12:34 MDT

Evan Reese wrote:
>
> It is certainly a hell of a lot more interesting than this uninspired
> fear-based seed AI thing.

Note to self - remember to maintain the balance, and not to leave out the part about having fun.

I am fundamentally an optimist. I think that humanity has a happy future ahead of it. But I'm a very nervous optimist. I think that this future has to be protected. Not just "protected", in fact, but implemented.

The quest for the Singularity is not fear-based. I see a successful and beautiful Singularity as the ordinary path for humanity to take, and the other paths as the distortions. I'm not afraid of what will happen, but of what won't. So I'm fundamentally an optimist, but a very nervous optimist.

Transcendence via successfully Friendly seed AI or persistently altruistic upload should take us to exactly the same place. Seed AI is just faster, that's all.

I've definitely toned it down *much* too far. Go read "Staring into the Singularity" ( http://sysopmind.com/singularity.html ). If you still think that's uninspired, or fear-based...
-- -- -- -- --
Eliezer S. Yudkowsky http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:37 MDT