From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Sun Jun 30 2002 - 11:54:52 MDT
Ben Goertzel wrote:
>
> And I don't see any way to make reasonably solid probability estimates about
> *any* of these risks... the risk of the human race bombing itself to
> oblivion OR the risk of a certain apparently friendly AI going rogue...
Sigh. Well, I just want to point out for the record that although it
certainly *sounds* very reasonable, wise, and open-minded to say that
your estimates are just intuitions and they could be wrong and there's
probably no way to get a good picture in advance, if you *go ahead and
do it anyway*, it's not really any less arrogant, is it?
-- 
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence