Re: Investing in FAI research: now vs. later

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Wed Feb 20 2008 - 14:51:44 MST


Matt Mahoney wrote:
>
> I think that both sides can agree that a singularity will result in the
> extinction of humans in their present form and their replacement with
> higher-level intelligence. Where they disagree is whether this is good or
> bad. A rational approach does not answer the question.

Both here and on the AGI list, you seriously have no idea what other
people are thinking. Consider reading what they are saying, rather
than making stuff up.

-- 
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence
