From: Matt Mahoney (matmahoney@yahoo.com)
Date: Wed Feb 20 2008 - 16:30:25 MST
--- "Eliezer S. Yudkowsky" <sentience@pobox.com> wrote:
> Matt Mahoney wrote:
> >
> > I think that both sides can agree that a singularity will result in the
> > extinction of humans in their present form and their replacement with
> > higher level intelligence. Where they disagree is whether this is good
> > or bad. A rational approach does not answer the question.
>
> Both here and on the AGI list, you seriously have no idea what other
> people are thinking. Consider reading what they are saying, rather
> than making stuff up.
Sorry, yes, there are people who think that a singularity will not result in
human extinction. But without brain augmentation, humans would be unaware of
it, living either in a simulation or in a world so different that we could not
call them human. With sufficient augmentation to observe the singularity, they
would bear little resemblance to humans as we know them now. That is what I
mean by extinct. I know people will disagree.
-- Matt Mahoney, matmahoney@yahoo.com