From: Matt Mahoney (matmahoney@yahoo.com)
Date: Wed Feb 20 2008 - 13:57:32 MST
--- "Peter C. McCluskey" <pcm@rahul.net> wrote:
> There appears to be a serious lack of communication between people who
> think we're doomed without FAI and the people who expect a diverse society
> of AIs. It appears that the leading advocates of one outcome can't imagine
> how anyone could believe the other outcome is possible. This appears to be
> a symptom of a serious failure of rationality somewhere. I wish I could
> lock the leaders of each side of this schism into a room and not let them
> out until they either reached agreement or came up with a clear explanation
> of why they disagreed. Presumably part of the disagreement is over the
> speed at which AI will take off, but that can't explain the certainty with
> which each side appears to dismiss the other.
I think both sides can agree that a singularity will result in the extinction of humans in their present form and their replacement with higher-level intelligence. Where they disagree is whether this outcome is good or bad, and a rational approach alone does not answer that question.
-- Matt Mahoney, matmahoney@yahoo.com
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:02 MDT