From: Matt Mahoney (matmahoney@yahoo.com)
Date: Fri Mar 13 2009 - 19:33:03 MDT
> Lastly, there is the issue of impact upon the future of humanity. ...
> It is bad because the human mind (at least my mind) finds it hard to
> cope with the immense cognitive dissonance that is created by this
> weight of responsibility, and the implication that there is a
> significant chance that the human race will be wiped out by someone's
> uFAI project.
1. AGI won't be developed in isolation. It will be an internet that keeps getting smarter, making it harder to know when you are talking to a human.
2. Everyone, including criminals, will have access to AGI, just like everyone has access to email and Google.
...unless someone can show me a model of a self-improving intelligence in a box rushing past the collective intelligence of humanity, I think the focus of our attention should be the collective intelligence of humanity.
-- Matt Mahoney, matmahoney@yahoo.com
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:04 MDT