From: Bryan Bishop (kanzure@gmail.com)
Date: Mon Oct 12 2009 - 08:57:51 MDT
On Mon, Oct 12, 2009 at 9:30 AM, Thomas Buckner wrote:
> This is why such discussions get sniped by the moderator. The whole sl4 forum is
> meant to further ideas about how to make sure the AGI (Artificial General Intelligence)
> is Friendly, i.e. that it doesn't do badly by us, even unintentionally. It's an insanely
> hard problem and some very smart people disagree about the best approach.
"... The SL4 mailing list is a refuge for discussion of advanced
topics in transhumanism and the Singularity, including but not limited
to topics such as Friendly AI, strategies for handling the emergence
of ultra-powerful technologies, handling existential risks (planetary
risks), strategies to accelerate the Singularity or protect its
integrity, avoiding the military use of nanotechnology and grey goo
accidents, methods of human intelligence enhancement, self-improving
Artificial Intelligence, contemporary AI projects that are explicitly
trying for genuine Artificial Intelligence or even a Singularity,
rapid Singularities versus slow Singularities, Singularitarian
activism, and more."
I'm ready to leave if we're all supposed to be on board the FAI train.
Is the sl4.org description wrong, or is it not representative of what
this list is actually about?
- Bryan
http://heybryan.org/
1 512 203 0507