From: Gordon Worley (firstname.lastname@example.org)
Date: Mon Sep 23 2002 - 19:03:08 MDT
On Monday, September 23, 2002, at 05:42 PM, Samantha Atkins wrote:
> I don't think the transition guide is the problem. What may be *if* we
> get to Singularity and *if* it is Friendly is not what I am most
> concerned about today or in the immediate future.
If there is no Singularity and there is no Friendly AI then humans are
as good as dead. The kind of changes that you want to see are only
partially achievable at best. Making the populace more Rational is not
merely a memetic battle; you have to get people to actually fix their
minds on their own. Even then there is only so much fixing that can be done.
Maybe you'll figure out how to succeed where Jesus and Siddhartha and
Gandhi failed. More likely you won't.
This is not to discourage you. If you're successful enough before the
Singularity maybe you can save enough people that you did better than if
you contributed more to creating the Singularity. You have a better
idea than I do of what you are capable of and what you can contribute
where. I just hope that you will make the best choice.
--
Gordon Worley
http://www.rbisland.cx/
email@example.com
PGP: 0xBBD3B003
"Man will become better when you show him what he is like." --Anton Chekhov
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:40 MDT