From: Chris Cooper (email@example.com)
Date: Fri Apr 06 2001 - 17:12:45 MDT
"Christian L." wrote:
>In this context, "explaining these ideas to laypeople" is not very desirable
IMO. The less the general public knows, the more likely we are to reach the
Singularity. We don't need a huge number of followers. We only need a few
brilliant programmers.<
And you're gonna teach a seed AI about Friendliness? Good Luck, pal!
Seriously, I think that you do have some very valid points about the public's
inability to accept sudden technological change. However, that's due to
ignorance on their part. Keeping them ignorant is a perfect example of
expecting two wrongs to make a right. Also, when the public hears scientists
say, "shut up and eat this, it's for your own good," who can blame 'em for
being suspicious?
>What we are planning on this list is to create a machine that will literally
take over the world.
Do you really expect the general population to like this idea? I think not.
We are talking about The End of the World As We Know It.
I expect about 99+ % of the population to be opposed.<
Didn't Hitler give this speech right before he invaded Poland? (KIDDING!!!) Of
course the general public isn't going to swallow this if it's served up like
that! Convince them that the Singularity means an immeasurably better existence
for everyone on the planet, and I think that they might like it a bit more.
>You are free to call me an elitist, since I feel that this decision is better
made by the elite (us) than by the uninformed masses.<
You may very well be right, but I feel VERY uncomfortable with that statement,
especially from someone who is supposed to program an AI not to act selfishly.
How can you teach these concepts if you don't believe in them yourself?
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:36 MDT