From: DarkVegeta26@aol.com
Date: Fri Mar 01 2002 - 13:07:47 MST
WARNING: Relatively long post. Only read if you care.
Thank you, Eliezer, for striving to maintain a reasonable level of sanity on this apparently insanity-prone list. I've read a lot of crazy misinterpretations of the Singularity concept, but really, I must say this one confounded me:
<snippity>
Now that Technological Singularity has arrived in the form of http://www.scn.org/~mentifex/mind4th.html -- Robot Seed AI -- you all deserve this big Thank_You for your successful work. If you doubt the Grand Claim of the arrival of the Singularity, consider and observe the current ramp-up of the Artificial Mind.
<snip.>
And to Jason Duncan:
Welcome to the list. Save your sense of humor for later, when you understand the concepts behind this list and the tone that is acceptable. Please spell correctly, please use punctuation, and please do not type sentence fragments or meaningless posts. They hurt my eyes. But still, welcome to the list, and I look forward to insightful and intelligent posts from you in the future.
Ben vs. Eliezer issue:
<snippity>
Eliezer has some deeply-felt and deeply-thought-out but rather extreme attitudes about how we should approach bringing the Singularity about, on a personal and ethical level.
<snip.>
The Singularity itself is rather extreme, I dare say, and extreme thought on the personal and ethical level seems to be in order.
<snippity>
Eli = "100% Singularity dedication"
versus
Ben = "Powerful Singularity dedication plus some ordinary human attachments"
<snip.>
Ben, if you had become a Singularitarian *before* you fell in love with anyone or had kids, what would you have done? Is this an inappropriate question? Why must society place constraints on which questions are appropriate, when 'inappropriate questions' are often the most important questions of all? Not this one, necessarily; I'm just speaking in general. The "Would you kill your children for the Singularity?" one is a good example. It's probably best not to yak about such things, if only for memetic purposes.
<snippity>
However, I do not feel that the kind of 100% altruistic Singularity-dedication that you describe is necessary in order to work effectively and beneficially toward Real AI and the Singularity.
<snip.>
It probably helps. Also, a totally evolved fashion of thought to go along with a totally evolved idea seems good, though I'm not declaring anything resolutely, of course, and I have no problem with people who are working toward Real AI even if they aren't dedicated to the Singularity.
<snippity>
Under very many circumstances, I would decide that a Singularity that required me to murder my children wasn't the right kind to bring about anyway....
<snip.>
More interesting, totally hypothetical scenario: It is the year 2009, and Eliezer Yudkowsky and Ben Goertzel are sitting at the input terminals of a supercomputer, right next to each other, sipping cups of tea. They are watching the recursively self-improving AI's progress on a holo-display, and it is nearly at the level critical to achieve an unstoppable Singularity (it is working out the final logistics of nanotechnology, for example). Suddenly, an evil psychotic anti-Singularitarian busts down the door, holding a device with a big red button that says "Doomsday Device" on it. He declares that he has nuclear bombs rigged all over the planet, and he hates computers so much that he'd rather have everyone die than be 'forced to live inside of one'. Eliezer and Ben both have customized magnums in slick holsters at their sides. What do they do? WHAT DO THEY DO!
Point: Sometimes, when it's at the last second, ends do justify the means. The question is, what do you call the "last second"? I say that it's when the possibility of a negative memetic backlash becomes negligible... but it's really hard to say where that is.
*trips slightly on his black hooded cloak before pacing off*
Michael Anissimov
ComputroniumShockwave.cjb.net