From: James Higgins (jameshiggins@earthlink.net)
Date: Tue Jun 19 2001 - 13:42:03 MDT
True, many things could happen. But the only thing that would prevent an SI
from eventually being developed is something that completely wipes out all
intelligent life on earth. Thinking about this issue for some time makes me really
believe that there are only 2 stable states of intelligence: non-existent
and SI. After developing sufficient intelligence, it has taken humanity an
extremely short time to progress to the point where it is nearly feasible
to create an SI. Thus, it is reasonable to believe that any sufficiently
intelligent society would also progress to the same point. Assuming, of
course, that curiosity and ambition (or other stimuli that cause them to
develop technology) are present in the society.
Of course, this further leads me to believe that either we are the only
intelligent beings in the universe or SIs already exist.
Unfortunately, I don't believe we have any chance of really creating a
"Friendly AI". Eliezer pointed out that even his nuclear black box
scenario must have missed something that would be obvious to an SI but not
to us (he used Neanderthal for comparison). So to think that we could
really create "Friendly" AI when we can't even devise a safe method to talk
with a questionable SI is ludicrous. We will miss something, which will
either cause the SI to shed its Friendly nature (not to say that it would
be hostile) or cause it to be deranged in some manner. Personally, I am
more worried about a deranged FAI being developed than a Non-Friendly AI.
At 01:35 AM 6/19/2001 -0700, Samantha Atkins wrote:
>You missed my point. This dip did wipe out a lot of high-tech
>companies, including at least one strong AI effort, and halted
>the creation of many more. Sometimes I wonder if it was by
>accident or part of a ploy to slow things down for a bit
>longer.
>
>Also, I think it is a serious mistake to think that
>technological singularity is inevitable. A serious cultural
>revolution, a world war, America being overrun by a repressive
>fundie government, changing the code of the internet enough to
>make it a tool of massive regulation and oppression - these and
>other things could wipe out a lot of our most cherished dreams
>and assumptions for at least our lifetimes and that of our
>children.
>
>
>- samantha
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:36 MDT