Re: Threats to the Singularity.

From: Aaron McBride (amcbride@jps.net)
Date: Thu Jun 13 2002 - 12:15:04 MDT


At 10:06 AM 6/13/2002 -0700, you wrote:
>At 10:47 AM 6/13/2002 -0500, you wrote:
>><clip>
>>
>>#2. One of a vast number of midnight hackers who have been studying AI
>>design over the internet hacks together the first real AI. The only way
>>for one of the AI development teams to beat the hacker is to do it before #1.
>> <clip>
>>Mike.
>>
>
>
>I was under the impression that this is about as likely as someone being
>able to "hack together" a 747 in their back yard. Building Real AI is
>probably not simple (we'd likely have stumbled on it by now if it
>were). I doubt anyone will be hacking together a Real AI on their own
>anytime soon. Anyone have estimates of the complexity of a seed AI?
>
>-Aaron
>

Just to clarify -- by "anytime soon" I meant the next 5-10 years. Beyond
that, I expect the probability of humanity destroying itself to rise
quickly.

Also... I assume it took more than 10 (hu)man-years of effort to create
either MS Word or MS Windows (I'm guessing at the number), so neither
could be hacked together by one person within the next 5-10 years: a lone
developer can only supply 5-10 (hu)man-years in that time. If a seed AI
is at least as complex, the same limit applies to the midnight hacker.

The key then is to:
a) Have groups larger than one person working on AI projects (the larger
the better, as long as everyone stays effective).
b) Build better software engineering technology, so that we can manage
the complexity of a seed AI.

I'm sure I'm just echoing what's been said on this list for years. To
return to the topic... the threats to the Singularity include anything
that interferes with 'a' or 'b' above.

-Aaron


