RE: How hard a Singularity?

From: James Higgins (jameshiggins@earthlink.net)
Date: Thu Jun 27 2002 - 11:32:04 MDT


At 09:51 AM 6/27/2002 -0500, Stephen Reed wrote:
>On Thu, 27 Jun 2002, Ben Goertzel wrote:
>
> > As for the trustworthiness and ethical-ness of the US military, I guess
> > we're not likely to come to agreement, and there's no use to pollute SL4
> > with a long discussion of military history from 2 perspectives.
>
>Agreed.
>
> > But if the progress from here to the human-level-AI phase is slower than
> > I'm hoping/suspecting, you may be right. The US military has been by far
> > the greatest funder of AI so far.
>
>Yes, this summarizes my position:
>
>1. I believe that the current government institutions funding AI are
>sufficient to manage Seed AI development - and I trust them based upon
>personal experience and observation.

The problem, as I see it, has nothing to do with trust, honor, or morality;
only purpose. I strongly believe that the purpose a government, or even a
company, would have for an AGI is incompatible with the concept of
Friendliness. By nature such entities seek to promote and protect
themselves above all others. Governments, at least, consider violence
acceptable when they can't get the other party to agree through
debate. Attempting to create an AGI with these capabilities and goals is
highly likely to fail. The real danger is that it wouldn't necessarily
fail by not working; it could very easily fail by working but not ending
up Friendly. In that case no one (at least no human) would benefit, and
most likely every human would suffer greatly. Because this is the nature
of governments and companies, I don't expect they would realize any of
this (a few individuals might, but by then the project would already be
in motion and it would be too late). Do you see my point on this
issue? Will you at least give it serious thought and consider what might
happen if this were truly the case?

James Higgins


