Re: Ben vs. Ben

From: Brian Atkins (brian@posthuman.com)
Date: Sat Jun 29 2002 - 23:45:45 MDT


Ben Goertzel wrote:
>
> The idea that Novamente has more potential to *ever* be smarter than
> Microsoft Word is *also* "just my opinion"... or rather, "just the opinion
> of me and the others who have studied the codebase"
>
> Can't you see that if the odds of a certain software system going superhuman
> are *sufficiently low*, then no protective measures are necessary, or even
> meaningful?

Certainly, although last I checked M$ Word wasn't self-modifying. That's
the point of all my devil's advocacy: if you get to the point where you
have running self-modifying code, you should already have plenty of
safety measures in place.

>
> I could give you a long list of other people with would-be-AGI systems:
> Peter Voss of A2I2, Pei Wang, Cyc,.... All these folks also have incomplete
> would-be AGI systems, and all these folks also assess that their systems
> have effectively no chance of going superhuman until much further coding
> work is done on them.
>
> I guess the reason you're pushing me on this issue, and not them, may partly
> be that you suspect I have a slightly higher chance of success than these
> guys. So I should be flattered.... I also think I have a higher chance of
> success than these guys. But my feeling that I have a higher chance of
> success than these others, though quite strong, is much, much weaker than my
> very solid knowledge that the current code base *cannot go superhuman*.
>

Actually, no, the reason is just that you're around and willing to talk
about it, which I do have to give you a lot of credit for. I'll be even
more impressed if you do eventually put out your "social policy" for
criticism well before you get your total codebase running. I believe that
any of them, or you, or us, or anyone else working on this stuff needs to
have concrete, well-documented plans for how to deal with these issues well
before they get near the point of actual testing. If one of them were here
instead of you, I'd be pressuring them just the same... hopefully they're
getting the message by lurking.

I am still concerned that your commercial focus may be causing you to
cut dangerous corners in the long run; hopefully you'll address that in
your eventual documentation. Have a good vacation! :-)

-- 
Brian Atkins
Singularity Institute for Artificial Intelligence
http://www.intelligence.org/


This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:40 MDT