Re: Hiding AI research from Bad People was Re: OpenCog Concerns

From: J. Andrew Rogers (andrew@ceruleansystems.com)
Date: Tue Mar 25 2008 - 02:56:00 MDT


On Mar 25, 2008, at 1:12 AM, William Pearson wrote:
> On 25/03/2008, J. Andrew Rogers <andrew@ceruleansystems.com> wrote:
>> Non sequitur. And in any case, I do not assume a FOOM scenario;
>> indeed, I have a history of arguing quite the opposite. It does not
>> invalidate my original point, which was that nobody will be paying
>> attention until such a time that tripping over it is unavoidable
>> unless you put a lot of effort toward attracting attention. This
>> argument is not dependent on the speed of the scenario, though
>> outcome
>> distributions do vary a bit with scenario speed.
>>
> I'm missing something then, what do you think will make the
> government, "a day late" if someone apart from them creates AI? A day
> behind perhaps, but if you aren't expecting a discontinuity, then what
> exactly makes them late?

To put it simply, it is a Red Queen's race: for a variety of
technical reasons, no one is likely to catch you once a certain point
in development has been crossed, and that point should arrive fairly
early on and/or below the noise floor of notice. The curious thing is
that in an effectively closed system this holds with high probability
across a very broad range of takeoff speeds. Whether it takes a day
or a year, any interdiction will likely arrive too late, though for
slightly different predominant reasons in each case. It should be
obvious with some consideration, being a mixture of complexity
scaling, cognitive biases, and latency. It is not a *certainty*, just
probable. Even if a competitor copies an immature system, they
inherit its scaling characteristics, but any time delay imposed by
implementation or improvement carries extreme costs with respect to
catching the original system in terms of intelligence, such that
throwing hardware at the problem faster may not help significantly.
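As a rough illustration of that last point (this model and its numbers are my own assumption, not anything from the post): if both the original system and a copier improve capability exponentially at the same rate r, a copier starting after a delay d needs a hardware multiplier of at least e^(r*d) just to draw level. The hardware required to erase a fixed time delay therefore grows exponentially with that delay.

```python
import math

# Hypothetical sketch: both systems grow as C(t) = C0 * e^(r*t); the copier
# starts after `delay` with an h-fold hardware multiplier. Setting
# h * e^(r*(t - delay)) == e^(r*t) gives h = e^(r*delay), independent of t.

def hardware_needed_to_catch_up(r: float, delay: float) -> float:
    """Multiplier h that exactly compensates a head start of `delay` time
    units when both systems share the same exponential growth rate r."""
    return math.exp(r * delay)

# Assumed growth rate of 2.0 per year, chosen purely for illustration.
for d in (0.25, 0.5, 1.0, 2.0):
    h = hardware_needed_to_catch_up(2.0, d)
    print(f"delay {d:4.2f} yr -> need {h:8.1f}x hardware")
```

Under these toy assumptions, doubling the delay squares the hardware requirement, which is one way to read the claim that throwing hardware at the problem faster may not significantly help.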

Given the current landscape, I find it improbable that anyone will be
paying close attention until well after the critical point where
another competitor can enter the race with any probability of
success. By the time they try to steal your AI, they will no longer be
able to, unless you have expended considerable effort to make it a
possibility.

J. Andrew Rogers



This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:02 MDT