From: Thomas Buckner (tcbevolver@yahoo.com)
Date: Wed Aug 24 2005 - 16:28:55 MDT
--- Phil Goetz <philgoetz@yahoo.com> wrote:
> The fear of UFAIs is based on the idea that they'll be able to
> outthink us, and to do so quickly.
>
> "More intelligent" thinking is gotten by adding another layer of
> abstraction onto a representational system, which causes the
> computational cost of reasoning to increase exponentially in the
> number of things being reasoned about. Or, by adding more knowledge,
> which has the same effect on tractability.
>
> By limiting the computational power available to an AI to one or two
> orders of magnitude less than that available to a human, we can
> guarantee that it won't outthink us - or, if it does, it will do so
> very, very slowly.
>
> There are many cases where someone has come up with a new algorithm
> that has lower computational complexity than the previously known
> algorithm, but I don't think any algorithm will be found for general
> intelligence that doesn't have the property that exponential
> increases in resources are needed for a linear increase in some
> IQ-like measure.
>
> If the AI gets out and is able to harness the computational power of
> the internet, that would be different. But within its box, it's going
> to remain at or below the order of magnitude of intelligence dictated
> by its computational capacity.
>
> - Phil Goetz
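[Editor's illustration: Goetz's claim that exponential resource increases buy only linear gains in an IQ-like measure can be made concrete with a toy model. All the numbers here (the 1e16 "human compute" figure, the base, the IQ scale) are arbitrary assumptions of mine, not anything from the thread; the point is only the shape of the relationship.]

```python
import math

def max_iq(compute, base=10.0, human_compute=1e16, human_iq=100.0):
    """Toy model (editor's assumption, not Goetz's math): suppose the
    compute required grows exponentially in an IQ-like measure, i.e.
        compute = human_compute * base ** ((iq - human_iq) / 10)
    Solving for iq given a fixed compute budget gives a logarithm:"""
    return human_iq + 10.0 * math.log(compute / human_compute, base)

# Under this model, capping the AI's compute two orders of magnitude
# below the (assumed) human figure caps it modestly below human level,
# which is the guarantee Goetz is gesturing at:
print(max_iq(1e14))  # 80.0
print(max_iq(1e16))  # 100.0
```

The log-shaped curve is why, on this assumption, an in-box AI can't leap far ahead: doubling its hardware only nudges the measure up by a constant.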
Yes, it occurred to me a while back that one could set up a
carrot/stick system simply by giving the AI more computational power
or taking it away; presumably it would only be released when it
behaved according to some standard, such as maxing out its rewards
(at whatever ceiling had been preset) and incurring no demerits for
some length of time. But I doubt such a simple system could guarantee
a Friendly result. A UFAI might simply become a 'model prisoner' and
wait you out.
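[Editor's sketch: the carrot/stick scheme Buckner describes, written out as toy Python. Every name, constant, and policy here is my own invention for illustration; the second function also makes his objection concrete, since its "release" test is exactly the condition a patient 'model prisoner' could satisfy without being Friendly.]

```python
def update_budget(budget, reward, demerits, floor=1e12, ceiling=1e15,
                  carrot=2.0, stick=0.5):
    """Toy carrot/stick schedule: scale the AI's compute budget up on
    rewarded behaviour, down per demerit, clamped to a preset floor
    and ceiling (the ceiling being the 'max reward' in the proposal)."""
    if demerits > 0:
        budget *= stick ** demerits   # stick: halve per demerit
    elif reward > 0:
        budget *= carrot              # carrot: double the budget
    return max(floor, min(ceiling, budget))

def eligible_for_release(history, ceiling=1e15, clean_steps=100):
    """Release test: budget maxed out and zero demerits for some length
    of time. A UFAI playing 'model prisoner' passes this trivially,
    which is the weakness Buckner points out."""
    recent = history[-clean_steps:]
    return (len(recent) == clean_steps and
            all(b == ceiling and d == 0 for b, d in recent))
```

A deceptive optimizer only has to behave until `eligible_for_release` returns True, so the check measures patience, not Friendliness.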
Tom Buckner
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:52 MDT