Re: How to make a slave (was: Building a friendly AI)

From: William Pearson (wil.pearson@gmail.com)
Date: Mon Nov 26 2007 - 16:40:36 MST


On 26/11/2007, Jeff Herrlich <jeff_herrlich@yahoo.com> wrote:
> The problem here, John, is that you don't understand anything about what you
> are talking about. You are anthropomorphising the living hell out of the AI.
> Your internal conception of AI is now in a FUBAR condition. Do you
> understand that if we don't direct the goals of the AGI, it is a virtual
> *CERTAINTY* that humanity will be destroyed; and that the AGI will likely be
> stuck for eternity pursuing some ridiculous and trivial target?

While I don't agree with John's argument, this "fact" can't remain
unchallenged. Do you have some secret theory of intelligence that
allows you to make such statements?

I currently think any adult brain (silicon or otherwise) will be
fairly opaque to minds of the same complexity. I think intelligence
requires experimental self-programming, and that many hacks and weird
dependencies between brain modules will develop as a mind comes to
maturity, even if a baby brain starts out conceptually simple. This
would make self-improvement slow and painful. Let's say parts of the
AI come to rely on the natural response times of their components for
timing; I have seen work suggesting that neurons are used this way in
brains (it may be far more efficient than a centralised clock). Then
speeding up the hardware may throw those timings out of sync if the
memory/logic is ported blindly, and likewise speeding up one
algorithm might throw it out of sync with other algorithms. An AI
would have to understand itself completely to get a speed-up from
faster hardware in the trivial fashion assumed in the exponential
growth of a singleton.
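
To make the timing point concrete, here is a toy sketch (purely
illustrative Python of my own; the module setup and delay numbers are
made up, not taken from any real design) of two components that stay
coordinated only because their response times happen to match:

# Two "modules" that stay in step only because their per-step delays
# happen to match. delay_a and delay_b are the natural response times.
def run(steps, delay_a, delay_b):
    mismatches = 0
    for k in range(1, steps + 1):
        b_wants_step_k_at = k * delay_b              # when B consumes A's k-th result
        a_steps_done = b_wants_step_k_at // delay_a  # A-steps finished by then
        if a_steps_done != k:  # implicit "one A-step per B-step" pairing broken
            mismatches += 1
    return mismatches

print(run(100, delay_a=10, delay_b=10))  # 0: delays match, nothing drifts
print(run(100, delay_a=5, delay_b=10))   # 100: only A "ported" to 2x hardware

Nothing in that code marks the matching delays as load-bearing, which
is the point: port one half to faster hardware without understanding
the whole, and the coordination silently breaks.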

I do not seek to convince you of this, only to show that your virtual
certainty is misplaced, unless you can prove that intelligence does
not need this sort of system. If it does need it, then societies,
rather than a singleton, are likely to be the general mode that
brings about the singularity (if it occurs).

  Will Pearson


