Re: [sl4] Is there a model for RSI?

From: Krekoski Ross (rosskrekoski@gmail.com)
Date: Sun Jun 15 2008 - 20:53:41 MDT


Oh, I like this post!

On Mon, Jun 16, 2008 at 4:18 AM, Matt Mahoney <matmahoney@yahoo.com> wrote:
>
>
> Arguments in favor of RSI:
> - Humans can improve themselves by going to school, practicing skills,
> reading, etc. (arguably not RSI).

Yes, not exactly RSI; I would just say it's assimilating external complexity.

>
> - Moore's Law predicts computers will have as much computing power as human
> brains in a few decades, or sooner if we figure out more efficient
> algorithms for AI.

I've always wondered about Moore's law. It may remain exponential only as
long as computing-architecture complexity does not rival some function of
collective (or relevant) human complexity. Of course, the two are not
strictly distinct.

>
> - Increasing machine intelligence should be a straightforward hardware
> upgrade.

I'm not sure. We currently do not really treat calculations per second as a
determiner of IQ; a faster machine is not necessarily a qualitatively
smarter one. Also keep in mind that the interior complexity of our brains is
describable with very low complexity (some subset of what's in our DNA, plus
the contextual complexity of the cellular environment) relative to acquired
knowledge. I suppose, though, that the term 'hardware upgrade' isn't really
all that specific in that sense. There should be a correlation between
intelligence and some function of the density of interconnections in the
hardware and the total complexity of the information stored in it. Again,
though, as I've mentioned before, I don't know if distinctions such as
'hardware/software' or 'RAM/HD' or even 'storage/processing' that we often
make here are all that applicable to our own minds.
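
To put that hand-wavy claim a bit more concretely (my own rough notation, in
Kolmogorov-complexity terms, not anything Matt wrote), the point is just an
upper bound of the form

  K(brain architecture) <= K(relevant genome subset) + K(cellular context) + O(1)

where the right-hand side is on the order of the 10^7 bits Matt cites below,
which is tiny compared to the ~10^9 bits of acquired knowledge that
architecture ends up holding.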

>
> - Evolution produced human brains capable of learning 10^9 bits of
> knowledge (stored using 10^15 synapses) with only 10^7 bits of genetic
> information. Therefore we are not cognitively limited from understanding our
> own code.

If you mean our 'code' as in our genetic information, I would agree with
you. If you mean our 'code' as in the sum total of genetic plus acquired
knowledge, I may have to disagree. I cannot simulate myself in my own brain,
and even if I could, it wouldn't be a real simulation, since I would also
need to simulate myself simulating myself, and so on.
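
As a toy illustration of that regress (purely my own sketch in Python, not
anyone's actual model of a brain): an agent that tries to hold an exact
model of itself needs that model to contain its own model, and any finite
budget of levels runs out before the regress bottoms out.

class Agent:
    def __init__(self, depth=0):
        self.depth = depth
        self.self_model = None  # an exact model would be another full Agent

    def simulate_self(self, budget):
        # Each level of exact self-simulation needs another level inside it.
        if budget == 0:
            return ("gave up at depth %d: still need to simulate myself "
                    "simulating myself..." % self.depth)
        self.self_model = Agent(self.depth + 1)
        return self.self_model.simulate_self(budget - 1)

print(Agent().simulate_self(budget=5))
# prints: gave up at depth 5: still need to simulate myself simulating myself...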

Ross


