From: Matt Mahoney (matmahoney@yahoo.com)
Date: Sat Nov 29 2008 - 12:38:48 MST
--- On Sat, 11/29/08, Philip Hunt <cabalamat@googlemail.com> wrote:
> > All programming languages are Turing complete. You can
> > convert from any one to any other with a program that is
> > insignificantly small compared to the 10^17 to 10^18 bits of
> > knowledge needed for AGI.
>
> You don't need to explicitly put that much knowledge in; the human
> genome is about 7e8 bytes so clearly that is enough. Then you just let
> the AI learn until it is fully intelligent.
A baby AI still has to be trained to bring it up to the roughly 10^9 bits of long-term memory in a human adult. We consider this cost negligible because you only have to do it once, then make copies.

However, before these AIs can start working, each needs training customized to its job, for example knowledge of the roles and skills of its coworkers, customers, and vendors. Organizations are most efficient when the skill sets of their members have minimum overlap. Based on the cost of replacing an employee, I estimate that 1% to 10% of your knowledge is not known by anyone else, or 10^7 to 10^8 bits per person. You need to do this roughly 10^10 times, once for each human worker the AIs replace, for a total of 10^17 to 10^18 bits. Only a tiny fraction of this knowledge is on the internet. The rest has to be extracted from human brains at a rate of about 2 bits per second per person. That is what makes AGI expensive.
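To put rough numbers on that, here is a back-of-envelope sketch in Python. The figures are just the estimates from this thread (10^7 to 10^8 unique bits per person, ~10^10 people, extraction at about 2 bits per second per person), not measurements:

# Back-of-envelope cost of extracting job-specific knowledge for AGI,
# using the estimates in this post.

SECONDS_PER_YEAR = 3600 * 24 * 365

num_people = 1e10          # roughly one AI per human worker (assumption)
rate_bps = 2               # bits per second extracted per person

for bits_per_person in (1e7, 1e8):    # 1% to 10% of ~10^9 bits of adult memory
    total_bits = bits_per_person * num_people
    years_each = bits_per_person / rate_bps / SECONDS_PER_YEAR
    print("%.0e bits/person -> %.0e bits total, ~%.1f person-years each"
          % (bits_per_person, total_bits, years_each))

This recovers the 10^17 to 10^18 bit total above, and works out to something like 0.2 to 1.6 person-years of knowledge extraction per worker.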
> You are making the incorrect assumption that I have a design for AI. I
> don't. Although I do have a few ideas about how AI could be achieved.
Then why worry about programming languages at this point? People are most productive in the languages they already know. That's why I suggested natural language: it is probably 100 to 1000 times faster than any programming language for conveying the knowledge an AGI needs.
-- Matt Mahoney, matmahoney@yahoo.com