From: Jimmy Wales (jwales@bomis.com)
Date: Wed Aug 01 2001 - 12:27:35 MDT
Joaquim Almgren Gàndara wrote:
> Seriously speaking, you didn't address the other possibility. What if
> it needs to be seven times as smart as a human in order to improve its
> own code? Let's assume that there is no codic cortex. Let us also
> assume that Ben or Eli manage to create a human-level AI. What if it
> looks at its own code, just goes "Oh, wow, you've done some really cool
> stuff here" and then ~can't~ improve the code? If it takes two or more
> ~intelligent~ people to create an AI equivalent to the ~average~
> human, what's to say that the AI can create a ~trans-human~ AI? Isn't
> that a leap of faith?
At this point, Intel starts churning out billions of these things and they
all get to work in thousands of different areas of science, quickly sharing
results that they find with each other.
If even billions and billions of scientists, working together 24/7 in an
incredibly high-speed networked fashion, can't achieve anything more intelligent
than what we were able to build with only several million scientist man-hours,
I'll be very surprised. Wouldn't you?
--Jimbo
--
*************************************************
*           http://www.nupedia.com/             *
*    The Ever Expanding Free Encyclopedia       *
*************************************************
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:37 MDT