From: Sam Kennedy (sam_rk@bellsouth.net)
Date: Tue Apr 02 2002 - 22:21:52 MST
What's needed for true, singularity-quality AI isn't more people building a new AI engine. What's needed is a new innovation: the one innovation that will push AI over the edge we all await. It could also be an entire
system of innovations. What about this:
1) Make an AI system which can modify itself
2) Make a system which can judge how "human", or aware, another entity is
3) Use input from system 2 to improve system 1.
System 2 is the most important one, according to the enormous authority of my random thoughts. What needs to be built is basically the reverse of the Turing Test.
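The three-step loop above can be sketched, very loosely, as a hill climb. Everything concrete here is a made-up placeholder: the "judge" (system 2) is just lexical diversity of the output, and the "AI" (system 1) is just a mutable list of words that keeps whatever self-modifications the judge scores higher. A toy, not a proposal for how the real systems would work:

```python
import random

# Step 2: a stand-in "judge" that scores how aware-seeming an output
# is. Placeholder metric: fraction of distinct words in the text.
def judge(text: str) -> float:
    words = text.split()
    return len(set(words)) / len(words) if words else 0.0

# Step 1: a "self-modifying AI" reduced to a word list that mutates
# itself by swapping in words from a fixed (invented) vocabulary.
VOCAB = ["i", "think", "so", "am", "aware", "here", "now"]

def mutate(words: list[str], rng: random.Random) -> list[str]:
    out = list(words)
    out[rng.randrange(len(out))] = rng.choice(VOCAB)
    return out

# Step 3: feed the judge's score back to drive self-modification,
# keeping only mutations that don't lower the score.
def improve(words: list[str], steps: int = 200, seed: int = 0):
    rng = random.Random(seed)
    best, best_score = words, judge(" ".join(words))
    for _ in range(steps):
        cand = mutate(best, rng)
        s = judge(" ".join(cand))
        if s >= best_score:
            best, best_score = cand, s
    return best, best_score

start = ["i", "i", "i", "i"]          # maximally repetitive start
final, score = improve(start)
print(final, score)
```

The point of the sketch is only the shape of the loop: system 2's score is the sole feedback signal, so system 1 can only ever get as "aware" as the judge's metric is good, which is why system 2 is the hard part.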
-- Sam Kennedy

4/2/02 11:09:03 PM, Nathan Russell <nrussell@acsu.buffalo.edu> wrote:
>At 09:50 PM 4/1/2002 -0500, "Eliezer S. Yudkowsky" <sentience@pobox.com> wrote:
>
>>Starting on real AI takes programmers working together in the same physical
>>location, full time, and extended orientation sessions that would suck up
>>tremendous time if I had to repeat them independently for each recruit.
>
>I might point out that Linux was started by a full-time student who had met
>essentially none of the folks he ended up working with. It's still the
>case that the majority of the folks in kernel development are occupied
>elsewhere full-time, and meet only possibly at infrequent conferences, etc.
>
>Granted, AI is more complex than Linux, but how much so?
>
>Nathan
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:38 MDT