From: Michael Anissimov (anissimov@intelligence.org)
Date: Thu Nov 17 2005 - 01:37:47 MST
Phillip Huggan wrote:
> SIAI has chosen not to go open-source likely for a very good reason,
> so "AGI schools" are not feasible. Aptitude tests have their own
> biases. Couldn't one or more AGI tests be brainstormed and written by
> active AGI researchers or administrators to offer some measurable AGI
> skill ranking of AGI research candidates?
The following is one problem that FAI programming candidates might be
directed towards:
http://sl4.org/bin/wiki.pl?SimplifiedFAI
Although the focus of this problem is FAI, it is relevant to AGI as well.
With regard to all the posts on searching for candidate programmers:
the pickings are obviously relatively slim. Emil Gilliam has suggested
contacting high scorers in national math and programming competitions.
Note that A2I2, despite informing thousands of people in the
transhumanist community, has not yet (to my knowledge) received
responses to its queries for AGI programmers. At least one person I
know of, Matt Bamberger, started independent work on AGI this year.
Perhaps a few dozen qualified people worldwide began working on the
problem part-time in 2005. The easiest strategy seems to be simply to
point as many people as possible toward Singularity-related
literature. If, after reading an explanation of the whats and the
whys, the push toward the Singularity still seems unjustified or
unrealistic to them, then they are most likely a lost cause and will
not be contributing to the creation of AGI within relevant
timeframes. I agree
with Michael Vassar that we have an unfortunate tendency to overestimate
the average intelligence of human beings. The Internet allows us to
selectively isolate ourselves within relatively high-IQ communities.
--
Michael Anissimov
http://intelligence.org/
Advocacy Director, Singularity Institute for Artificial Intelligence