From: Eliezer Yudkowsky (sentience@pobox.com)
Date: Mon Jan 24 2005 - 14:10:10 MST
Ben Goertzel wrote:
>
> We know with great detail that Cyc, SOAR and GP (to name three AI
> systems/frameworks) will not result in an AI system capable of hard takeoff.
>
> And, we know this with MORE certainty than we know that no one now knows how
> to build a ladder to Andromeda.
If by GP you mean genetic programming, then natural selection coughed up
human intelligence starting with considerably less sophistication (and
certainly less direction) than at least some modern-day GP frameworks.
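
(Tangentially, for readers who haven't watched genetic programming up
close, a minimal sketch of the technique in Python follows -- an
invented toy, not any particular framework; the target function,
operator set, and parameters are all made up for illustration. It
evolves arithmetic expression trees toward f(x) = x^2 + x using
truncation selection and subtree mutation.)

    import math
    import operator
    import random

    # Internal nodes apply a binary operator to two subtrees;
    # leaves are the variable 'x' or a constant.
    OPS = [(operator.add, '+'), (operator.sub, '-'), (operator.mul, '*')]
    TERMINALS = ['x', 1.0, 2.0]

    def random_tree(depth=3):
        # Grow a random expression tree of bounded depth.
        if depth == 0 or random.random() < 0.3:
            return random.choice(TERMINALS)
        return (random.choice(OPS),
                random_tree(depth - 1), random_tree(depth - 1))

    def evaluate(tree, x):
        if tree == 'x':
            return x
        if isinstance(tree, float):
            return tree
        (fn, _sym), left, right = tree
        return fn(evaluate(left, x), evaluate(right, x))

    def fitness(tree):
        # Sum of squared errors against the target on sample points;
        # lower is better. Runaway trees get an infinite penalty.
        try:
            err = sum((evaluate(tree, x) - (x * x + x)) ** 2
                      for x in range(-5, 6))
            return err if math.isfinite(err) else float('inf')
        except OverflowError:
            return float('inf')

    def mutate(tree):
        # Subtree mutation: replace one randomly chosen subtree
        # with a freshly grown one.
        if not isinstance(tree, tuple) or random.random() < 0.2:
            return random_tree(depth=2)
        op, left, right = tree
        if random.random() < 0.5:
            return (op, mutate(left), right)
        return (op, left, mutate(right))

    def show(tree):
        # Render a tree as an infix expression for printing.
        if not isinstance(tree, tuple):
            return str(tree)
        (_fn, sym), left, right = tree
        return '(%s %s %s)' % (show(left), sym, show(right))

    def evolve(pop_size=200, generations=40):
        population = [random_tree() for _ in range(pop_size)]
        for _ in range(generations):
            population.sort(key=fitness)
            survivors = population[:pop_size // 4]  # truncation selection
            population = survivors + [mutate(random.choice(survivors))
                                      for _ in range(pop_size - len(survivors))]
        return min(population, key=fitness)

    if __name__ == '__main__':
        best = evolve()
        print('best:', show(best), ' fitness:', fitness(best))

Real GP frameworks add crossover, typed trees, bloat control, and so
on; the point is only that the basic generate-evaluate-select loop is
this simple, and natural selection got by with less.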
But I suppose that two out of three ain't bad for the objective calibration
of absolute certainty. It's more or less the standard experimental result
on tests of human overconfidence.
--
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence