From: James Rogers (firstname.lastname@example.org)
Date: Wed Aug 13 2003 - 11:22:44 MDT
Gordon Worley wrote:
> Many AGI projects are, in my opinion, a bad idea. Each one is
> one more chance to create the Singularity. Each one
> is a chance for existential disaster.
Heh. That is a rather optimistic view of most AGI projects. Virtually
all "AGI projects" are something to keep monkeys busy and have no particular
relevance to AI, the Singularity, or anything of the kind. They like to think
they do, but that belief isn't grounded in fact.
I think it is safe to say that, statistically, the vast majority will never
present a credible danger of producing AGI. In all likelihood there is
only one pragmatic solution to AGI, surrounded by a space of credible close
approximations. Looking at the current project landscape, this means that
you will never have to worry about more than one or two projects. The problem
therefore boils down to project selection, and studious
individuals can probably make some solid guesses in that regard.
So I say bring them on. Let as many people start AGI projects as want to. It
won't amount to much in general, and you can't possibly tarnish the image of the
field more than it already has been.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:42 MDT