From: Gordon Worley (redbird@mac.com)
Date: Wed Aug 13 2003 - 08:52:25 MDT
On Wednesday, August 13, 2003, at 12:24 AM, Nick Hay wrote:
> Sorry, that wasn't a clear question. Of course there are multiple AGI
> efforts,
> and people probably won't want to shut down "their" project to work on
> "someone else's".
>
> Let me rephrase: Should we try and synchronise a bunch of promising AGI
> projects to ensure a community of them will exist at any one point?
> Should we
> fork existing projects for diversity? Should AGI projects try to
> develop a
> community of AGIs rather than a single one?
Many AGI projects are, in my opinion, a bad idea. Each one is more than
another chance to create the Singularity: each one is also a chance for
existential disaster. Even a Friendly AI project carries a significant
risk of a negative outcome, because Earth has no AI experts. Rather, we
have a lot of smart people flopping around, some flopping in the right
direction more than others, hoping they'll hit the right thing. But no
one knows how to do it with great confidence. It could be that one day,
10 or 20 years from now, the universe just doesn't wake up because it
was eaten during the night.
Each AGI project is a chance for failure. Even if you manage to create
a project that has a chance of creating the Singularity, it has a good
chance of going wrong. And just one killer AI is enough: you can't
suppress it with 10 other `good' AIs; the one killer wipes out
everything in a couple of minutes, before anybody can respond. It's
over before it even begins.
We should try to limit our points of failure, not increase them. The
fewer AGI projects, in my opinion, the better, because we will be able
to focus more resources (intelligent people, money, etc.) on a single
effort rather than spreading them thin across a lot of half-assed
projects that, together, have a greater chance of existential disaster
than one well-backed project.
--
Gordon Worley <redbird@mac.com>
http://www.rbisland.cx/
PGP: 0xBBD3B003
"It requires a very unusual mind to undertake the analysis of the obvious." --Alfred North Whitehead
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:42 MDT