From: Philip Sutton (Philip.Sutton@green-innovations.asn.au)
Date: Mon May 24 2004 - 10:54:54 MDT
Hi Ben,
> So far as I can tell, the processes leading to superintelligent AI are
> moving much faster than the processes leading to the global maturation
> of humanity, leading me to suspect that in fact the former will come
> first.
At one level that's surely correct. There's no doubt that the technology
development is moving ahead apace. But the trouble is that we don't
just want AGI - we want friendly AGI. In a world of 6 (and rising)
billion people who are not always all that cooperative and friendly,
there's a high chance that not-very-friendly human influences will
mean the first AGIs to gain access to huge computer platforms are not
designed or brought up to be friendly.
So in that sense I think Jef has a point that we need to get our human
act together very soon.
But my guess is that getting our act together amongst humans
need not be a 1000-year project.
I've been reading a book by a nonviolence researcher, Gene Sharp,
whose works have been used in bringing down dictatorships in Poland,
Serbia and several other places. The US spent $1000m bombing the
shit out of Serbia (which, given what the Serbian military had done to
their neighbours and to the local Muslim population, might have only
seemed fair). But the investment failed to dislodge Milosevic and
alienated many people. Then some parties in the US and Europe put
about $15m into supporting a nonviolent movement and the dictator
was gone within a year.
My reason for rabbiting on about all this is that when sensible ideas
are applied to the change process, humans can achieve a lot more
than many of us expect. A bit more investment of time and money in
these human development directions would yield great dividends. And
modest-level AGIs might be able to help. Doing that might actually
improve the prospects of creating friendly AGI.
Cheers, Philip