From: Charles D Hixson (charleshixsn@earthlink.net)
Date: Tue Aug 29 2006 - 14:42:45 MDT
Tennessee Leeuwenburg wrote:
> John K Clark wrote:
> ...
>>
>> A very powerful AI is the very definition of the Singularity.
> Not on my understanding of it. The Singularity implies a continual
> improvement. A very powerful AI might choose not to continually
> improve itself. A very powerful AI might not be beyond our understanding.
>
> Cheers,
> -T
A very powerful AI isn't the definition of the Singularity, but it's one
route to it, and probably the most likely one.
The Singularity doesn't imply a continuously increasing rate of change
for all time. It does imply that the rate of change increases to a very
high level (relative to today) at some point. Probably 1,000 times the
current rate of change would effectively be a singularity as far as
humans are concerned. Nobody can really predict what things are like
on the other side of the "Schwarzschild radius" analog, so no firm
beliefs are reasonable. Perhaps we all revert to the stone age and
start over. You just can't say. That's not how we normally predict,
but the very definition of "The Singularity" implies that we can't make
reasonable predictions.
What we CAN do is attempt to establish what we hope will be favorable
initial conditions. We don't know whether the process is chaotic or not,
so there are lots of different approaches. Friendliness is one
reasonable approach. I personally believe that attempts should be made
with several slightly differing definitions of Friendliness, in the hope
of creating attractors if the process should turn out to be chaotic.
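For a rough feel of what "chaotic" would mean here, consider the
standard logistic map. This is a toy system, not a model of AI
development; the parameter r = 3.9, the starting point 0.49, and the
step counts are arbitrary illustrative choices:

# Minimal sketch: the logistic map x -> r*x*(1-x) in its chaotic
# regime (r = 3.9). Two trajectories that start almost identically
# diverge completely within a few dozen steps.
def logistic_trajectory(x0, r=3.9, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.49)
b = logistic_trajectory(0.49 + 1e-9)  # tiny difference in start point
for step in (0, 10, 25, 50):
    # the gap grows from ~1e-9 to order 1 as the steps accumulate
    print(step, abs(a[step] - b[step]))

In a system like that, tiny differences in initial conditions get
amplified without limit, which is why seeding several nearby starting
points, in the hope that they fall into the same attractor, is about
the best one can do.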
I also believe the entire process is extremely dangerous, just not as
dangerous as leaving the world in the charge of maniacs who think that
being a bully with an H-bomb and an army is a great idea. (Also, I don't
see any feasible way of avoiding the Singularity, so the best we can do
is try for the best outcome. Many others are more optimistic.)