From: Michael Wilson (firstname.lastname@example.org)
Date: Wed May 19 2004 - 12:44:33 MDT
>> any black-box emergent-complexity solution is to be avoided
> That's equivalent to saying never make an intelligent machine because
> you'll never understand a mind greater than your own.
No, it isn't, due to the magic of chunking. Once you've designed a
given operator, you can reduce it to its preconditions and postconditions.
You can then proceed to combine operators using formal structures that
produce compound preconditions and postconditions, including ones that
may have an arbitrarily complex internal structure when instantiated
(e.g. the state graph of a running search algorithm). It is perfectly
possible to produce a system too complex for you to understand in detail
without resorting to speculative trial and error whose consequences you
cannot scope. Positive safety requires that you
prove that every operator won't do something unpleasant (by checking that
the range of possible results falls entirely within acceptable bounds)
before you execute it.
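To make the chunking idea concrete, here is a minimal sketch (my own
illustration, not code from any real verification system; the names
`Operator` and `compose` are invented for the example). Each operator is
reduced to a precondition and a postcondition; composition yields a
compound operator whose guarantees follow from the parts without
inspecting their internals, and execution is refused unless the
precondition holds:

```python
# A toy model of operator chunking with positive safety checks.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Operator:
    pre: Callable[[int], bool]    # precondition on the input state
    post: Callable[[int], bool]   # bounds on the possible results
    run: Callable[[int], int]     # the operator itself

    def apply(self, state: int) -> int:
        # Positive safety: never execute unless the precondition holds,
        # and verify the result falls within the promised bounds.
        assert self.pre(state), "precondition violated"
        result = self.run(state)
        assert self.post(result), "postcondition violated"
        return result

def compose(a: "Operator", b: "Operator") -> "Operator":
    """Chunk two verified operators into one compound operator."""
    return Operator(
        pre=a.pre,
        post=b.post,
        run=lambda s: b.apply(a.apply(s)),
    )

# Two toy operators: double a non-negative number, then add one.
double = Operator(pre=lambda s: s >= 0,
                  post=lambda s: s >= 0 and s % 2 == 0,
                  run=lambda s: 2 * s)
add_one = Operator(pre=lambda s: s % 2 == 0,
                   post=lambda s: s % 2 == 1,
                   run=lambda s: s + 1)

compound = compose(double, add_one)
print(compound.apply(3))  # 7: odd, as the compound postcondition guarantees
```

The point of the sketch: you can reason about `compound` purely from its
compound pre/postconditions, even if its internal structure when
instantiated were arbitrarily complex.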
> It would only take a few seconds to write a computer program to find an
> even number greater than 4 that is not the sum of two primes and then
> stop, but what will the machine do, will it ever stop?
This is an example of the distinction between acceptable and unacceptable
uncertainty. We don't know if the program will terminate, but we know that
it won't suddenly start modelling the programmer or rewriting its own code
or converting the world into grey goo.
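The program in question (a Goldbach counterexample search) is easy to
transcribe; I've added an optional `limit` parameter, not in the original
description, so the sketch can be run to termination. Whether the
unbounded search halts is an open problem, but its possible behaviours
are fully scoped: it either returns a counterexample or keeps counting.

```python
# Search for an even number greater than 4 that is not the sum of two
# primes. Acceptable uncertainty: we don't know if the unbounded search
# terminates, but we know exactly what it can and cannot do.
from typing import Optional

def is_prime(n: int) -> bool:
    if n < 2:
        return False
    for d in range(2, int(n ** 0.5) + 1):
        if n % d == 0:
            return False
    return True

def is_goldbach(n: int) -> bool:
    """True if even n is expressible as the sum of two primes."""
    return any(is_prime(p) and is_prime(n - p) for p in range(2, n // 2 + 1))

def find_counterexample(limit: Optional[int] = None) -> Optional[int]:
    """Search even numbers > 4; `limit` is added here so the demo halts."""
    n = 6
    while limit is None or n <= limit:
        if not is_goldbach(n):
            return n
        n += 2
    return None

print(find_counterexample(limit=10_000))  # None: no counterexample found
```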
* Michael Wilson
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:46 MDT