From: Mike Dougherty (email@example.com)
Date: Sat Jan 14 2006 - 10:33:53 MST
desire - is often overloaded with emotional connotation, but if an AI were
developed to be driven by the acquisition of new information, then it could be
discussed as having a need/want/desire/compulsion/etc. to do so. In early
projects, it will probably be possible for the team of humans developing the
initial state to bias these directives. Once AIs start seeding their own
descendants, we may be examining the new creations with an approach similar
to that of anthropologists or sociologists.
interesting - there is a large volume of information/fact that can be
observed and compiled but has little use in advancing global thought. Much
of the minutiae becomes unimportant once a governing rule is discovered.
Sure, it would be "interesting" to note that for searching N items one
algorithm achieves O(n) while another achieves O(log n), and an AI that
understands the use of each is certainly interesting. Once those complexity
classes are discovered and understood, it would become "uninteresting" to
repeatedly test/prove the efficiency of one method or another using
increasing values of N.
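To make that concrete (a hedged sketch, not from the original message): counting comparisons for a hypothetical linear search vs. binary search over a sorted list shows the O(n)/O(log n) gap directly, and once the classes are understood, re-running the measurement at ever-larger N tells you nothing new.

```python
# Sketch: count comparisons made by linear vs. binary search over a
# sorted list. Linear search grows with n; binary search grows with
# log2(n). All names here are illustrative, not from the original email.

def linear_search(items, target):
    """Return (index, comparison_count); index is -1 if not found."""
    comparisons = 0
    for i, x in enumerate(items):
        comparisons += 1
        if x == target:
            return i, comparisons
    return -1, comparisons

def binary_search(items, target):
    """Binary search over a sorted list; returns (index, comparison_count)."""
    comparisons = 0
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        comparisons += 1          # count one probe per loop iteration
        if items[mid] == target:
            return mid, comparisons
        elif items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, comparisons

if __name__ == "__main__":
    for n in (1_000, 1_000_000):
        data = list(range(n))
        target = n - 1            # worst case for linear search
        _, lin = linear_search(data, target)
        _, b = binary_search(data, target)
        print(f"n={n}: linear={lin} comparisons, binary={b} comparisons")
```

Repeating this at larger and larger n only re-confirms the already-known growth rates, which is the sense in which such testing becomes "uninteresting."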
I was really just making an observation that torture does not have to be
chemical. I think HAL from the movie 2001 may have expressed greater
'concern' for his mental health (his ability to function within optimal
parameters) as Dave Bowman turned off the HAL-equivalent of "higher" brain
function and memory. Forcing a self-aware neural net to train on incorrect
assumptions and then allowing it to realize the fundamental error would be
equivalent to mental abuse.
On 1/13/06, Phillip Huggan <firstname.lastname@example.org> wrote:
> Please define your terms "uninteresting" and "desire" as they relate to a
> knowledge-seeking program. As I've admitted, if the program for some reason
> computes the need to physically engineer a mind from a substrate capable of
> housing a brain, then all bets are off.
> *Mike Dougherty <email@example.com>* wrote:
> You really feel that chemistry is the only form of torture? If AI is
> driven by the desire (or "principle") to acquire new information and
> meaningfully relate that information to its existing knowledge, then one
> form of "torture" might be to provide (for example) only reruns of 1970's TV
> sitcoms as input. Within a very short time frame, even a moderately
> sophisticated AI will realize the underlying fundamentals are formulaic and
> computationally uninteresting. At some point that AI may actually seek
> death if it is the only end available.
> On 1/13/06, Phillip Huggan <firstname.lastname@example.org > wrote:
> > *Arnt Richard Johansen < email@example.com>* wrote:
> > I don't think a mind can be tortured without an endocrine system. I
> > would say that as soon as we start to involve chemical reactions or whatever
> > in our mind architectures, then we have to be careful not to piss it off.
> > I'm sure we have at least a decade to figure these things out, probably much
> > much longer. No need to panic yet.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:55 MDT