From: Charles D Hixson (charleshixsn@earthlink.net)
Date: Fri May 05 2006 - 00:37:57 MDT
Phillip Huggan wrote:
> Okay, so the majority of people can't weight a decision tree properly.
> The majority of people aren't engineering AGI. Or is the point being
> made that an AGI entity would not fall into human traps if it attains
> administrative functions?
> ...
> *//*
I don't think that point was being made, and it shouldn't be.
Postulating the possibility of a General intelligence, I would presume
that any AGI would fall into its own logical traps, not human ones. I
may think that intelligent computers are near, but I don't believe that
a "General" intelligence is feasible. Rather, I expect it to be an
interaction of multiple specialized modules, and that EACH of these
modules will have its own weak points...not human ones. Presumably any
weaknesses that we have identified in ourselves will be fixed in the
high-level coordination module...but this probably means that there will
be new and different weaknesses. "The Deacon's Masterpiece," Holmes's
wonderful one-hoss shay built so that no part was weaker than any
other, is a work of fiction, not of practical engineering.