From: Robin Lee Powell (email@example.com)
Date: Fri Jun 09 2006 - 12:41:29 MDT
On Fri, Jun 09, 2006 at 02:22:15PM -0400, Martin Striz wrote:
> On 6/7/06, Robin Lee Powell <firstname.lastname@example.org> wrote:
> >> BTW, they will also probably be the most complex thing that
> >> they have to deal with and learn about. It is logically
> >> impossible to contain a complete internal model of yourself.
> >Why do people keep saying that?
> That's an interesting gimmick, but a quine has no internal model
> of itself. My point was that, just as we often can't predict our
> own future actions because we are oblivious to the substrate level
> action of our minds, an AI won't be able to simultaneously model
> its substrate-level activity, so there will be some lack of
> information, and some error.
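For readers who haven't seen the gimmick being referenced: a quine is a program whose only output is its own source code. A minimal sketch in Python (the specific formulation is illustrative, not from the original thread) shows how it works by string substitution rather than by holding any separate internal model of itself:

```python
# A quine: the only output is the program's own source text.
# The trick: s contains a template of the program, and %r inserts
# the repr of s (quotes and escapes included) back into that template.
s = 's = %r\nprint(s %% s)'
print(s % s)
```

Running it prints exactly the two lines above, which is why it gets cited in self-reference debates: the program reproduces itself completely without "modeling" itself in any deeper sense.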
The fact that *humans* can't do it proves nothing. We have very
small, fragile, and imperfect memories, for one thing.
--
http://www.digitalkingdom.org/~rlpowell/ *** http://www.lojban.org/
Reason #237 To Learn Lojban: "Homonyms: Their Grate!"
Proud Supporter of the Singularity Institute - http://singinst.org/
This archive was generated by hypermail 2.1.5 : Mon May 20 2013 - 04:01:04 MDT