From: ben goertzel (firstname.lastname@example.org)
Date: Fri Apr 19 2002 - 13:30:07 MDT
What Eliezer means, of course, is that if an AI is sophisticated enough to
say "I" and "understand" and "do" with an understanding of what these terms
mean, then it is sophisticated enough to understand the question "What is
your name?" at roughly the same level that a human does, even if its
response is "I don't have one."
Naturally, it WOULD be possible for an advanced AI to achieve a state of
advanced consciousness in which it felt it didn't truly understand
ANYTHING, and hence to respond to any and all questions with an answer like
"I don't understand" or "That question has not Buddha-nature" ;-> The
"correctness" of a conversational response is after all a social rather
than a mathematical issue.
From: Eliezer S. Yudkowsky [SMTP:email@example.com]
Sent: Friday, April 19, 2002 1:20 PM
Subject: Re: Four Years Later.
Mike & Donna Deering wrote:
> Interrupting the computer, "What was the girl's name?" he asks.
> "What is your name?" he asks.
> "I don't understand," the computer responds.
"What is your name?" is a question to which an AI will never correctly
answer "I don't understand." "Error: Question incomprehensible" maybe, but
not "I don't understand."
-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence
This archive was generated by hypermail 2.1.5 : Fri May 24 2013 - 04:00:21 MDT