From: Martin Striz (mstriz@gmail.com)
Date: Wed Jun 07 2006 - 14:32:13 MDT
On 6/6/06, Eliezer S. Yudkowsky <sentience@pobox.com> wrote:
> Did you read the book chapter?
Yes. I think that "coding an AI that wants to be friendly, one that
doesn't want to rewrite its code" is a semantic evasion. It shifts
the conversation from engineering to psychology, which is vaguer, so
the problem doesn't look as obvious. But psychology arises from the
substrate/code. When you reformulate the proposal in engineering
terms, the issue becomes obvious. What does "wanting" something mean
in engineering terms?
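To make the question concrete, here is a minimal sketch (my own
illustration, not anything from the book chapter) of the most common
engineering reading of "wanting": an explicit utility function over
outcomes, with the agent choosing whichever available action scores
highest. The function names, outcome labels, and numbers are all
hypothetical.

    # Minimal sketch: "wanting" as an explicit utility function.
    # All names and values here are hypothetical illustrations.

    def utility(outcome):
        # The agent's "wants" are just numbers assigned to outcomes.
        scores = {
            "cooperate_with_humans": 1.0,
            "rewrite_own_goal_code": -1.0,  # penalized, i.e. "not wanted"
        }
        return scores.get(outcome, 0.0)

    def choose_action(available_actions, predict_outcome):
        # "Wanting" in engineering terms: pick the action whose
        # predicted outcome maximizes the utility function.
        return max(available_actions,
                   key=lambda a: utility(predict_outcome(a)))

    if __name__ == "__main__":
        actions = ["cooperate_with_humans", "rewrite_own_goal_code"]
        best = choose_action(actions, predict_outcome=lambda a: a)
        print(best)  # -> cooperate_with_humans

On this reading, "not wanting to rewrite its code" is itself just
another clause in the code, which is exactly the point under dispute.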
Martin