From: fudley (fuddley@fastmail.fm)
Date: Thu Jun 03 2004 - 10:34:49 MDT
On Wed, 02 Jun 2004 "Eliezer Yudkowsky" <sentience@pobox.com> said:
>I have not the vaguest idea of what you mean by "sentience".
Shame on you, Eliezer; I believe this is a rare instance of you not being
entirely candid with us. Although, like everybody else, you have no
definition, I believe you do have an idea of what is meant by the word
“sentience”: it’s the thing you have and a rock (probably) does not. I
also don’t see the point of inventing jaw-breaking euphemisms like
“recursive self-improvement”, “optimization process”, or “autopotence”
when there are already perfectly good words to convey the concept. Clear
language promotes clear thought.
>I don't see how anyone can make this huge fuss over sentience
>in AGI when you don't know how it works and you can't give me
>a walkthrough of how it produces useful outputs.
I don’t think I need to explain how intelligence has produced useful
outputs, or that random mutation and natural selection have produced
sentience; I will leave it as an exercise for the reader to connect the
dots.
John K Clark