Re: Embodiment

From: Chris Capel (pdf23ds@gmail.com)
Date: Mon Oct 17 2005 - 14:40:50 MDT


On 10/17/05, Michael Wilson <mwdestinystar@yahoo.co.uk> wrote:
> Chris Capel wrote:
> > It's a position, though perhaps not an assumption, held by SIAI; and
> > as far as the views and research of the SIAI are one of the main
> > topics of this list, and this list is owned by a founding member of
> > that organization, the assumption can be said to be widespread within
> > the context of this list. But it's most certainly not widespread among
> > AI researchers as a whole, with yourself as a good example case.
>
> This issue is a complicated one; you can't divide the positions various
> people take into a dichotomy or even a single linear spectrum.
[snip]

Interesting points, relevant to the previous thread, though I'm not
qualified to reflect on them. But for the record, I was referring in
the quoted text to the

> assumption that "strictly humanoid intelligence would not likely be
> Friendly ...[etc.]"

which seems to be an issue apart from embodiment, and more amenable to
limited generalization. I'm a bit puzzled why you quoted me, but no
matter. My opinion on embodiment, which I stated in a previous mail
(that it's not "obviously necessary [for] a human equivalent or higher
intelligence"), is my own, and not very sophisticated.

Chris Capel

--
"What is it like to be a bat? What is it like to bat a bee? What is it
like to be a bee being batted? What is it like to be a batted bee?"
-- The Mind's I (Hofstadter, Dennett)

