From: Chris Capel (email@example.com)
Date: Thu Dec 01 2005 - 09:37:59 MST
On 12/1/05, keith <firstname.lastname@example.org> wrote:
> It might not be so easy to control a machine whose
> intelligence was based on language. As in the novel 1984,
> the meaning of words can change. Just look at the US.
> Perhaps language referring to objects and behaviors could be
> tied down. But all the really important language humans use
> refers to subjective feeling states, for which the machine
> might have no computational referents.
> If it has no aesthetics or goals of its 'own' other than
> the admonition 'be nice', it could only know it was being
> 'nice' by interrogating individual humans that have the
> subjective capacity, the 'feelings', to know what 'nice' is.
> If we actually discovered what 'pleasure' was and
> programmed the machine to take 'pleasure' in making humans
> 'feel nice', a friendly machine might engineer for us all a
> kind of heaven, I guess.
> But perhaps a better goal, rather than friendliness, might
> be to strive to make humans wise, even if most people
> 'prefer' their delusions, their gods, and their
> prejudices; even if they don't actually 'like' the truth
> or wisdom very much.
-- "What is it like to be a bat? What is it like to bat a bee? What is it like to be a bee being batted? What is it like to be a batted bee?" -- The Mind's I (Hofstadter, Dennett)
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:53 MDT