Does an AI need attention?

From: Gordon Worley (redbird@mac.com)
Date: Mon Dec 04 2006 - 09:47:30 MST


A recent post on Babel's Dawn [http://ebbolles.typepad.com/babels_dawn/2006/11/necessary_and_s.html] got me thinking: does an AI need attention?

Certainly any optimizing process has "attention" in some sense: when
you press down on the toaster, you tell it to pay attention to
putting electricity through high-resistance wires to generate heat
for a certain period of time. But I think most of us would see this
kind of attention as not quite the kind we're interested in since it
lacks flexibility, i.e. a toaster can never pay attention to anything
other than putting electricity through wires for a certain period of
time. If we want more flexibility, we need to design it into the
system.

For example, evolution "designed" humans with attentions flexible
enough to focus on things other than what they encountered in the
environment of evolutionary adaptation. I can focus on writing this
e-mail, even though no ancestor of mine past a few dozen generations
ever typed, wrote, or read. You might argue that this is too close
to speech, but in some sense that's my point. We are flexible enough
to focus on new things, but we typically focus on them in ways that
can be extrapolated from the ways we focus on the things we were
"designed" to focus on.

The most flexible attention would belong to a self-modifying
intelligence who could pay attention to its attention and change it.
Humans have some limited capacity to do this (think of the first time
you saw a "Where's Waldo" puzzle versus the 5th or 10th time, and how
you paid attention to your attention to improve your technique), but
they are ultimately limited by their brains' construction. An AI
could be programmed so that it could modify its attention, even to
the point of modifying itself not to need attention. So I ask again,
does an AI need attention?
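The two levels of flexibility above can be sketched in a few lines of code. This is purely my own toy illustration (the policy names and the "fixation" trigger are invented for the example), not a claim about how a real seed AI would be built: the point is only that if attention is ordinary, inspectable state, the agent can attend to it and replace it.

```python
def salience_by_size(stimuli):
    """Initial attention policy: attend to the largest stimulus."""
    return max(stimuli, key=lambda s: s["size"])

def salience_by_novelty(seen):
    """A replacement policy the agent might adopt: attend to the
    stimulus it has focused on least often so far."""
    def policy(stimuli):
        return min(stimuli, key=lambda s: seen.get(s["name"], 0))
    return policy

class Agent:
    def __init__(self):
        self.seen = {}                      # counts of past focus targets
        self.attention = salience_by_size   # attention is replaceable state

    def attend(self, stimuli):
        focus = self.attention(stimuli)
        self.seen[focus["name"]] = self.seen.get(focus["name"], 0) + 1
        return focus["name"]

    def reflect(self):
        """'Pay attention to its attention': if the current policy keeps
        fixating on one stimulus, swap in a different policy."""
        if self.seen and max(self.seen.values()) >= 3:
            self.attention = salience_by_novelty(self.seen)

stimuli = [{"name": "big", "size": 9}, {"name": "small", "size": 1}]
agent = Agent()
history = []
for _ in range(5):
    history.append(agent.attend(stimuli))
    agent.reflect()
# history → ["big", "big", "big", "small", "small"]
```

Note that nothing in `reflect` stops a further self-modification from removing the attention machinery altogether, which is exactly the question posed above.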

In the style of LOGI, I think our only choice is to design an AI with
attention and let it modify attention out of itself if that is a
better design. Our best model for how to construct a general
intelligence is the human brain, and humans have attention, so we
would do best to design our AI with attention, making sure it can get
rid of it if that turns out to be a bad design.

P.S. Skimming through LOGI, I find it never seems to deal directly
with the issue of AI attention; it just assumes it. Maybe asking "Does
an AI need attention?" could be an exercise for seed AI wannabes. Speaking
of which, maybe a list of exercises would be helpful to those folks
(which, you know, could be constructed in the copious free time of
Eliezer et al.).

-- -- -- -- -- -- -- -- -- -- -- -- -- --
                Gordon Worley
e-mail: redbird@mac.com PGP: 0xBBD3B003
   Web: http://homepage.mac.com/redbird/



This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:57 MDT