From: Eugen Leitl (eugen@leitl.org)
Date: Wed Jun 26 2002 - 10:23:17 MDT
On Wed, 26 Jun 2002, James Higgins wrote:
> Humans have a terribly hard time understanding each other, even when
> they are mostly thinking the same thing. Fully and exactly
> communicating the essence of an abstract concept to an intelligence
> which isn't even wired the same way will almost certainly be more
> difficult. If people
It is in principle possible to directly read out emotional state from
properly instrumented subjects (a CNS invaded with nanofilament
sensors). I.e., once you tell the system what the observables are, it
would be able to extract them with an acceptably low error margin. In
fact, if you just instrument the subjects you don't have to bug the
whole place, and the same infrastructure that evaluates emotional
response can also be used to control behaviour at the motor level, and
to adjust behaviour via tweaking of planning, controlled
hallucinations, selective inducement of agnosia, and the like. The
user experience would be completely transparent.
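
(The extraction step above is, at bottom, supervised classification:
label the observable, let the system learn to decode it from the
sensor channels. A toy Python sketch with synthetic stand-in data --
no such sensor API exists, so every name below is illustrative:

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Stand-in for per-subject readouts: n samples x 64 sensor channels,
# each labeled with a coarse emotional state (0 = neutral, 1 = aroused).
X = rng.normal(size=(1000, 64))
y = (X[:, :8].mean(axis=1) + 0.1 * rng.normal(size=1000) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# "Tell the system what the observables are" = supply labeled examples.
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Held-out accuracy stands in for the "acceptably low error margin".
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")

The hard part is of course the instrumentation, not the decoder.)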
If you don't like where this is headed, don't fret. We will make sure that
you'll love it. Because love rocks our world.
> can very rarely accomplish it between each other, the odds of
> accomplishing it with an AI are very, very low. Please note that
> appearances can be very deceiving. Even though two people may discuss
> something and agree that they are completely in sync on the meaning
> and details involved, they rarely if ever are.