From: Psy-Kosh (firstname.lastname@example.org)
Date: Sun Dec 14 2003 - 10:09:14 MST
> > I feel sorry for the woman of 2015.
> Why? Seems like a silly thing to worry about. Of course people
> end up with emotional bonds with artificial intelligences -- and I
> don't really see how that will be terribly awful. In the long run,
> all will be running on non-biological substrate, and I don't see
> discriminating in favor of biology is a good thing in the long
I don't think that is the concern. As I understand it, the concern is
that these are _not_ AIs, just, well, extra fancy Eliza proggies that
are specifically tuned to give strongly simulated emotional reactions.
And, analogous to how in the ancestral environment going after every
bit of sugar was a good thing, going after certain emotional reactions
is a good thing for us. But just as too much sugar is bad for us now
that we can get it in concentrated form, it may very well be that
receiving too much of certain emotional reactions, in far excess of
what actual humans produce, is psychologically unhealthy.
Again, as I understand it, these are _not_ AIs, just chatbots that
simulate emotional reactions.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:43 MDT