From: Mitchell Porter (you_have_some_sort_of_bug@yahoo.com)
Date: Thu Apr 19 2001 - 18:23:22 MDT
--- "Eliezer S. Yudkowsky" <sentience@pobox.com>
wrote:
> Welcome to my life; I have the impossible task of simultaneously
> convincing AI researchers that they have a professional obligation
> to be paranoid, convincing futurists not to worry about the wrong
> (anthropomorphic) malfunctions, and convincing the general populace
> that AIs can be made at least as trustworthy as humans or human
> organizations.
When I saw the article, my first thought was, "He's
never going to get any work done ever again." But
that's wrong: this is just one article, not a cover
story, and most of the people who read it will
"file-and-forget" it as part of the same constant
parade that includes stories of gene discoveries,
photos from Hubble, hacked websites... So I really
don't think you will be required to convince the
"general populace" of anything, except in the sense
that even more people who aren't AI researchers or
futurists will be contacting you, as individuals or
as interviewers.
It's quite likely that the story will lead to
*some* major change in SIAI's circumstances,
but I think that would be because someone who
can make a difference read it, not because it
will lead to fame in general (with all of its
consequences).
That's my attempt to predict the future, anyway.