From: Emil Gilliam (emil@emilgilliam.com)
Date: Sun Mar 20 2005 - 14:51:36 MST
http://events.stanford.edu/byOrganization/498/
This is an upcoming talk for the Stanford Transhumanist Association.
Note that it is strictly a non-technical, SL1 talk, but those of you in the
Bay Area might nonetheless find it interesting.
-----
Eliezer Yudkowsky on Predicting the Future
We are entering an era of accelerating technological change. Modern
advances in biotechnology, nanotechnology, and artificial intelligence
may soon revolutionize the structure of society and the nature of
experience. Forecasting the future has become increasingly important,
since taking corrective action only after cataclysmic shifts arrive may
prove futile.
The survival of human civilization may depend on our collective ability
to predict the forces of change, enabling us to turn our great potential
into a benevolent future while averting dystopian scenarios.
Eliezer Yudkowsky will speak on how popular culture can distort our
perceptions of the future, and how we should go about rethinking these
models and predictions.
Eliezer Yudkowsky is one of the foremost experts on the Singularity. He
has spoken to many audiences about the important mission of the
Singularity Institute for Artificial Intelligence, where he is a
director and researcher. For some popular and accessible introductions
to the Singularity and other subjects, please visit www.intelligence.org.
Refreshments provided. RSVP requested.
March 28, 2005. 7:30 PM.
Approximate duration: 1 hour.
Location:
Main Lounge, Florence Moore Hall.
Contact:
650-799-8127
asphodyn@stanford.edu