From: Martin Striz (metastriz@yahoo.com)
Date: Sun Jan 23 2005 - 16:30:05 MST
--- Harvey Newstrom <mail@HarveyNewstrom.com> wrote:
> As a security professional, this is my concern with most areas of
> transhumanist interest. Not just AI, but nanotech, biotech, robots,
> computers, mind chips, etc. Our entire culture is based on rushing a
> solution into place and not making sure that it is safe first.
There are plenty of people willing to protect us from ourselves when the time
comes.
> Many
> transhumanists even argue that the number of people dying today that
> might be saved with futuretech gives us the urgency to cut corners and
> skip safety.
This reminds me of arguments against the FDA, proclaiming that millions of
people are dying because drugs don't get to market fast enough. Then
something like Vioxx happens, or the antidepressant-suicide link, and you
realize that maybe we aren't moving slowly enough.
We are in a constant struggle to find just the right trade-offs.
Martin