From: Harvey Newstrom (mail@HarveyNewstrom.com)
Date: Sun Jan 23 2005 - 08:14:06 MST
On Jan 23, 2005, at 6:41 AM, Eliezer Yudkowsky wrote:
> Aside from that, I don't object to your statement of fact. You can
> indeed move faster the less you care about safety. We'll all die when
> you cross the finish line, but hey, you were first! Yay! That is how
> people think, and that is what makes the planet itself unsafe, at this
> point in time.
As a security professional, this is my concern with most areas of
transhumanist interest. Not just AI, but nanotech, biotech, robots,
computers, mind chips, etc. Our entire culture is based on rushing a
solution into place without first making sure it is safe. Many
transhumanists even argue that the number of people dying today who
might be saved by future technology gives us the urgency to cut corners
and skip safety. Most groups promote only the benefits of a technology
and ignore the threats.
I agree totally with Eliezer. The planet is unsafe. Most of our
developments are unsafe. I love technology and believe we can develop
it safely. But sadly, nobody is doing so.
-- Harvey Newstrom <HarveyNewstrom.com> CISSP, ISSAP, ISSMP, CISA, CISM, IAM, IBMCP, GSEC
This archive was generated by hypermail 2.1.5 : Wed Jun 19 2013 - 04:01:08 MDT