From: Phil Goetz (email@example.com)
Date: Fri Jul 22 2005 - 15:32:19 MDT
> I think you're the one who's missing the point, which is
> that as humans, we need to create a transhuman which is
> *actively* (Friendly). This actively benevolent transhuman
> will then be able to figure out how to deal with
> transhumans which are not
Why would this be a good thing?
I think it would be better to talk about how to create
transhumans who are better than us in the ways we
consider important. If we are talking on a moral basis,
this will entail creating transhumans who make better
moral decisions. Those transhumans will then be better
judges of what the moral thing to do is.
If they judge - as they very well might - that the moral
thing to do is to replace humans with transhumans,
aren't they more probably right than you are? And aren't
you acting immorally by trying to stop them?
Isn't this whole train of thought repugnant to someone
who really understands the promise of TRANShumanism to
TRANSCEND our human limitations, including moral
limitations? What you're really talking about is
putting a leash on transhumanity and halting evolution
right here, for your own selfish interests. A page
out of Leon Kass's book.
Perhaps the problem to be wary of is making people who
are smarter, stronger, faster, and more powerful than
ordinary humans, but who are really still just humans.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:51 MDT