Re: [sl4] A model of RSI

From: Matt Mahoney (matmahoney@yahoo.com)
Date: Fri Sep 26 2008 - 18:55:46 MDT


--- On Fri, 9/26/08, Eric Burton <brilanon@gmail.com> wrote:

> I'd like to clarify. I'm not saying we shouldn't be concerned about
> planet-level extinction events. But one thing we know about biological
> life is that once it comes into existence, it persists in some form
> for at least billions of years and, as far as present evidence shows,
> indefinitely.

It has happened at least once, so far.

> Perhaps technological progress is a similarly indomitable process,
> once bootstrapped. Then we could worry about it superseding or
> otherwise trampling us. But this doesn't seem like a more immediate
> risk to civilization than the crises we're exposed to by scientific
> backwardness. Machines are a long way from competing with us for
> space and resources, and that situation is generally improving. I'd
> argue that outbreaks of grey goo would be a sign of a civilization
> that has solved a great deal of its other problems!

I have a hard time picturing what disaster caused by a failure to achieve AI would be worse (by the standards of human ethics) than a gray goo accident killing all DNA-based life. Could you elaborate?

-- Matt Mahoney, matmahoney@yahoo.com