From: Алексей Турчин (firstname.lastname@example.org)
Date: Wed Sep 26 2007 - 13:45:53 MDT
We can't be sure what kind of value system a post-singularity civilization will have. We can't prove that every post-singularity civilization will care about us. So we should treat passive SETI as a dangerous activity - at least until we get a strong AI (though even it could be tricked by a much stronger AI).
From: "Vladimir Nesov" <email@example.com>
Date: Wed, 26 Sep 2007 23:15:20 +0400
Subject: Re: Is SETI dangerous?
> On 9/26/07, Aleksei Riikonen <firstname.lastname@example.org> wrote:
> > The post-AGI civilization very well might have a value system such
> > that it doesn't wish to kill living things (or even upload them
> > without telling them, though that might be ok with me personally),
> > when it costs so very little -- relatively speaking -- to leave them
> > alive, and only harvest the non-living matter found elsewhere.
> It doesn't cost little: any intelligent entity must be controlled, otherwise
> there is a risk of runaway replication. And the physical world doesn't seem
> to provide any way of safely controlling a material entity that is not
> essentially equivalent to uploading it to a limited substrate.
> Vladimir Nesov mailto:email@example.com
Visit my Live Journal at www.livejournal.com/users/turchin - and find out what I am thinking right now, and also what I wanted to tell you but didn't get the chance to :)
This archive was generated by hypermail 2.1.5 : Thu May 23 2013 - 04:01:31 MDT