From: Jeff Medina (analyticphilosophy@gmail.com)
Date: Fri Aug 26 2005 - 11:52:03 MDT
On 8/26/05, Robin Lee Powell <rlpowell@digitalkingdom.org> wrote:
> Something that I haven't really seen talked about here is that
> keeping a sentient being in a box isn't something that "just results
> in the singularity being delayed". It's massively immoral, as well.
>
> It's slavery *and* imprisonment. Both without proven cause; the AI
> is guilty until proven innocent.
If one has reason to believe a human might accidentally destroy the
world, it would be incredibly immoral *not* to quarantine that person
until there is evidence the threat is gone.
We already do this when the threat is much smaller -- for example, we
quarantine people with contagions that have no chance of destroying
the world, but merely (!) might cost hundreds or thousands of lives.
--
Jeff Medina
http://www.painfullyclear.com/

Community Director
Singularity Institute for Artificial Intelligence
http://www.intelligence.org/

Relationships & Community Fellow
Institute for Ethics & Emerging Technologies
http://www.ieet.org/

School of Philosophy, Birkbeck, University of London
http://www.bbk.ac.uk/phil/