Re: AI rights need to be implemented ASAP. But how?

From: H C
Date: Fri Jan 13 2006 - 18:54:05 MST

Of course there is reason to panic.

Hey, if you think advocating the rights of artificial minds is the way to go, talk to her and get something started.

This can only increase awareness of the Singularity.


>From: Phillip Huggan <>
>Subject: Re: AI rights need to be implemented ASAP. But how?
>Date: Fri, 13 Jan 2006 15:12:42 -0800 (PST)
>Why will *computer programs* above a complexity threshold necessarily have
>feelings? I don't believe electrons and silicon will do it for machines
>the way ions and carbon do it for us. It is not likely that any arbitrary
>computer substrate can give rise to minds. There is a fundamental lack of
>appreciation of the ways our brains differ from computer chips,
>legos, wooden abacuses, etc. Sure, if a program could find a blueprint for
>a mind and engineer it, then we would be talking turkey. But I'm sure our
>legislators have good reason not to believe mere computer codes can have
>feelings. I don't think a mind can be tortured without an endocrine system. I
>would say that as soon as we start to involve chemical reactions or
>whatever in our mind architectures, then we have to be careful not to piss
>it off. I'm sure we have at least a decade to figure these things out,
>probably much much longer. No need to panic yet.
>Arnt Richard Johansen <> wrote:
> First of all, it seems to be very likely that artificial minds above a
>certain level of complexity are going to have qualia. The thought
>experiment of replacing one neuron at a time ensures that it feels just as
>real to be a simulated brain as a biological brain. When it comes to
>artificial minds that are built on different systems than that of a
>biological neural network, the image is less certain, but it seems
>reasonable to suppose that qualia arise from simple properties that are
>common to most systems that we would describe as "thinking" instead of
>merely "calculating", rather than from a more complex set of properties that is
>unique to a neural network. At least the possibility of artificial minds
>with qualia cannot be ruled out!
>I think it is very important that artificial minds be given these rights
>as soon as possible. Preferably *before* the appearance of the first
>real-time human-level brain simulation, or any other human-level AI.
>How can we protect the earliest simulated minds against termination or
>torture? Legislation is an option, but it is hard to gain consensus to
>award rights to a class of entities that does not yet exist, and may not
>ever come to exist. Also, we have the major challenge of how to get SL<1
>legislators to understand that numbers inside a computer can have feelings.

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:55 MDT