Re: Rights for AIs are not a problem

From: Keith Henson (hkhenson@rogers.com)
Date: Mon Jan 16 2006 - 15:56:12 MST


At 11:05 AM 1/16/2006 -0600, you wrote:
>Keith Henson wrote:
>>Corporations already have legal rights, a company can own the hardware on
>>which the AI is implemented. Corporations can own stock, so a corporate
>>AI that owns a controlling block of stock in itself is a free agent with
>>effectively the same rights as meat persons.
>
>No, I don't think a computer hardware/software system can own shares in a
>corporation. Neither can other objects like a vase or rock.

You missed the point. A corporation can own stock in corporations, including itself. The corporation can own anything, including hardware.

>Generally only humans, and other business entities can legally own shares
>AFAIK.
>
>Also, I don't think a hardware/software system could legally be appointed
>into any corporate office, such as CEO. At best, you could appoint some
>humans into those positions whose employment contracts or perhaps the
>corporate bylaws or something state that they will only take actions
>suggested to them by the output of the hardware/software system. But if
>you're forced to arrange things like this, then why bother with all the
>corporate hassle when in the end you are just property with no legal
>rights, and relying on a group of people to do what you ask - might as
>well skip the corporation and just hook up with a single human owner and
>be their personal property.

The point of the post I was responding to was how to provide rights for
AIs. This is a route, a "legal fiction" if you will, that could be
exploited to generate "legal persons" who would have rights within the
scope of existing legal frameworks.

If Eliezer is right about the speed with which a seed AI goes way beyond
humans, the problem will be humans keeping any rights at all rather than
figuring out how AIs of human level or above should be treated at law.

_Pirates_, p. 135:

"Yes, Admiral," said Cantrell. "It's time to put the wagons in a circle."

The computer saluted and turned itself off.

"A question, please," Vong said. "I gather from talking with Captain Tower
that the admiral-Corporate Susan, if you prefer-was responsible for the
operation freeing the Madame G. Y. Fox. Despite its tactical brilliance,
the maneuver has put you into a serious bind with Japan, and yet you have
not taken any punitive action. Why?"

"What can you do to a computer?" Cantrell asked. He took a sip of coffee.
"Besides, I don't have anyone else for the job." He finished his coffee and
put the cup down on the napkin. "Maybe Corporate Susan will learn from the
mistake-which wasn't your run-of-the-mill thumb-fingered idiocy, by any
means. I guess it comes down to loyalty. I pick the best people I can find
and back them."

Keith Henson


