From: Johnicholas Hines (johnicholas.hines@gmail.com)
Date: Mon Feb 02 2009 - 09:45:46 MST
On Mon, Feb 2, 2009 at 10:47 AM, Stuart Armstrong
<dragondreaming@googlemail.com> wrote:
> I'd actually have to disagree with that. This sounds like the old "AI
> can't leave a box" argument, but with humans. The low ground is not
> much of an advantage if the high ground is much smarter (plus, the
> high ground might actually get physical, and build robotic bodies for
> its disembodied minds).
Second, surely in a conflict between one entity (A) and another
entity (B), there are various resources that would tip the conflict
in one direction or the other. Note that I'm being deliberately
ambiguous as to whether the entities are humans, emulated humans,
computer programs, or organizations.
Intelligence and speed of thought are certainly resources that would
give one side an advantage over the other. Holding the "low ground"
would also confer an advantage.
> The risk of slavery to uploads comes from other uploads, not from
> standard humans.
As far as I can tell, it's entirely possible that we could have a
future society with a vast number of emulated humans and systematic
discrimination against them. The day-to-day work of enforcing that
society's rules might be done by police, just as it is now. Police
could have "low ground" advantages such as the power to switch
emulations off, pervasive surveillance, and pervasive censorship.
Even if the individuals directly enforcing society's (discriminatory)
rules are themselves emulated, that doesn't mean the society as a
whole is unstable.
Revolutions might occur, certainly. But there is no natural law
saying that truth, justice, and happiness will prevail. We need to
work toward those things continuously. We cannot assume that they
will come along with new technology.
Johnicholas
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:03 MDT