Re: [sl4] End to violence and government [Was: Signaling after a singularity]

From: Bryan Bishop (kanzure@gmail.com)
Date: Mon Jun 30 2008 - 11:47:20 MDT


On Mon, Jun 30, 2008 at 12:01 PM, Stuart Armstrong
<dragondreaming@googlemail.com> wrote:
>> Is that what this was getting at? I've lost track of how this 'small
>> cost' is done away with by a government -- you can just as easily
>> consider the ai or the government as a damaging entity who does you
>> harm in one way or another.
>
> The cost (small and large) is done away with by the government because the
> government offers the possibility of retaliation on anyone threatening
> or using violence - so they don't do it. And of course the AI or

You're just replacing one sharp stick with another.

> government can become a highly damaging entity; that's why the present
> model is to have a liberal democracy with strongish property rights
> and individual rights. This seems to be the best model so far in

Most of the ai scenarios I remember reading in the literature -- the
ones that sounded like ai domination takeover scenarios -- involved
computer hacking and the 'illegal' downloading of information, and yet
here the ai is supposed to obey copyrights and property rights and
individual rights and whatever else? That's a peculiar contradiction.
I'm just pointing it out; it's not the big issue at stake here.

I don't see how "property rights" has any hold over a sharp stick.
"Property rights" is a social construct that bacteria, for instance,
do not need to uphold, simply because property rights belong to human
society and are based on human thought. The same goes for ai, and in
fact for humans too -- which is why we are able to have thieves, and
the modern pirates of the black market of financial transactions
circulating the globe (or at least other parts of the globe; I haven't
seen signs of them hanging out in North America much).

So the model is wrong. Now, allow me to go strawman for a few seconds.
I suppose you will follow up with an argument about the social
pressures that could be induced, either as they are in modern society,
by making the cost of getting caught too high, or via ai know-it-all
scenarios. Increasing the cost of thievery is only a selective
pressure conditioned on catching thieves; it doesn't solve the
fundamental problem -- that people have stuff, and they'd really
prefer to keep it -- it just makes sure that the thieves who remain
get wickedly good at avoiding security. As for the ai know-it-all
scenarios, where the ai keeps everyone from thieving because
everybody's desires are known, you're assuming that the sharp stick
(the ai) can make arrangements for the extraordinary demands that some
might make. In the alternative, those people would be left on their
own to make it happen, perhaps with the support of some fellow ai
systems that they instantiate or come across, perhaps not. And the ai
of course couldn't "give the world on a silver platter" to somebody,
since that would obviously conflict with all of the other wonderful
things that the ai domination game supposedly involves. Etc. The model
is wrong -- it doesn't actually address the physical systems and the
possibilities that we find in existence, and it's a royal pain. Let's
fix this.
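
To put the selective-pressure point in toy numbers (numbers I'm making
up for illustration, not anyone's data): say a theft nets $1,000, the
penalty when caught is $10,000, and an average thief gets caught 20%
of the time. The expected value is 0.8 * 1,000 - 0.2 * 10,000 =
-1,200, so the average thief quits. But a thief who invests in
avoiding security and drops the catch rate to 5% sees 0.95 * 1,000 -
0.05 * 10,000 = 450, which is positive. Raising the cost of getting
caught changes which thieves survive, not whether stuff gets stolen.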

> practice, for enforcing contracts, maintaining law, and causing the
> minimum of suffering to the population. There might be superior models
> after a singularity, but I still don't see how you can do without the
> basic outline of a government.

I am now considering that I might not have been as elaborate as I
should have been in the past, so I'm going to walk out on another few
limbs, point out what a singularity means and why governments,
especially nation-states, wouldn't be the only solutions, and babble a
lot :-). We're not going all anissimovic here; a singularity is, in
general, where exponential growth in one way or another allows the
explosion of physically implemented artificial intelligence. So what
does that mean? You have lots of hardware, lots of software, lots of
interfaces, and all sorts of interesting tools that you can now play
with.

There are many functions that a government serves, many of which could
already be replaced with the technological advances of the past
hundred years but haven't quite made it all the way back to the
beginning. In the case of a singularity, the amount of information and
new technology that could be used to update the system becomes truly
amazing. I'm not saying the amazement factor is what necessitates
reconsidering the government, although it certainly helps on the
emotional scale of things. Rationally, anything more efficient at
getting the same goals done is well worth consideration. So what about
all of these technologies that let us communicate and diffuse
information now? And what about all of the tools and technologies that
let us manage our own lives to an even greater extent?

Suppose, for instance, that we are worried about health care. Heh,
government-provided health care? When talking about a singularity?
Hell, just build a robot-scientist/doctor and load it up with ai;
what's the problem? The government isn't needed any more in that case:
you have a robotic doctor, a walking encyclopedia of knowledge, tools,
and methodology, one that doesn't need to be paid. And what about all
of the need for protection and security? Many possibilities have been
developed over the past century. Because of advancements in
manufacturing, it would be possible for everyone to have personal
automatic immune systems on a larger scale, perhaps for their entire
(maybe mobile) homes. And so on, and so forth.

Massive amounts of information and programs can be compressed down
into small computers that support an individual with the thousands of
years of civilization's development, and suddenly we have people
walking around who are more informed, more up to date, and more
capable than the government. Should they happen to need space or land
in which to live, for some random reason, why would it have to go
through the government? Just hook up to the local astrophysical DNS,
go find some unused materials, and start processing them (I'll sketch
what I mean by that registry at the end of this ramble). That person
has successfully escaped whoever on the planet's surface was
threatening them, and the government wasn't needed.

Truthfully, everyone knows when they are feeling pain, and I suspect
an interesting strategy to try out would be letting people manage
their own pain intake. If you're in pain, the hospitals already give
you access to a morphine pump; you just press the little button. No,
I'm not talking about letting everyone get high on drugs. I'm talking
about letting them manage their bodies *as they already do* so that
they can get their work done, whatever it is that they want to do. I
see little reason for a government strategy here, since everyone is
already minimizing their own suffering. And what happens when there
are disputes? In the case of social disputes, there's little stopping
anyone from just spawning a new society, just as they already do. This
would become especially easy with the vast amounts of ai floating
around, the knowledge databases, the flat-out knowhow. There are
already some projects interested in establishing this functionality,
like some of the space pod projects.
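
Since "astrophysical DNS" is pure hand-waving on my part, here is a
minimal toy sketch (Python) of the kind of thing I have in mind: a
shared, governmentless registry mapping coordinates to claims on raw
material. Every name and structure in it is invented for illustration;
nothing like this exists as a real protocol.

from dataclasses import dataclass
from typing import Dict, List, Optional, Tuple

Coords = Tuple[float, float, float]

@dataclass
class ResourceRecord:
    coords: Coords
    claimant: Optional[str] = None  # None means the material is unclaimed

class AstrophysicalDNS:
    """Hypothetical claim registry: announce material, look it up, claim it."""

    def __init__(self) -> None:
        self._records: Dict[Coords, ResourceRecord] = {}

    def announce(self, coords: Coords) -> None:
        # A survey probe reports raw material it has found.
        self._records.setdefault(coords, ResourceRecord(coords))

    def find_unused(self) -> List[ResourceRecord]:
        # Everything nobody has claimed yet.
        return [r for r in self._records.values() if r.claimant is None]

    def claim(self, coords: Coords, who: str) -> bool:
        rec = self._records.get(coords)
        if rec is None or rec.claimant is not None:
            return False  # unknown or already claimed
        rec.claimant = who
        return True

# Usage: hook up, find some unused materials, start processing them.
dns = AstrophysicalDNS()
dns.announce((4.2, 0.0, -1.7))
unused = dns.find_unused()
if unused:
    dns.claim(unused[0].coords, "somebody-escaping-the-planet")

The point isn't the code; it's that "who may process this matter" can
be a lookup in a shared database rather than a function of a
nation-state.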

Hrm. I have dumped a lot of random crap on you, but I hope it will
help. Please digest and tell me more about sharp stick theory. :-)

- Bryan
