From: James Higgins (email@example.com)
Date: Mon Aug 13 2001 - 23:21:17 MDT
At 09:14 PM 8/13/2001 -0600, you wrote:
>Concerning the question of trade post-singularity, Christian L. scoffed:
> > If a human made a great painting, you could always tell the Sysop to
> > replicate one for you.
>For free? No royalties? You could also tell it to steal it. The question
>is whether it would do what you ask. (Napster has a lot to answer for!)
When we get to this stage, information will most likely be free. It has
already been discussed that privacy will most likely not be an option
post-Singularity. Privacy is simply the protection of information about
you, and if that is not feasible, I don't think it will be feasible to
protect intellectual property either.
> > Security? The primary goal of the Sysop is security. What use would a
> > bodyguard have if no one can harm you?
> 1) The Sysop will certainly provide basic security services to prevent
>major harms. Whether it will provide all possible security services, and
>whether it will agree with you as to what constitutes a harm are separate
>questions. People hire nontech security now not only to backstop the
>police, but also to do things the police won't, like put their bodies
>between paparazzi cameras and famous faces.
>2) Conspicuous consumption. (Or do you think that come the revolution we
>will outlaw conspicuous consumption?)
I thought this was the whole argument for why we need a Sysop. If the
thing can't protect you from the petty things a bodyguard can, then how
can it protect us from all the other things it is supposed to? It is
supposed to do your bidding, so it would have to honor YOUR version of what
constitutes harm to you.
> >If you want to learn something, who
> > would be the best teacher? A human or a Superintelligence?
>This may be the heart of our disagreement. I envisage a future with many
>"Superintelligences", an entire ecology, not just us humans and single
>parent Sysop. I assume specialization will triumph over one omnicompetent
>supercitizen. Perhaps we could design a world where each person only
>interacted with the Sysop, who recreated an entire artificial world for each
>of us designed to make us happy, but do we want to? I think one part of
>Eli's design of friendliness, with an accent on preserving citizen autonomy,
>is to design an AI that would reject that type of Sysop role. I expect the
>Sysop, and friendly major powers generally, to leave to lesser powers and
>humans (which may become lesser powers eventually for all we know) those
>tasks that they can handle.
Knowledge, like virtually all information, will almost certainly be
free. Thus it should be relatively easy to learn anything you wish to
devote sufficient time to.
> >And frankly, if
> > you are so extremely shy that you need to PAY people to talk to you, the
> > Sysop can kindly ask your permission to rewire some basic social skills
> > into you.
>Considering the billions people currently pay for conversation and
>associated activities, from therapy and sexwork to bars and bowling leagues,
>I suspect that your vision of the future is not that everyone develops
>social skills, but that everyone gives up on them and just lives in a
>solipsistic world with the Sysop.
It may not be his "vision" of the future, but it is a real
possibility. Have you ever heard of the rat experiment in which an
electrode was wired directly into the rat's pleasure center and connected
to a button? If I remember correctly, the rat died because it never
stopped pressing the button, even to eat.
Having a Sysop that will provide you with a VR world of your choosing is
pretty damn close to just that. Plus it will protect you from any harm and
limit personal interactions to only those you choose to participate
in. I'm afraid most of the human race may very well vanish into their own
private worlds.
> >What do you mean by trading space? How many cubic
> > light-years do you need? For what?
>I just need a location close to my friends and far away from people who are
>rude or aggravating. Funny how that always seems to cost something. And if
>you think some people will not try to grab prime real estate, such as the
>sphere around any star out to the orbit of the third or fourth planet, you
>have never met a real estate developer.
"Real Estate" will for the most part be virtual. It won't matter who you
are adjacent to in any way. If everything is computronium, the only thing
that matters is how long it takes to get signals to and from your chosen
associates' locations. Besides, as I understand it, the Sysop will be
responsible for allocating resources (matter). Thus a "real estate
developer" could only amass matter if they could get people to give it to
them. And as matter will be the all important resource, I doubt anyone
intelligent would seriously be willing to part with any of theirs.
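To put a rough number on the "signal time" point above, here is a small back-of-envelope sketch (my own illustration, not from the original post) of round-trip light-speed latency between locations, assuming signals in a computronium substrate cannot beat the speed of light; the example distances (a planet diameter, one AU) are arbitrary choices:

```python
# Round-trip light-speed latency between two locations, assuming
# signals are limited to c. Illustrative only; distances are
# arbitrary examples, not claims from the original discussion.

C = 299_792_458  # speed of light in vacuum, m/s

def round_trip_latency_s(distance_m: float) -> float:
    """Round-trip signal delay in seconds for a given separation."""
    return 2 * distance_m / C

# Across a planet-sized structure (~12,742 km, Earth's diameter):
planet = round_trip_latency_s(12_742_000)   # ~0.085 s
# Across one astronomical unit (~1.496e11 m):
au = round_trip_latency_s(1.496e11)         # ~998 s, about 16.6 minutes

print(f"planet-diameter round trip: {planet:.3f} s")
print(f"1 AU round trip: {au:.0f} s")
```

Even across interplanetary distances the delay is minutes, not the days of physical travel, which is why virtual adjacency matters far more than physical adjacency in this picture.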
> >Why would there be a shortage of space?
>Why are land values high in Las Vegas, when there is empty desert all around
>it? Why do people pay for domain names, when there are so many unclaimed?
> > The only truly valuable things post-singularity are probably matter and
> > energy, or only energy if you consider E=mc^2,
Knowledge and information will also be valuable, but they will be free,
much as Linux is valuable yet free.