Re: [SL4] Employment vs. Singularity

From: Samantha Atkins (samantha@objectent.com)
Date: Tue Aug 22 2000 - 09:13:17 MDT


Brian Atkins wrote:
>
> Samantha Atkins wrote:

> I would say it is unlikely in my opinion that some kind of huge level of
> unemployment, high enough to significantly slow down the Singularity, is
> going to happen in the next 10 years. Say 5% chance IMO. Past 2010 my
> estimate would increase, but I can't give you a percentage. But even at
> 5%, that is worth chatting about.
>

Fair enough.
> >
> > > I'm waiting for your ideas...
> > >
> >
> > Why? So you can rip them apart?  I would first like to know that you see
>
> Do you have something against peer review? :-)
>

No.  But I do have some sensitivity to times when I don't feel I'm
being heard or fully responded to before the objections start.  In
these last few messages, though, I am no longer as worried that that
is the case here.

> > the possible problem.  Post singularity I can see where a free in the
> > sense of beer economy for most of today's commodities could work just
> > fine.  Pre-singularity I do not see so clearly at all.  Frankly I think
> > we need to start retraining ourselves that massive consumerism and ever
> > escalating economic competition is not the way to get to a more
> > harmonious and ultimately richer future.  But I don't know yet how that
>
> Why not? It has led us this far.
>

Why not?  Do you actually think that competition for the sake of
competition (which made some sense in a more scarcity-based world),
especially when severely augmented, will actually lead to relatively
harmonious conditions?  Do you think that existing power centers
(governments of all sizes and shapes, corporations, wealthy groups and
individuals) should grab up augmentation tools and higher technologies
in order to become ever more powerful and eventually to make their power
incontestable?  Do you think the current practices of corporate
espionage, strategic lawsuits, patenting of everything imaginable, lack
of consumer responsibility, adherence only or primarily to the current
bottom line, lack of environmental or extra-company consciousness, and so
on will lead more toward peace and a good life for most people as the
technology increases the ability to exercise these characteristics,
particularly if the technology is largely owned by this class of
players?  Do you think this sort of setup is conducive to the full
benefits of nanotech and AI being achieved?   I don't see how.
  
> > comes out to individual, group and national decisions and policies.  I
> > think we need to evolve solutions rather than be in a great hurry to
> > simply jump on one from the past or a barely cobbled together modern
>
> Ever heard the expression "don't fix what isn't broken" ? But I digress,
> I do want to at least discuss the possibilities.
>

For those who refuse to look, it ain't broken.  Go live on the streets
for a while and then tell me that all the people there are simply
worthless bums and that the system is just fine.  As good as many things
are in the current system, there are also problems.  Some of them are
systemic.  Pouring ever more energy into a system with such systemic
flaws without addressing them will result in breakdown.

> > one.  The first step is simply noting where the current system is likely
> > to have serious deficits that we would very much like to improve upon
> > without losing benefits of the current system as much as possible.   I
> > certainly agree this will be a neat trick.  I am not enough of an
> > economic theorist to put out what I see a reasonable whole solution.  I
> > am not sure anyone can.
>
> Have you read the writings of Robin Hanson. He is an economist who has
> studied (as much as anyone at this point I think) the possible effects of
> nanotech and other advanced tech on society. Nanotech and AI being your
> bogeymen, I think you should check this out:
>

Nanotech and AI are NOT bogeymen to me.  Failure to consider the shape
of the world and the shape we would want it to be, as we gain more power
to form it, is the bogeyman.   Nanotech and AI are a major (to say the
least) infusion of energy and power and knowledge into the system.  To
suppose that the system we have, which largely evolved to fit existing
conditions, will remain appropriate when those conditions change utterly,
when the very context that makes the system work is fast disappearing,
is folly.

Thanks for the links!  I will check some of them out before I post again
on this subject.

> Reading his stuff, it appears that first off: nanotech without strong AI
> leads to a gradual improvement of manufacturing, but not some kind of
> instant overnight "everything is free, and we can do anything" scenario.
> It still will take a lot of brainpower to design these nano things.
> Secondly, if AI research does make progress and some commercial products
> start making inroads, his research shows that it actually will increase
> wages and jobs up to a certain point (where we get strong AI/Singularity).
>

You cannot have nanotech on any major scale without enabling strong AI.
It is doubtful whether you can get full nanotech without strong AI.  They
are fairly intertwined.  It does take a lot of brainpower.  Enough that
the need is another factor driving us fast toward strong AI.  Because it
takes more than brainpower and a few CAD tools.  It takes truly
intelligent decision making at speeds and quantities not achievable by
anything less than strong AI.


> > In the world posited the nanotech can quite easily whip up an airplane
> > for this fellow at little/no cost.  Material things simply need not have
> > any associated price tag once the technology is sufficiently advanced
> > and the proper design patterns for the things desired are known and
> > their development costs amortized.  This is an important point.  So your
>
> Well I think there is more to it than that. The guy has to pay at a minimum
> for the material the plane is made of, and the energy required to create it.
> Plus delivery costs to wherever he lives, insurance most likely, fuel costs,
> maintenance, hangar space, training lessons, etc. Flying a plane will still
> not be anywhere near free. Let's face it that nanotech alone is not going
> to make for a perfect world.
>

If the plane is made of common elements freely available (a diamondoid
plane would be really cool) then there is no real material cost.
Depending on the type of plane and its energy needs you may have a point
on the cost of fuel, although even conventional fuels (and we could do a
lot better assuming nanotech) would be a great deal cheaper to produce
than today.  Delivery costs?  Depends on whether it is built on site or
not.  Insurance?  Against what?  The chances of unrecoverable injury are
a lot lower with full nano-medicine.  Maintenance?  Self-maintaining.
Hangar space?  Depending on the level of nano-magic it might quietly
disassemble itself between uses.   Training you probably have a point on,
unless by that time we can download many types of instruction fairly
directly, except when we just think it is more fun to do it the
"old-fashioned way".

I never said it would make a perfect world, just that we could get an
awful lot closer to one with these kinds of resources available IF we
envision what kind of world we want to produce and use these increasing
abilities to move in that direction.  Personally, if the only choice were
a hyper-powered version of the same stuff we have today, I wouldn't be
very up for this at all.  I could even see it as a curse.



> > dreams and enjoyment are only limited by your imagination, creativity,
> > dedication and that of any others you can interest in your venture.  It
>
> This doesn't sound any different than what we have today. Nanotech is not
> going to make it much cheaper for the average Internet startup as far as
> I can see. It isn't going to make it cheaper for them to advertise or create
> a brand, or to pay their programmers and other human capital, etc. etc.
>

What makes you think you would need to do a conventional internet
startup to put together a team of people interested in an idea and in
making it real and useful to a bunch of people?  Who the heck needs to
create a brand?  That is behavior needed when you are trying to beat out
other companies by forming consumer loyalty.  It is a mug's game.  The
only loyalty I want is one based on merit of the ideas and their
implementation.  Who needs to pay programmers in a world where you don't
have to be paid to have your needs and many of your wants met?  Thinking
of "human capital" or any need to think of people as "capital" is one of
the things I believe we can do better than. 

We are already seeing the beginning of this in some of the open source
projects.

> > is like pure capitalism in that people voluntarily form associations
> > that they perceive to be in their interests.  It is just that the
> > interests don't have anything really to do with money necessarily or
> > with "making a living".  They have to do with the joy of creation and of
> > exploring and of building teams and such.  Values that some who are
> > already independently wealthy know about today.
>
> I just find the whole "everyone stops working because of nanotech" scenario
> unbelievable. So what if nanotech can whip me up basic goods- it isn't going
> to help when I want to buy some land for a house, or have custom designs
> for said house, or provide the cash I need for going to a sporting event
> or other live event, etc. etc. Sorry, but in a world of nanotech you will
> still be working 9 to 5.
>

They don't stop working.  They stop working "for a living" and start
working to make the things that truly move them real.  Who says the
sporting or live event will cost any cash?  Maybe the players do it
simply because they love it.  You can work nine to five "for a living"
if you want to.  I will not.  I will work far more than that but I will
work for joy of what I am doing.  I've done more than my share of the
other kind.  It sucks. 

>
> Whatever.. but I think I have pointed out well enough that nanotech alone
> is not going to provide a very big chunk of your dream world. Because you
> need the human creativity to really do most of the work in the new economy.
> Therefore you really need strong AI to give you the freedom you want. Agree?
>

No.  It is a great deal more than just "whatever".  That is pretty
dismissive of the points I raised.  Nanotech leads straight to AI and
vice versa.  It isn't very reasonable, IMO, to fission them apart that
much.

You need human creativity.  Sure!  I agree!  You need it far too much to
squander it in competitive systems like we do today.  Systems where the
wheel is reinvented over and over again because each competitive group
wants to "own" as much of what it depends on as possible, and because
there are large legal, economic and secrecy hurdles to reusing what has
already been done elsewhere.  This has especially been the case in
software.  After 20 years in the business I think I ought to know.
Software talent is much too rare, too needed and too critical to waste
in such pointless duplication, or by tying the algorithms and techniques
produced to single companies for decades at a time with patents.  If we
don't go to a fully open-source model in software soon, which means in
effect that we largely stop putting a price on the software, then I do
not believe the software - which is much further behind in the race to
the Singularity than hardware - will ever come together sufficiently to
power Singularity-level change.

Another manifestation of this issue is the question of whether the flow
of information is rich enough and fast enough, with as little friction as
possible, for the thought and planning and creativity leading to the
Singularity to occur.  We are seeing battles there because the old models
of owning and controlling information are not terribly compatible with
modern means of storing and moving and sharing information, or with the
modern need and appetite for information and for the ability to use,
modify and transform it freely.  And information is fast becoming the
center, the real wealth, of our economy.  But you say we don't need
change and that the old models are still good enough?  How?

I, personally, would like to have enough of my living costs taken care
of to devote full-time to writing software to more directly fuel the
singularity. But, like most of us, I am caught up "making a living" and
there aren't that many ways to do that and be writing AI code or even
tools to rocket-assist software development itself.  I have managed to
position myself to where I design and lead efforts to produce software
that I think at least cleans up some areas of the software mess and I
get paid pretty well to do so.  But these specialties of mine aren't as
much in the area I think is most important to get us most quickly to
singularity-tracked software. 


> >
> > If we head for Singularity full-bore and gather various powers and
> > abilities along the way and do not change some of our habitual ways of
> > thinking, then I am afraid we are headed for wars and conflicts that
> > will make everything that has come before look tame in comparison.  If
>
> Exactly- which is why we hope to complete our AI before nanotech or
> uploading become available! It's just too dangerous to give SL1 humans
> access to such technologies.
>

Hey!  Something I pretty well agree with you on.  But I don't think you
can get to that kind of AI without largely freeing information and
software from current economic constraints.

Hmmm.  So this response of yours makes it clear that you do see things
wrong with the current system and the attitudes present.  I'm a little
confused.  I'm not sure what you think is and isn't "broken".
 
> > we can't persuade a critical mass that the results of Singularity and
> > the steps getting there are to their and their children's benefit then
> > we can expect huge pushback to the point of armed conflict.  If the same
> > provincial and hyper-competitive attitudes go forward we can expect use
> > of these technological advances for forms of espionage, warfare and
> > control of other people quite a bit more dire than what we have already
> > seen.  So no, I don't always think that just heading for the Singularity
> > fullbore alone will save us.  On the other claw, it often appears to me
> > that our ability to reason and work through such conundrums is much too
> > limited without massive augmentation of our abilities which are part of
> > what leads us full-bore to Singularity.  One can only hope we grow our
> > hearts and consciousness at a sufficient speed not to abuse our
> > increasingly powerful minds (natural and artificial) and the other
> > technologies coming on line too catastrophically.
>
> Well it sounds like you might agree with us after all- doesn't it make
> the most sense to just sneak up on people with a full Singularity before
> they realize it, and before nanotech and uploading? After the Singularity,
> assuming all goes well, all of these issues will be moot and everyone
> can choose to live how they like (again I suggest reading Diaspora).

Well, I have mixed feelings.  I certainly deeply understand the
impulse.  But it is an impulse to play God.  Not that there is anything
necessarily so wrong with that that we shouldn't.  Someone has to take
the next step - especially one this HUGE.  But the responsibility is
awesome indeed.  And if we misstep, the consequences are dire almost
beyond imagining.  I'm certainly not saying we shouldn't go for it.  But
we should think long and hard about how we want to proceed, how best to
get there with the least pain, and how to exercise what small amount of
control we have to ensure (to the extent we can) the truly fabulous
results we hope for.  I think we can make it to the unbounded future.
But putting our heads down and charging forward determinedly is not
sufficient by itself.

- samantha



