Re: [SL4] Employment vs. Singularity

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Mon Aug 21 2000 - 00:41:14 MDT


Samantha wrote:
>
> If, just as a for-instance I don't know how to get to, every person
> had a decent living wage and all the physical necessities taken care of
> (a la cheap nanotech matter assemblers, for instance), then every person
> could be quite busily employed - doing exactly what they themselves find
> most meaningful and interesting regardless of whether they are getting a
> conventional paycheck for it.  Personally I would have a hell of a lot
> more to do than watching TV if I no longer had to work for someone else
> for a living.  I have more things to do and explore and work on than I
> could finish in a hundred lifetimes of such 'pointless empty leisure'.

How about if everyone who wished were uploaded and given a decent amount
of computing power - at least a billion trillion times human brainpower, say?

This scenario is basically the minimum SingInst is aiming for.  With
luck, we won't see any major effects on the economy from any prehuman
AIs that reach the market before this point.  (With less luck, there are
a few economic tweaks that could stabilize things; but ideally that whole
scenario can be avoided.)

You seem to be visualizing a slow, gradual, planetwide scenario.  The
Slow Singularity model has all kinds of potential for ultratech
disaster, which is one reason why the SingInst model consists of one AI
in a research lab reaching the point of true self-enhancement and taking
off from there.  The marketable AIs before that point will hopefully be
too dumb to have any major effect on the economy.

About the friendly AI thing:  It's a complex issue; we think we know how
to handle it; no, it doesn't involve Asimov Laws or gratitude to creators
or anything else anthropomorphic; yes, given everything we know, it
should work; no, success isn't certain; yes, we're pretty sure the AI
scenario offers the best available probability of survival for the human
race.
--
        sentience@pobox.com    Eliezer S. Yudkowsky
               http://intelligence.org/home.html

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:35 MDT