Re: How Kurzweil lost the Singularity

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Wed Jun 19 2002 - 14:57:37 MDT


Ray Kurzweil wrote:
>
> I think that Eliezer is misunderstanding my statements, intentions, and
> efforts.

If so, I do apologize. Thank you for taking the time to respond.

> First of all, he states "Kurzweil's entire *being* is directed
> toward predicting the Singularity - *not* nudging the Singularity in any
> direction." The fact is that the bulk of my efforts are involved in
> technology creation efforts, with only a portion devoted to talking and
> writing about technology. I do believe that as technologists, we have an
> ethical responsibility to apply our efforts in ways that will promote
> positive human values, albeit that we don't always have a consensus on what
> those are. Most of my efforts have been devoted to developing technology
> for persons with disabilities, and towards enhancing human expression in
> areas such as music, and I do give a high priority to considering the impact
> that technologies I'm involved in creating will have on society.

I understand and acknowledge that the majority of your efforts are directed
toward creating technology, rather than talking and writing about
technology. I am not accusing you of being a talker rather than a doer. I
do feel, however, that the Singularity is specifically the creation of
transhuman intelligence, and that not all technological progress contributes
equally to this. There are many (valid!) motives for contributing to
general technological progress, but the Singularity should be more than a
means of rationalizing whatever we were already doing.

One of the most commonly voiced objections to the Singularity is that it is
simply an atheistic religion - a way to replace the spiritual comfort lost
by the rational contradiction of earlier beliefs. I believe that if the
Singularity is treated as passive justification for existing plans, this
objection will become substantially correct.

There are thousands or millions of organizations that contribute to general
technological progress. Someone fortunate enough to be aware of the
Singularity at this point in history has the opportunity to directly
participate in technologies that lie on the critical path to the
Singularity, such as brain-computer interfaces and Artificial Intelligence.
I often receive letters from young adults asking me how they can select a
college major or profession in order to contribute more to the Singularity.
I don't tell them to go on with whatever they were doing and that it will
probably contribute to the Singularity eventually in one way or another. I
advise them that the professions most directly involved in the Singularity
will probably be cognitive science and computer programming. My hope is that
this makes it more likely that Singularity-related projects will have eager
young geniuses available - such being a critical resource in science.

There are many efforts which unintentionally contribute to the Singularity,
but I believe that efforts which have been deliberately directed at the
Singularity - planned from the beginning with that sole goal in mind - will
prove beneficial, critical, and necessary to it. I believe
that along with the parts of the Singularity that depend on the efforts of
thousands of people and the expenditure of billions of dollars - such as
Moore's Law - there will also be the opportunity for critical discoveries
and inventions produced by small groups, and even breakthroughs (most
probably in the cognitive sciences) that are the product of individual
genius. I believe that by directing more resources at these leverage
points, where the path to the Singularity depends critically on specific
scientific or technological issues rather than on the whole world economy,
it is possible to accelerate the Singularity.

I am not denying that you have done a great deal to advance technology and
that some of this will indirectly contribute to the Singularity. But I am
concerned that your view of the Singularity allows no role for intentional
efforts to bring about the Singularity sooner - tying it solely to vast,
inexorable forces such as Moore's Law, which would be difficult or
impossible to accelerate without a planetary commitment of resources.
History, especially scientific history, is not always made by the horde.
The role of individuals in scientific history is often exaggerated, but it is
certainly no exaggeration to say that a stroke of genius can accelerate
progress by years or decades. I believe that by encouraging individuals to
direct their efforts specifically toward the Singularity, it may be possible
to place on a firm basis scientific projects that might otherwise suffer
from poverty of resources or poverty of genius at a critical moment in history.

Let's suppose that someone proposes a project to construct a map of all
known brain areas and pathways, which map will contain links to all online
scientific literature which deals with that area or pathway. I would
recognize this project as critical to Artificial Intelligence, and you would
recognize it as critical to brain emulation. Therefore this project is not
just "scientifically exciting" but is *critical* for the whole human
species, and I would recommend funding it at the same level of resources and
attention currently given to popular environmentalist efforts or
disease cures, and for the same humanistic justification.

But currently projects lying on a direct path to the Singularity are *not*
universally recognized as critical to humanity's future. Currently projects
like these are not glamorous enough to be guaranteed plentiful funding
and many eager young researchers hoping to take part. We who are fortunate
enough to be aware of the future must therefore continue to spread awareness
of the Singularity *as a justification for action*, and direct our own
resources - whether time, money, or a scientific or entrepreneurial lifetime
- at those projects which are currently funded below the level that the
Singularity justifies. We must use our awareness of the future to nudge
humanity closer to the rational distribution of efforts. By trying to place
what we recognize as Singularity-critical projects on a firm basis *because
of* their connection to the Singularity, the overall course of scientific
progress toward the Singularity may be genuinely and significantly accelerated.

It is in this way that the Singularity meme helps to bring about the
*actual* Singularity, answering the objections of those who claim that the
Singularity is a vague, passive religious belief. The Singularity is a
humanistic goal which translates into a concrete research direction.

> I am familiar with Eliezer's efforts at defining and articulating ways that
> we can promote what he calls "friendly AI," and I applaud his concern and
> efforts in this direction. By itself, I don't believe that such efforts are
> sufficient, and Eliezer would probably agree with this. I don't think that
> we have enough knowledge today to define a reliable strategy for assuring
> that AI (or other advanced technologies) will remain "friendly," but the
> dialogue on how to achieve this is certainly worthwhile and not premature.
> It's an effort we will need to maintain and intensify, particularly as we
> get closer. I have said many times that these technologies are advancing on
> many fronts, and I believe that a critical aspect of assuring that these
> future technologies are helpful rather than harmful is that everyone
> consider and apply ethical issues in every project and in every decision.
> There's no one "magic bullet" strategy that is going to assure that we avoid
> catastrophic downside scenarios. I do agree, however, that it is not too
> early to define these downsides and to develop multiple strategies towards
> this end.

I believe that our civilization can pass through the Singularity
successfully; furthermore, that we can do so safely and smoothly rather than
in a state of last-minute panic - but only if considerable resources are
invested in safe passage *substantially in advance* of when it becomes
immediately necessary! And this will not happen if everyone who hears about
it thinks "Oh, someone else will do it," because right now someone else is
*not* doing it. I am not just talking about the Singularity Institute; the
Foresight Institute, for example, continues to be severely underfunded even
as our civilization hurtles headlong toward nanotechnology.

> So in summary I believe that Eliezer's efforts in this direction are
> important and worthwhile. However, he is not correct that I am unconcerned
> with this critical issue. I've said on many occasions that it's the number
> one challenge facing our civilization in the 21st century.

One of the most profound statements I have ever encountered was an anonymous
quotation on Slashdot: "Beware 'we should...', extend a hand to 'how do
I...'" Since then I have never encountered the statement 'we should'
without thinking of this quote. The Singularity is the number one challenge
facing our civilization; with this I agree. Is it the number one challenge
facing you personally? I do not accuse you of being unconcerned, but how
does your concern change your actions from what they were before?

To put it bluntly, you have enormously more resources at your disposal with
which to effect a Singularity, but it appears to many of us in the
Singularity community that those resources are going unused. It's
frustrating! I understand that your resources are absolutely your own, to
dispose of as you wish, but I want to understand *why* you dispose of them
as you do. How can you declare that the Singularity is the meaning of life
and yet not change anything? If brain-computer interfaces are the critical
path to the Singularity, then why aren't you, say, investing in Neural
Signals Inc., which *right now* is struggling for venture capital - rather
than doubling their number of neural taps every 18 months, as they should
be? If nanotechnological safety is important, then why not fund the
Foresight Institute, which *right now* is sorely underfunded? If Friendly
AI is important, why not fund the Singularity Institute?

I understand your confidence in Moore's Law, which has hundreds of billions
of dollars of momentum behind it, but this spear has no spearhead unless
humanity can scrape up the tiny fraction of its resources needed to run the
one last mile that *does* depend on deliberate effort rather than the
planetary economy. Neural Signals Inc. would not exist without previous
research in MEMS and biotechnology and cognitive science, but you still need
a Neural Signals Inc. AI research benefits tremendously from Moore's Law,
but you still need a Singularity Institute.

I think some Singularity activists - I can't speak for all of them, but it
isn't just me - are puzzled, and a bit frustrated, because someone of your
stature finally "gets" the Singularity, and yet you seem content to
contribute in ways that are only peripherally related. If you go out of
your way at the Foresight Gathering to praise the researcher assembling a
neurocomputational map of cortical processing, why not toss a few bucks his
way? I understand that your life is your own. But *why*?

-- 
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence

