From: Stephen Reed (reed@cyc.com)
Date: Tue May 21 2002 - 17:36:09 MDT
On Tue, 21 May 2002, Christian L. wrote:
> My gut feeling is that the military would
> consider research aimed at the Singularity as something potentially very
> *harmful* to the nation. If the research is successful, it would mean the
> END of the government and the military. My feeling is also that there
> are a great many narrow-minded people in the government/military
> sector, who perhaps are afraid of such a massive upheaval.
>
> Don't you think that the military would use their funds to support AI
> intended for warfare instead of, say, Eliezer's Friendly AI project?
As I understand Friendly AI, especially regarding "unity of will" between
the FAI and its programmers/knowledge engineers, the military would
create an FAI along the lines Eliezer proposes. You want an AGI on your
side, the *right* side, and it needs to know why, in a very deep sense.
For example, Darpa and other government funding agencies are seeking ways
to improve our intelligence gathering: the Information Awareness project
will sort through massive amounts of data and identify those nuggets of
information which, when known, could forestall an adversary's attack.
Good intelligence is the least expensive defense. Think about how an FAI
could assist our Intelligence Community, where the scarcest resource is
the skilled human analyst.
> Have you yourself talked to people at DARPA (or other military programs)
> about the Singularity? If so, what was their reaction?
Mainly that we are crackpots.
Darpa (formerly ARPA) has funded AI and nanotech from the start, and
senior project managers there don't expect to see real AI in their
careers at Darpa.
But don't take that the wrong way. In fact it is to be expected and
easily understood from the perspective of history. Think about the Wright
brothers showing the US Army their Flyer. Think about Gatling showing
the Army his machine gun in the Civil War. But give the military a
concrete indication of what can happen with funding in the next
budget/deployment schedule and much can happen fast. It took Einstein to
get the government moving on the A-bomb after there was proof of the
uranium chain reaction... billions were then spent from precious war
resources.
I expect that this forum, and others attracted to this vocation in the
near future, will create software providing glimpses of AGI behavior,
perhaps on vast clusters of existing hardware (my ambition), or on
supercomputers.
You can simulate the hardware environment of the future through a massive
concentration of computing power today. Each 10x growth in your cluster
is roughly equivalent to a single machine 4.5 years in the future,
according to Moore's law. My guess is that about 40,000 AMD 64-bit
computers (on the market next year) could be enough to deliver the
computing power that a single computer might have in 2023. Suppose that
someone's AGI software next year could harness the power of that many
AMD-64 computers and show the government evidence that the Singularity is
beginning to happen. Then the funding would follow and the proper
research protocols would be put in place - in my opinion.
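To make the arithmetic concrete, here is a minimal Python sketch of the
back-of-the-envelope calculation above. The 4.5-years-per-10x rule and
the 40,000-node figure come from the preceding paragraph; the 2003 start
year is an assumption of mine (the AMD 64-bit parts being "on the market
next year").

    import math

    YEARS_PER_10X = 4.5   # Moore's-law rule of thumb used above
    START_YEAR = 2003     # assumed: the AMD-64 nodes arrive next year
    NODES = 40000         # size of the hypothetical cluster

    # Factors of ten the cluster provides over a single node, and how
    # far ahead in time that places one equivalent machine.
    tens = math.log10(NODES)             # ~4.6
    years_ahead = tens * YEARS_PER_10X   # ~20.7

    print("%d nodes ~ 10^%.1f times one node" % (NODES, tens))
    print("single machine circa %d" % int(START_YEAR + years_ahead))
    # -> single machine circa 2023, matching the rough figure above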
The key with government funding, and to a lesser degree commercial
funding - and the reason why I mention Cycorp's sponsors as an example to
others here - is that you need not claim AGI as your ultimate goal. You
do not even have to claim that you are taking steps toward AGI in the
future. You need only claim that the tools you have now, or can extend
with a sponsor's funding, are useful and unique, with AI qualities that
set them apart from all other software systems. Cycorp claims only to
have the world's largest commonsense knowledge base and a useful
reference ontology. These attributes set us apart from all other software
systems now, and are useful to our government sponsors. Our business
model is to accept government funding for interesting projects that
advance our AI goals as a *side effect*. You get your first government
contract on a great presentation, but you get subsequent contracts only
by satisfying the managers of your previous projects.
When there is strong evidence that the Singularity is possible, we can
then propose projects and seek sponsors to achieve that goal in a public
(non-classified), regulated fashion - again, just my opinion.
--
===========================================================
Stephen L. Reed                  phone: 512.342.4036
Cycorp, Suite 100                fax:   512.342.4040
3721 Executive Center Drive      email: reed@cyc.com
Austin, TX 78731                 web:   http://www.cyc.com
         download OpenCyc at http://www.opencyc.org
===========================================================