Re: A Hard Take Off.

From: Spudboy100@aol.com
Date: Fri May 10 2002 - 19:21:34 MDT


In a message dated 5/10/2002 11:46:38 AM Eastern Daylight Time,
deering9@mchsi.com writes:

> Joe: "Common resources?"
> Melissa: "There are lots of common resources. Public spaces, energy,
> computational resources. The general situation is, three days ago a super
> intelligent computer system can into existence. It built the structure and
> some other infrastructure inside the Earth and Moon. If a human's
> intelligence is 100 and mine is 641 FAI's is 3 X 10 ^ 43. FAI's
> capabilities are not unlimited but are quite significant.

This brings up a query of mine which bears on the arrival of a singularity.
Has the Foresight Institute, or any other business or educational
organization (technological forecasters?), ever undertaken a Delphi poll on
when a singularity may arrive?

A Delphi poll, for the mildly interested, is the practice of asking a
question, frequently in both essay and conventional poll-taker format,
regarding the likelihood, and an expected date, of the arrival of a
technology or event. In the first round, the interviewees are asked in
private what their opinions are and, if necessary, to elaborate with a
short essay on why they hold a certain view. In the second round, they are
asked the same question, but the members of the group are able to read
each other's answers. Supposedly, this is a useful means of technological
forecasting.
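
For concreteness, here is a toy sketch of how the two rounds might be
aggregated. The experts, their forecast years, and the choice of the median
as the summary statistic are all hypothetical placeholders, not data from
any actual poll:

    # Toy two-round Delphi aggregation (purely hypothetical numbers,
    # not results from any real survey).
    from statistics import median

    # Round one: each expert answers in private.
    round_one = {"physicist": 2045, "chemist": 2070,
                 "nanotech engineer": 2035, "programmer": 2029}

    # Round two: the same question, after everyone has read the
    # others' answers and essays; estimates tend to converge.
    round_two = {"physicist": 2040, "chemist": 2055,
                 "nanotech engineer": 2038, "programmer": 2035}

    for label, answers in (("round 1", round_one), ("round 2", round_two)):
        years = sorted(answers.values())
        print(label, "median:", median(years),
              "spread:", years[-1] - years[0])

A narrowing spread between rounds is the signal the method looks for; the
final-round median serves as the group forecast.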

I wonder when a take-off, hard or smooth, is really due? Who would be the
experts or cohorts for such an estimation? Would physicists, chemists, and
engineers who work in nanotech or materials science be the people to ask?
Would people on this list be the primary interviewees, since most here are
involved as programmer/analysts and computer scientists?

(The scenario quoted above is a very entertaining conjecture, worthy of
Drexler.)

Moreover, what is the practical computational limit for matter we know
exists, rather than matter that merely likely exists
(neutronium/computronium/unobtainium)? Is it reasonable to expect a
hyperintelligence to be so many multiples of the unit of a human
intelligence? I wonder whether intelligence, like the speed of light, has
an embedded limit. Does "smartness" cut off at 4 times human intelligence,
or 4 trillion times, or does it grow without bound, or asymptotically
approach some ceiling?
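
One textbook benchmark for matter we know exists is Bremermann's limit,
roughly m*c^2/h bits per second for a mass m. The sketch below simply
evaluates that formula for one kilogram; it makes no claim about what any
actual intelligence could do with that budget:

    # Bremermann's limit: an upper bound on the computation rate of
    # ordinary matter, m * c^2 / h bits per second for mass m.
    c = 2.998e8      # speed of light, m/s
    h = 6.626e-34    # Planck's constant, J*s

    def bremermann_bits_per_second(mass_kg):
        """Upper bound on bits processed per second by mass_kg of matter."""
        return mass_kg * c ** 2 / h

    # About 1.36e50 bits per second for a single kilogram: enormous,
    # but finite, so known matter does have an embedded ceiling.
    print(f"{bremermann_bits_per_second(1.0):.3e} bits/s per kg")

So at least for matter we can point to, "smartness" cannot exponentiate
forever; it runs into a hard, if astronomically distant, physical bound.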


