[sl4] Re: Unlikely singularity?

From: Aleksei Riikonen (aleksei@iki.fi)
Date: Thu Aug 07 2008 - 02:02:44 MDT


On Thu, Aug 7, 2008 at 9:45 AM, Stuart Armstrong
<dragondreaming@googlemail.com> wrote:
> Unlike most people on this list, I don't think that the singularity is
> likely to happen, at least not anytime soon (in a nutshell, the reason
> is that I think that diminishing returns, and poorly phrased
> questions, will beat exponential feedback).
>
> However, I'm on this list because if a singularity does happen, it
> will be (of course) the most important transformation imaginable. So
> the importance of getting involved is still very high, even if the
> singularity is unlikely.
>
> I was just wondering if there were other people like me on the list,
> and if this is a good audience to target. Basically by definition,
> this list self-selects those who believe a singularity is likely. I
> think we critically lack well-informed people (I don't include myself
> in that list) who see a singularity as plausible but unlikely. If
> there were some of these, we could better test the strength of the
> singularity arguments against real opponents (and not against those
> who just wave their hands and say "that'll never happen!").

Your question would benefit from an explicit definition of what
exactly you mean by "singularity".

One definition would be nothing fancier than the creation of
human-surpassing AI. To me, that seems rather inevitable at *some*
point (assuming a technologically advancing civilization continues to
exist). It also seems that once we have only mildly human-surpassing
AIs, we will rather quickly have greatly human-surpassing AIs: if not
by other means, then by adding huge amounts of computing power and
thereby increasing the AI's thinking speed by an astronomical factor.
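To make "astronomical factor" a bit more concrete, here is a crude
back-of-the-envelope sketch; the firing-rate and clock-speed figures
are only illustrative assumptions, not settled numbers:

    # All numbers are order-of-magnitude assumptions, not measurements.
    neuron_firing_rate_hz = 2e2       # assume neurons signal at ~200 Hz
    silicon_switching_rate_hz = 2e9   # assume hardware runs at ~2 GHz

    # If an emulated mind's basic operations ran at hardware speed
    # instead of biological speed, the serial speedup would be roughly:
    speedup = silicon_switching_rate_hz / neuron_firing_rate_hz
    print("Hypothetical serial speedup: about %.0e x" % speedup)
    # -> about 1e+07 x, i.e. a subjective year per few seconds of
    #    real time, under these assumed rates

The exact figures don't matter much; any plausible choice still gives
a factor large enough to deserve the word "astronomical".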

Personally, I don't make the claim that we'll have human-level AI
soon. And I *hope* it is still a long way off. If I somehow became
convinced that a thousand years of slow progress would be required
before we got there, I would be near-ecstatic. This is because things
could be done a lot more safely if we had more time before someone
else did it anyway. ("It" being the creation of entities
astronomically more powerful than us, possibly ones that aren't under
our control and whose goals differ significantly from ours.)

What makes the singularity scenario worth a lot of attention is that
we can't rule out its happening within a couple of decades. It's not
necessary to claim that it's likely to happen soon, even though there
may be grounds for such a stronger claim. (If I were to start
presenting grounds for such a claim, I would start by guesstimating
when we are likely to be able to emulate -- and subsequently modify --
the human brain, and thereby human intelligence, on a computer.)
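Purely to illustrate the kind of guesstimate I mean (the compute
requirement for brain emulation, the currently available compute, and
the doubling time below are all assumptions I'm plugging in, and
reasonable people pick very different numbers):

    import math

    # Toy projection: when does affordable hardware reach an assumed
    # compute requirement for brain emulation? All inputs are assumed.
    brain_emulation_flops = 1e18   # assumed requirement; estimates
                                   # span many orders of magnitude
    flops_available_2008 = 1e13    # assumed compute a well-funded
                                   # project could use today
    doubling_time_years = 1.5      # assumed Moore's-law-style doubling

    doublings = math.log(brain_emulation_flops / flops_available_2008, 2)
    years = doublings * doubling_time_years
    print("Roughly %.0f doublings, i.e. around year %.0f"
          % (doublings, 2008 + years))
    # Changing any assumed input by a factor of 10 shifts the answer
    # by several years, which is why this kind of exercise supports
    # "can't rule it out within decades" rather than a firm date.

The point of such a sketch is not the particular year it spits out,
but that no defensible choice of inputs pushes the answer safely into
the distant future.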

-- 
Aleksei Riikonen - http://www.iki.fi/aleksei

