From: Mark Walker (tap@cgocable.net)
Date: Thu Aug 02 2001 - 02:25:46 MDT
----- Original Message -----
From: Amara D. Angelica <amara@kurzweilai.net>
> Is there a consensus on this list that the Singularity is good and
> should be accelerated?
If I understand Brian's post, he seems to be in agreement with others that the logically prior question here is "which Singularity?" Is it the case that all our future histories end up at THE Singularity, or is it possible that there might be different ways of effecting A Singularity? Having answered this, we can then go on to ask about the timing of a (or the) Singularity. I take it that the position of the Singularity Institute is that it is possible to affect which Singularity we pass through; otherwise, why all the fuss about Friendly AI? Certainly, in the absence of definitive knowledge that we cannot affect our future history, this seems like the correct default assumption. (If we struggle to affect a certain Singularity, and in fact there is only one form of the Singularity, then at worst our efforts will have been in vain. If we do not struggle to affect a certain Singularity, and in fact there are multiple Singularities we might pass through, some better and some worse, then we may be culpable for the worst form of negligence in human history.)
My own view, then, is that we ought to accelerate friendly Singularities and attempt to impede or deflect unfriendly or afriendly ones in favour of the former.
(I wonder if some people's thinking is misled by the choice of article here. Generally, would it be better to speak of 'a Singularity' rather than 'the Singularity', so that this question is not prejudged? Although I must admit, 'A Singularity Institute' does not quite have the same ring to it.)

Mark