From: Ben Goertzel (ben@goertzel.org)
Date: Thu Sep 19 2002 - 21:19:15 MDT
> On Tuesday, September 17, 2002, at 12:10 PM, Ben Goertzel wrote:
>
> >>>> If you want the Singularity to happen you
> >>>> should be doing what you can to work towards it.
> >>>
> >>> I personally AM, but I don't believe this is the optimal path for
> >>> everyone...
> >>
> >> If a person wants to see the Singularity happen and is capable of
> >> contributing to its creation, under what conditions would working
> >> towards the Singularity not be an optimal path?
> >
> > If that person was capable of contributing *much more strongly* to other
> > valuable things like helping humans concretely in the present world.
>
> This is going to have to be a huge amount of help. See the WTA
> Deathwatch for the numbers. For example, if that person's help would
> have made the Singularity occur one week earlier, then by choosing not
> to work on the Singularity, I think it would be fair to expect him to
> see to it that as many people as would have died in that week still
> make it to see the Singularity.
There are a hell of a lot of assumptions built into the reasoning you're
presenting.
For example, you're assuming that bringing the Singularity about earlier is
automatically better. (If you're not quite assuming something this extreme,
you seem to be assuming something close to it.)
Samantha Atkins & Vernor Vinge are two intelligent, Singularity-savvy
individuals who have argued that an earlier Singularity may not necessarily
be better. Perhaps, they've suggested, the Singularity is more likely to
come off well for humans if we have longer to prepare ourselves for it.
Samantha has been quite explicit in presenting her reasoning in favor of
this point.
My point of view is actually not that far off from yours. I think a
Singularity in 20 years is probably better than one in 50-100 years, because
a) I'm fairly pessimistic about the ability of humans to become highly
ethically advanced without some kind of serious hardware/wetware
modification...
b) I'm quite worried about some particularly ethically un-advanced humans
inadvertently using advanced technology to destroy the human race (while
trying to destroy only the parts of the human race they don't like...)
But anyway, if one believes that a more ethically, spiritually and/or
cognitively advanced humanity is more likely to lead to a positive
Singularity, then one is well justified in spending one's time helping
humanity to achieve those qualities.
The use of quantitative estimates of "number of people saved from dying" by
certain actions is heuristically interesting, but should be taken with many
lumps of salt, because there are a LOT of major uncertainties involved in
all prognostications about the Singularity and the path leading up to it.
-- Ben G