From: Samantha Atkins (email@example.com)
Date: Thu Sep 19 2002 - 13:48:05 MDT
Ben Goertzel wrote:
> "Technology is the only way to overcome it" would be an overstatement.
> However, it does seem pretty clear to me that
> a) technology is on a path of exponential acceleration
> b) spiritual advancement of humanity is not
> Could b) undergo some kind of threshold effect that would allow it to
> overtake a) ?
> Sure, it's not impossible. But unfortunately, I see no reason to believe
> this, other than wishful thinking.
I don't think it will *overtake* (a), but I think any significant
increase in (b) could make our chances of surviving (a) much
better. The question then becomes: how much is "significant"?
> The "hundredth monkey" effect posited by New Agers for some time, has yet to
> be observed en masse..
Dunno. There have been large spiritual actions centered around
Gandhis and Kings of the world. There are apparent exponential
growths in certain kinds of memesets (not all of them by any
stretch "good") at times.
> I agree that it would be BETTER if humanity drastically improved itself
> mentally, emotionally and spiritually before launching the Singularity. I'm
> just doubting that's how it's gonna go down...
I think there are important connections between what type of
Singularity we kick off and how it is perceived and what we do
pre-Singularity to adjust our minds to even non-Singularity
technology and abundance. In a world playing the same scarcity
games with increasing technology you get faster and faster
versions of the game with greater amplitude which is quite
dangerous. You get increasing fear of becoming obsolete,
discarded and unable to fend in the face of technological
advance. The Singularity then could become the ultimate fear.
Why should the people believe that asymptote of technological
progress would change the trend they have struggled with?
I also think that we need to learn to think in a much more
"holistic" or integral manner with much more real care for the
maximization of all individual potentials before we get to
Singularity. Again, I don't see how just getting to the Singularity
makes this more likely to occur if the groundwork has not been
laid. Sure the Singularity could simply establish this, plus or
minus "Transition Guides", but we need as much of this as we can
possibly get beforehand if we are to survive to reach Singularity.
I am probably expressing this inadequately. But I believe that
we sometimes make the mistake of overemphasizing the intellect
side of SAI and underemphasizing its "heart" - the deep
appreciation, caring for, compassion for, and nurturing of all. A
god-sized being without that would be extremely problematic.
Our conceptions of SAI are in danger of being unbalanced in much
the same ways that we ourselves lack balance.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:41 MDT