From: Samantha Atkins (samantha@objectent.com)
Date: Fri Jan 02 2004 - 02:25:40 MST
On Wed, 31 Dec 2003 06:46:11 -0800 (PST)
Tommy McCabe <rocketjet314@yahoo.com> wrote:
> I have several objections to the points raised in this
> essay. You say that the Singularity is a bad term for
> these reasons:
>
> "It fails, IMO, to address the important point: the
> coming change, whatever you call it, will not be
> created by technology in any respect (technological
> growth, new technologies, what have you). It will
> happen if, and only if, greater than human
> intelligence becomes part of humanity's reality. Then,
> and only then, will we truly have reached a point
> where things have fundamentally changed because then
> we're dealing with minds creating technology that are
> fundamentally alien to us, or at least better. At that
> point, things have truly changed."
>
It doesn't even necessarily require strong AI. It could be achieved (more slowly) by a reasonable chunk of increasing human knowledge and intelligence, augmented by technology, being turned back on the problem of further augmenting human intelligence and decision making. Greater-than-human intelligence could be achieved holistically through IA over time. At some point we would not be in Kansas anymore.
-s
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:43 MDT