RE: How hard a Singularity?

From: Ben Goertzel (ben@goertzel.org)
Date: Sat Jun 22 2002 - 16:00:58 MDT


> Oh come on and take a stand :-) If you were betting money you would bet
> on AI before perfect anti-aging/anti-disease/cryonics + personal
> spacecraft, right?

Yeah, sure...

I think cryonics could possibly be cracked before AI if $100 billion were
put into each.

But cryonics research is harder to do on a shoestring budget, so many fewer
people are working on it -- even fewer than on general intelligence, sadly!

On the other hand, with virtually unlimited funding, I think AI would come
about before robust anti-aging/anti-disease.

I don't know much about the technological challenges involved in making
affordable personal spacecraft. I'll buy one once it becomes available
though ;>

> > I have thought about this very seriously and I think that
> superhuman AI is a
> > MORE risky path than human uploads. There are a lot more unknowns with
> > superhuman AI; we are dealing with a different sort of embodiment AND a
> > different sort of mind all at once.
>
> A human upload that modifies its mind into transhumanity has both of the
> same problems.

Agreed, of course... but in Eugen's future, there could be laws preventing
uploads from transcending.

Yucky to think about!

ben
