RE: How hard a Singularity?

From: Ben Goertzel (ben@goertzel.org)
Date: Sun Jun 23 2002 - 12:36:08 MDT


On the other hand, perhaps the delayed Singularity will result in a
superhuman AI that cares about humans and thus, once it invents a time
machine in 2123, bothers to go back in time and resurrect all those dead
humans....

I do not advocate intentionally delaying the Singularity, though -- UNLESS,
as the time nears, a bad outcome seems more likely based on evidence we
don't have now.

-- ben g

> -----Original Message-----
> From: owner-sl4@sysopmind.com [mailto:owner-sl4@sysopmind.com]On Behalf
> Of Brian Atkins
> Sent: Sunday, June 23, 2002 12:19 PM
> To: sl4@sysopmind.com
> Subject: Re: How hard a Singularity?
>
>
> Eugen Leitl wrote:
> >
> > On Sat, 22 Jun 2002, James Higgins wrote:
> >
> > > to get the Singularity in full force the more people die (that's my
> > > interpretation of his vision, at least). Personally, I'd rather let a
> > > few hundred thousand people die while making certain that the
> > > Singularity won't just wipe everyone out. I mean, what's the point in
> >
> > I agree with this assessment.
> >
>
> Just a pedantic nitpick, but if Eugen gets his future where all countries
> pass and perfectly enforce laws against AI development, and it therefore
> takes at least 20 more years before we get some alternate Singularity
> technology such as uploading, we are talking about quite a few more deaths
> than a "few hundred thousand":
>
> 20 years * 365 days/year * 150,000 deaths/day = 1,095,000,000 deaths
> = 1095 megadeaths
> --
> Brian Atkins
> Singularity Institute for Artificial Intelligence
> http://www.intelligence.org/
>
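For anyone who wants to double-check the arithmetic, here is a minimal
Python sketch. The ~150,000 deaths/day world mortality figure is the one
assumed in the quoted email, not independently sourced here:

    # Reproduce Brian's back-of-the-envelope estimate.
    # Assumption (from the email): roughly 150,000 deaths/day worldwide.
    years = 20
    days_per_year = 365
    deaths_per_day = 150_000

    total = years * days_per_year * deaths_per_day
    print(f"{total:,} deaths")              # 1,095,000,000 deaths
    print(f"{total / 1e6:.0f} megadeaths")  # 1095 megadeaths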