**From:** Marc Geddes (*marc_geddes@yahoo.co.nz*)

**Date:** Fri Nov 22 2002 - 20:24:55 MST

**Next message:** Gordon Worley: "Re: My probability estimates of various scenarios" **Previous message:** Cole Kitchen: "Can the Singularity be outrun?" **Next in thread:** Gordon Worley: "Re: My probability estimates of various scenarios" **Reply:** Gordon Worley: "Re: My probability estimates of various scenarios" **Messages sorted by:** [ date ] [ thread ] [ subject ] [ author ] [ attachment ]

My probability estimates of various scenarios

It really is a sad state of affairs that so many human beings are so totally blind or apathetic that they're doing nothing to help bring about the 'Singularity', and that the minds of so many are still gripped by centuries-old dogmas opposed to science, rationality and progress (for instance, fundamentalist religion). What a terrible waste of life!

Being a visionary is both a blessing and a curse, for we understand how good the world could be, yet we are daily confronted with 'the awful truth' which is the present reality. The terrible frustration and alienation that the visionary has to deal with, because he or she lives in an irrational world, is enough to drive him or her nearly insane at times :-( The poignant Jack Vance fantasy short story 'Green Magic' fully captures the essence of what it means to be a visionary. It's well worth a read. The story can be read on-line at this URL:

http://www.infinityplus.co.uk/stories/green.htm

That humanity will reach a 'Singularity' at all, or that the outcome will be favorable, is by no means assured. Whilst Eliezer is having difficulty getting grants of a few grand a year, did you know that there is an organization called 'The Turning Point Project' which has spent millions of dollars on big advertisements attacking nanotech, A.I. and genetic engineering? Something is wrong with the world :-(

I'm going to give you my current Bayesian estimates of various scenarios:

Probability that a Singularity will happen: 90%

If a Singularity doesn't happen, why not?

- Technologically impossible or impractical: 5%
- Humanity permanently stifled through techno-phobia and/or local disasters: 10%
- Humanity wiped out through global disaster: 85%

Assuming that a Singularity happens:

- 66% confidence interval for date of Singularity: 2030-2040
- 90% confidence interval for date of Singularity: 2020-2050
- Probability of favorable outcome: 90%

Assuming that the outcome of the Singularity is not favorable, why?

- F.A.I. cannot help humanity or doesn't want to: 10%
- Unfriendly A.I. developed first, or F.A.I. undergoes Failure of Friendliness: 90%

Assuming that the outcome of the Singularity is favorable, what happens?

- F.A.I. implements 'Sys-Op' over all of human space and rapidly offers uplifting to all: 30%
- Multiple F.A.I.'s integrated with the economy help develop uplifting technologies gradually: 70%

Which person/organization will get to true A.I. first?

- A private person/group or a non-profit institute: 40%
- A major company: 15%
- A military R&D group: 10%
- An academic research project: 30%
- A public (open source) collaborative effort: 5%
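The estimates above chain together: the sub-probabilities are conditional on their branch, so joint probabilities follow by multiplication. As a minimal sketch (using only the figures stated above, and assuming the branches are read as conditionals, which is how they are presented):

```python
# Combine the headline figures from the post into joint probabilities.
# The figures are taken directly from the estimates above; the
# multiplications assume each sub-list is conditional on its branch.

p_singularity = 0.90          # P(Singularity happens)
p_favorable = 0.90            # P(favorable outcome | Singularity)

# Joint probability of a Singularity with a favorable outcome
p_good = p_singularity * p_favorable                     # 0.9 * 0.9 = 0.81

# Two of the dominant failure modes, weighted by their branches:
# no Singularity because humanity is wiped out by a global disaster
p_wiped_out = (1 - p_singularity) * 0.85                 # 0.10 * 0.85 = 0.085
# Singularity happens but unfriendly A.I. spoils the outcome
p_ufai = p_singularity * (1 - p_favorable) * 0.90        # 0.9 * 0.1 * 0.9 = 0.081

print(f"P(Singularity and favorable outcome) = {p_good:.2f}")
print(f"P(no Singularity, via global disaster) = {p_wiped_out:.3f}")
print(f"P(Singularity spoiled by unfriendly A.I.) = {p_ufai:.3f}")
```

So, taken at face value, the figures imply roughly an 81% chance of a favorable Singularity, with unfriendly A.I. and global disaster each claiming around 8% of the remaining probability mass.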

=====

Please visit my web-site at: http://www.geocities.com/marc_geddes



*This archive was generated by hypermail 2.1.5: Wed Jul 17 2013 - 04:00:41 MDT*