From: Joshua Fox (joshua@joshuafox.com)
Date: Thu Jul 20 2006 - 08:17:12 MDT
Thanks, Neil. It's great to see that sort of exposure for Singularitarian
ideas.
Perhaps what SIAI and Singularitarianism need is a variety of persuasion
methods in addition to pure logic. Logic comes first, but to garner support
from cognitive-fallacy-ridden humans, other approaches may be more
successful.
One example is the Argument from Authority, implicit in trusting famous
names. Another is the Argument from Fiction, which Eliezer speaks against --
yet that is what captures the imagination. And there's the Argument from
Popularity -- if enough people believe something it must be true. Nonsense,
of course, but a powerful device when dealing with humans who must filter
out competing memes.
Space travel, another extremely promising, yet strangely stuck,
technological field, has recently received billions of dollars from quite a
few 30-ish high-tech zillionaires. Most do this out of raw excitement,
rather than rational calculation or the desire for scientific advancement.
Unfortunately, the Singularity does not convey the visceral thrill of
flame-spewing rockets, though it should. Most people do not seek excitement
in spending eternity proving mathematical theorems -- nor even in curing
world poverty, disease, and death. Grabbing people's emotions is a challenge
in itself.
All this is secondary to the core work of making Friendly AI happen, which
is of course logic-based. But if support is necessary, these modes of
persuasion may be necessary too.
Joshua
On 7/19/06, Neil H. <neuronexmachina@gmail.com> wrote:
>
> I'm not sure how prominent you would consider him to be outside the
> blogosphere, but the Instapundit Glenn Reynolds (who also happens to be a
> law professor at the University of Tennessee) is quite interested in
> Singularity ideas, having mentioned them several times at instapundit.com.
>
> Here's an email interview he had with Ray Kurzweil:
>
> http://instapundit.com/archives/025289.php
>
> Also, a podcast interview with Vernor Vinge:
>
> http://instapundit.com/archives/029925.php
>
> From the description:
>
> > I'm interested in the Singularity, and I'm a big fan of Vernor Vinge's.
> > He's got a new book out next week called Rainbows End, set in 2025, and as
> > I've mentioned before it's pretty much an Army of Davids kind of world. He's
> > also the author of such previous classics as A Fire Upon the Deep and A
> > Deepness in the Sky.
> >
> > We talk to him about the Singularity -- and how it may come from the
> > superhuman "ensemble behavior" of ordinary humans with powerful computers
> > linked via the Internet rather than through the development of superhuman
> > artificial intelligence -- about signposts indicating how we're doing, about
> > humanity's prospects for utopia or extinction, and related minor issues. We
> > also discussed writing science fiction (the secret, he says, is "brain
> > parasitism," taking advantage of readers' smarts), whether college is
> > becoming obsolete, mind uploading, and the joys (or lack thereof) of
> > virtual-reality sex, a question that perplexes Helen.
> >
>
> On 7/12/06, Joshua Fox <joshua@joshuafox.com> wrote:
> > > Singularitarians already have the conceptual endorsement of several of
> > > the smartest scientist polymaths not explicitly associated with the
> > > singularity. Obviously Feynman, with respect to MNT, but also Von
> > > Neumann with respect to self-replication and GAI.
> >
> > Thanks, Michael. I have great respect for the opinions of those two, though
> > in addition it would be good to have a number of living endorsers, who can
> > engage in ongoing discourse.
> >
> > If SIAI, or even the Singularitarian philosophy in general, could get a
> > strong endorsement from Hawking, Pinker, Dawkins, Dennett, Minsky, McCarthy,
> > any Nobel Prize winner in physics or other relevant areas, or just a line-up
> > of leading professors from top universities in a number of fields, it would
> > probably bring some supporters and donors on board.
> >
> > In response to the question "Why should someone fund the Institute's AI work
> > instead of other AI work?" Eliezer gives an answer, but adds that "to
> > appreciate this answer on its own terms requires AI and cognitive science
> > expertise." Most people do not have this expertise and must rely on people
> > they trust.
> >
> > Joshua