Re: Singularity Institute: Likely to win the race to build GAI?

From: Phillip Huggan
Date: Wed Feb 15 2006 - 18:05:21 MST

Or other technologies might intervene in the meantime, rendering the eventual emergence of an AGI impossible, or causing it to emerge with less than existentially decisive ramifications.

  SIAI employees are not the only people trying to save the world :) For all we know, the person who saves the world may be someone who invents a bioweapons sensor, or an aid worker in the Developing World whose influence convinces some future national leader not to turn "evil". My attempts to drop the price of synthetic diamond products may lead to diamond weapons catalyzing WWIII, and SIAI's AGI might kill us all. If this is about existential risks and human quality of life, AGI is surely important but hardly the whole story.
Kaj Sotala wrote:
  From Mike Dougherty:
> The point I don't understand is why anybody thinks there is an AGI "race"

There is one, though not for the reasons you describe. To be more
precise, there is a race between non-Friendly AGI and Friendly AGI. If
the UFAI is built first, and a hard take-off occurs, mankind might be
doomed, because the UFAI will have the chance to destroy us before
there's anything powerful enough to stop it. If the FAI is built first,
mankind might not be doomed, as the FAI can consolidate itself
into a position where it can stop any potential UFAIs.

Sounds like a race to me.


This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:55 MDT