Re: AGI w/ NO goals

From: Robin Lee Powell
Date: Tue Dec 13 2005 - 15:10:34 MST

On Tue, Dec 13, 2005 at 04:43:34PM -0500, Eric Rauch wrote:
> Alright, what if we give it a limited goal (i.e. self-improve until
> you are smarter than every living human combined)?

No, that would result in the destruction of every human on Earth, at
a minimum.

I think you need to go do some reading.

Start with


--
Reason #237 To Learn Lojban: "Homonyms: Their Grate!"
Proud Supporter of the Singularity Institute -

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:54 MDT