Re: A Power's Nostalgia of Hairy-Ape-Life (was: Re: Singularity Memetics.)

From: Alan Grimes
Date: Thu Jan 31 2002 - 18:53:12 MST

> The consumption of power in convincing a human being to upload would be
> negligible to an SI, therefore it would be easier to build new SI's or
> SI components out of simply transcend-ifying human minds which are
> *already there*, and more ethical too. So both the SI and we win.

But how much energy would the "upload" take? How effective would the
uploaded mind be?

Let me break this to you: the human mind (even my own!) is a _HORRIBLE_ design.

Trying to translate it into an SI would take many orders of magnitude
more effort than just stamping the brick with an engineered mind.

> This SI "machine", as you put it, would be very, very far from our
> current conception of what a machine is. This "machine" would be
> much more like "God" (not in the Christian sense) than any "machine"
> seen up to this point. For example, it could take on the appearance of
> your long lost father or lover, or give you a feeling of complete bliss
> when in its "presence". And those are very anthropocentric examples.
> A true SI would be "more human than human", in the godliest sense we
> can comprehend. Super empathic, super ethical, super nice, just an
> all around great guy to be around! =D


You cannot possibly guarantee that things will be anything like that.
You assume, incorrectly, that the SI will take any interest at all in
your pleasure or safety.

Your fantasy is NOT my fantasy.

It is becoming increasingly apparent that your delusions are almost as
big as Eliezer's ego... =(

Maybe I'll go off and start a list where Goertzel and I can hang out...
Even though he is an uploader (PUKE), he appears to be sane. =P

> If you wish to continue arguing this point further then please mail me
> directly, and spare the list. Thank you kindly.

I think the list can bear another round....


This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:37 MDT