Re: Shock level 4 (was Re: META SL4)

From: Matt Mahoney (matmahoney@yahoo.com)
Date: Fri Apr 25 2008 - 11:29:20 MDT


--- Thomas McCabe <pphysics141@gmail.com> wrote:
> > My abstract self concludes:
> >
> > - I am not a singularitarian. I want neither to speed up the
> singularity nor
> > delay it. In the same sense I am neutral about the possibility of
> human
> > extinction
>
> Are you totally neutral about the possibility of getting shot? If no,
> the former includes the latter. If yes, please seek psychological
> help immediately.

My abstract self is neutral about being shot, which my emotional self
finds disturbing.

My emotional self cannot think at SL4. (I hope I am using the terms
correctly.) Emotions are the axioms of beliefs. The ideal abstract
self that I would use to think at SL4 would have no biases or emotions.
It would think in a mathematics with no axioms, where everything is
wrong; even the questions are wrong. I can plug in some assumptions
and draw conclusions, but that does not make them right.

At SL3 we want to avoid an AI that tiles the solar system with
paperclips. That would be bad. At SL4, I imagine how a paperclip-making
nanobot would feel. Making paperclips has positive utility for
it. There would be no other intelligences, so it would be utopia.

What about the humans? At SL3, humans would be extinct. That would be
bad. We could upload, but the paperclip makers would kill us to take
over our computing resources. That would also be bad.

At SL4, an upload would produce an entity with your memories that
claims to be you. That is all you can say about it. It is irrelevant
whether your consciousness transfers, because qualia are not observable
phenomena. Furthermore, if your memories were altered, you would not
know the difference. Finally, if you found joy in moving atoms one at
a time to make paperclips, you would be happy. Therefore, tiling the
solar system with paperclips is a good outcome for some definitions of
"you" and some definitions of "good".

Sorry, there is no right answer because there are no axioms.

-- Matt Mahoney, matmahoney@yahoo.com
