Re: guaranteeing friendliness

From: Ben Goertzel
Date: Sat Nov 26 2005 - 10:07:33 MST

> Let me rephrase: I can imagine that over the next ten thousand
> subjective years, it would be possible, desirable, and necessary that
> you should, one change at a time, grow into a mind of which it was
> possible to prove that future self-modifications obeyed some invariant
> or other.

But this rephrasing does not explicitly address the question of
whether this "one change at a time" modification can be done in such a
way as to preserve subjective "continuity of consciousness."

I don't think this is a critically important question, but I find it an
interesting one.

It could be that, if minds M and N are sufficiently different, then
"one change at a time" modifications from mind M to mind N at some
point necessarily involve some small change to underlying
mind-mechanisms which induces a large change in emergent mind-state;
and that this large change corresponds to a subjective feeling of
"discontinuity of consciousness."

I actually doubt this is true, but I don't have a strong argument for
or against the hypothesis.

> However, if you wanted to make the change to deterministic cognition in
> one jump, today, I think it would probably kill you.

Well, the use of the word "kill" here is somewhat peculiar, but I
agree that a sudden change from a human mind to a theorem-proving-based
mind would almost surely involve a radical subjective discontinuity of
consciousness. I guess that's something close to what you meant.

-- Ben

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:53 MDT