From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Tue Nov 21 2000 - 13:05:13 MST
Ben Goertzel wrote:
>
> But wait! not quite.... Hold on. It's always going to be MORE efficient
> to hold a provisional assumption and forget that it's
> provisional, on some level.... Given fixed resources, intelligence and
> mental-health/enlightenment/inability-to-be-shocked will always
> contradict each other. The question is, if the fixed resources are LARGE
> ENOUGH, then perhaps this inevitable tradeoff will become
> less of a significant factor than it is in the human mind....
Leaving out the whole SL5 part, which I'm not sure I understood (is the
implication that our true transhuman descendants will be shocked by the
life we lead today? I didn't get that part): insanity - in most goal
systems - would be a major disadvantage, leading to all kinds of negative
outcomes. If there really is a threat of insanity, then it'd be damned
silly to optimize away the extra 0.01% of storage space that contains the
very root of the causal chain, the part that marks the rest as
provisional. And if there's the faintest, most remote chance of any
assumption being wrong - even in cases like "Oh, the whole Universe was
just a computer simulation" - then to mark that assumption as
"undoubtable" is not only a foolish risk, but is in fact outright untrue.
On a deeper level, I disagree with the entire visualization this implies.
Why can't we just say that transhumans can *deal* with it when their basic
assumptions get challenged? That they don't have the *hardware* for
running around in circles, biting their own tails; that they just deal
with it, whatever it is, and move on. Then you can optimize whatever you
like; if something goes wrong, you recognize it, de-optimize, and move on;
as easy as clearing a cache.
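
(Another made-up sketch, under the same disclaimer: one cheap way to
get "recognize it, de-optimize, and move on" is for cached conclusions
to remember which assumptions they rest on, so that challenging an
assumption just clears the dependent entries. All names hypothetical:)

    from collections import defaultdict

    # Hypothetical "de-optimize and move on" store: derived conclusions
    # are cached against the assumptions they were built on.
    class BeliefCache:
        def __init__(self):
            self.cache = {}                     # conclusion -> cached value
            self.dependents = defaultdict(set)  # assumption -> conclusions

        def derive(self, conclusion, value, depends_on):
            # Cache a conclusion, recording the assumptions it rests on.
            self.cache[conclusion] = value
            for assumption in depends_on:
                self.dependents[assumption].add(conclusion)

        def challenge(self, assumption):
            # An assumption failed: drop everything built on it, move on.
            for conclusion in self.dependents.pop(assumption, set()):
                self.cache.pop(conclusion, None)

    mind = BeliefCache()
    mind.derive("tuesday_plans", "go to work", depends_on={"base_reality"})
    mind.challenge("base_reality")  # plans invalidated, no tail-biting
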
As humans, we get extremely emotionally attached to our own ideas. This
happens for several reasons, of course; the two major ones are (a) the
political emotions, under which backing down from an idea is not only a
truth/falsity thing but also a hit to your social status; and (b) a
really lousy, hacked-up pleasure/pain architecture that causes us to
flinch away from painful thoughts. "Insanity" *and* "inability to deal
with change" are not emergent phenomena that will appear in all
sufficiently complex minds. Calling a phenomenon "emergent" always sounds
really enlightened, I know; but in this case, it's just not true - at
least, as far as we know. Our vulnerabilities are traceable directly to
specific design errors, and there is really no reason to think that these
vulnerabilities would be present in minds in general.
-- -- -- -- --
Eliezer S. Yudkowsky http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence