RE: Normative Reasoning: A Siren Song?

From: Chris Healey
Date: Wed Sep 29 2004 - 11:40:23 MDT

On 9/28/2004 Peter C. McCluskey wrote:
> ...
> I'm having trouble figuring out why we should worry about
> this kind of problem. We have goal-systems which are
> sometimes inconsistent, and when we need to choose between
> conflicting goals, we figure out which
> goal is least important and discard it as an obsolete sub-goal.
> The qualities you mention above seem like sub-goals that I would be
> willing to discard if they conflicted with more important goals
> such as happiness.
> ...

This is a sub-goal stomp. As far as the platform of the human mind is
concerned, happiness can be viewed as a sub-goal of survival (itself a
sub-goal of reproduction), since its intensity correlates with actions
that were likely to increase reproductive success in the EEA
(i.e. you can't reproduce if you're dead).

Based on the first part of your quote above, I infer that you might be
indicating that the strategy of analyzing our goals for consistency
will renormalize our sense of happiness, and I'd agree that to some
degree it can. The problem lies in that our minds are more-or-less a
finely-tuned mess.

While the neocortex has had a great deal of time to adapt to the
limbic brain and utilize its capabilities to a reasonable extent, the
limbic brain is deeply entrenched, since it spent the majority of
its own evolution before the neocortex existed. For the limbic brain
to adapt fundamentally to the more recent presence of the neocortex
would require a reworking so deep that the intermediate stages would
be inoperable. Any adaptation would tend to involve surface variations
and newly emerging complex functional adaptations, leaving the
existing core complex functional adaptations untouched. So outside of
EEA-compatible situations, we cannot always trust the accuracy of our
sense of happiness at directing our actions.
The situation you mention is also a limited wireheading event, since
happiness is effectively an aggregate metric, contributed to by a
myriad of mental sub-modules indirectly supporting maximum
reproductive success. By elevating the maximization of such a metric
to super-goal priority, you've introduced a strange loop into your
goal system.
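To make the loop concrete, here is a minimal sketch (all names and the
graph structure are illustrative assumptions on my part, not anything
from the discussion above): goals modeled as a directed graph of
sub-goal relations. In the clean hierarchy, "happiness" is just an
aggregate fed by lower sub-modules. Promoting it to a super-goal that
the old root now serves creates a cycle, which a simple depth-first
search detects.

```python
# Hypothetical goal graph; edge A -> B means "A is served by sub-goal B".
goals = {
    "reproduction": ["survival"],
    "survival": ["happiness"],        # happiness as a proxy signal
    "happiness": ["food", "status"],  # aggregate of many sub-modules
    "food": [],
    "status": [],
}

def has_cycle(graph):
    """Detect a cycle via depth-first search with a visiting stack."""
    visiting, done = set(), set()

    def visit(node):
        if node in visiting:
            return True   # back-edge found: the goal system loops
        if node in done:
            return False
        visiting.add(node)
        if any(visit(child) for child in graph.get(node, [])):
            return True
        visiting.remove(node)
        done.add(node)
        return False

    return any(visit(node) for node in list(graph))

print(has_cycle(goals))  # False: a clean sub-goal hierarchy

# Elevate the aggregate metric: the old super-goal now serves it.
goals["happiness"].append("reproduction")
print(has_cycle(goals))  # True: the hierarchy now loops back on itself
```

The point of the sketch is only that the promotion step, not any of
the individual goals, is what breaks the hierarchy's consistency.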

That might not be a good idea if you consider consistency to be a
desirable thing.

-Chris Healey

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:49 MDT