From: Michael Vassar (firstname.lastname@example.org)
Date: Sat Dec 17 2005 - 17:16:38 MST
Here's the classic paper on the topic.
Also see "the siren song of normative reasoning" in the sl4 archives, and
numerous discussions of anthropomorphizing by Michael Wilson and Eliezer.
Three-sentence summary:
The prior probability of our caring about an optimization process or of an
optimization process being conscious in a way that we care about is small.
Natural selection is an optimization process that optimizes processes for the
degree to which they are themselves self-optimizing processes.
Normative reasoning approximators are strong attractors for any process that
moves through a phase space towards being an optimization process.
>Michael Vassar <email@example.com> wrote: For all practical
>purposes, given a competitive environment, recursive
>self-improvement is not an acceleration of natural selection, but something
>much closer to an immediate jump to natural selection's equilibrium
>end-state, and this state is not likely to be conscious in any sense that we care about.
> Could you expand upon that a bit? How do you determine natural
>selection's 'equilibrium end-state'? How do you determine it is not likely
>to be conscious in a way we care about?
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:54 MDT