From: Thomas McCabe (firstname.lastname@example.org)
Date: Tue Jan 29 2008 - 21:41:54 MST
On Jan 29, 2008 11:06 PM, Peter de Blanc <email@example.com> wrote:
> On Tue, 2008-01-29 at 22:14 -0500, Thomas McCabe wrote:
> > > > "* We might live in a computer simulation and it might be too
> > > > computationally expensive for our simulators to simulate our world
> > > > post-Singularity.
> > > > o Rebuttal synopsis: This scenario can be used to argue for,
> > > > or against, any idea whatsoever. For idea X, just say "What if the
> > > > simulators killed us if we did X?", or "What if the simulators
> > > > killed us if we didn't do X?". "
> > >
> > > This is not a rebuttal. Just because an idea can be misused to argue
> > > for all sorts of things does not make it false (consider evolution,
> > > quantum mechanics).
> > An idea which can argue for absolutely *anything* must have zero
> > information content. See
> > http://www.overcomingbias.com/2007/08/your-strength-a.html.
> Hypotheses don't argue. People argue. Hypotheses generate probability
> distributions which can be revealed by careful analysis.
> You ignored my example:
> > Hypothesis 1: We are in a computer simulation, and it will be shut down
> > if it becomes much more computationally expensive.
> > Hypothesis 2: We are in a computer simulation, and it will be shut down
> > _unless_ it becomes much more computationally expensive.
> > Is hypothesis 2 exactly as plausible as hypothesis 1? I would say it's
> > much less plausible.
Did you miss the quote from "Technical Explanation"? Essentially, the
intuitive notion of how 'plausible' something is doesn't correspond
well to an actual probability distribution. Because we have no
knowledge whatsoever about the rules governing the simulation (other
than the ones we can observe directly), to estimate the probability of
a rule, you need to use Solomonoff induction or some approximation to
it. If someone did math saying "Hey, a set of rules which leads to our
imminent doom has much less complexity than a set of rules which lets
us keep going", I'd be willing to revisit the simulation argument. As
it is, I doubt that math would work out; throwing in an additional conditional
(if: we go through the Singularity, then: shut down the simulation)
seems likely to add complexity, not remove it. The reverse conditional
(if: we don't go through the Singularity, then: shut down the
simulation) is simply a negation of the first one, so it seems likely
to have similar complexity. "Seems likely" is obviously an imprecise
statement; anyone have any numbers?
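The complexity comparison above can be made concrete with a toy sketch. Everything here is a made-up assumption for illustration (the token encoding, the flat 8-bits-per-token cost), and it is only a crude stand-in for an actual Solomonoff-style prior, which would weight hypotheses by 2^-K where K is description length in bits:

```python
# Toy complexity-prior sketch (NOT real Solomonoff induction).
# We encode each hypothesis as a list of rule tokens, charge a flat
# 8 bits per token, and assign an unnormalized prior of 2**(-bits).

def description_bits(rule_tokens):
    """Crude complexity proxy: 8 bits per token in the rule."""
    return 8 * len(rule_tokens)

def complexity_prior(bits):
    """Unnormalized 2^-K style prior."""
    return 2.0 ** (-bits)

# Base hypothesis: "we are in a simulation", with no shutdown rule.
base = ["simulation"]

# Hypothesis 1 adds a conditional:
# if cost rises post-Singularity, then shut down the simulation.
h1 = base + ["if", "cost_rises", "shutdown"]

# Hypothesis 2 negates the condition:
# shut down UNLESS cost rises (one extra "not" token).
h2 = base + ["if", "not", "cost_rises", "shutdown"]

for name, rule in [("base", base), ("h1", h1), ("h2", h2)]:
    bits = description_bits(rule)
    print(name, bits, complexity_prior(bits))
```

Under this toy encoding both conditional hypotheses are strictly more complex than the bare simulation hypothesis, and they differ from each other by only a single "not" token, which matches the intuition in the text: the rule and its negation should have roughly similar complexity.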