From: justin corwin (firstname.lastname@example.org)
Date: Fri Jan 27 2006 - 13:58:54 MST
On 1/27/06, George Dvorsky <email@example.com> wrote:
> Eliezer, refusal to answer the question on the grounds that "this
> scenario ain't gonna happen" seems strange to me. Extreme improbability
> does not equate to impossibility, and given that we likely live in an
> infinite universe such a scenario must play itself out for some
> observers. Yes, you're right in suggesting that it's highly unlikely
> that such a scenario will be witnessed by the vast majority of
> observers, but that does not preclude its occurrence outright, and thus
> the question should be confronted.
This to me is exactly NOT what I thought when I read Eliezer's post.
In my opinion, he is pointing out a very valid problem, but not the
one you seem to be objecting to. To be good thinkers, we need to
become very good at discerning the true nature of the facts we have
available (or at least some approximation of certainty), and then operate
on the assumption that we've done our due diligence.
However, when people ask hypothetical questions, they are asking you
to take certain facts as axiomatic, and then reason AS IF you were in
a normal situation upon which those axiomatic facts had somehow been
imposed. Many times I refuse to answer hypothetical questions on these
grounds, because the reasoning very quickly becomes artificial.
"If good was bad, and Hitler was going to kill Gandhi, what would you
do? Answer using the moral logic you have developed in an entirely
different world."
Hypothetical questions twist your reasoning skills in knots, often in
very subtle ways, because they mess up how evidence and environment
act as inputs to your decisions, but are presented as if they are
'realistic' or approximating realism.
You can see this in extremis when you watch sci-fi fans trying to
'explain' the continuity of their favorite show. Given some
unambiguous weirdness they have to take as a starting point, they
twist themselves into knots trying to make it 'make sense'. You get
ridiculousness like www.stardestroyer.net, where a poor engineer has
tried to analytically determine the nature of imaginary technology.
> On a related note, the odds of our surviving the Singularity is likely
> akin to the coin flipping scenario offered above. Would you argue that
> we shouldn't anticipate our surviving the Singularity due to its gross
> improbability?
Well, yes. If you actually believe that our likelihood of surviving is
very low, then you should anticipate whatever scenario is most likely
and plan for that. If all the likely scenarios are entirely beyond the
reach of any action you might take, then you can plan for less likely
scenarios on the reasoning that they are where your leverage lies, but
it seems unlikely that you could determine that.
-- Justin Corwin firstname.lastname@example.org http://outlawpoet.blogspot.com http://www.adaptiveai.com
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:55 MDT