Re: JQ Test 1.0

From: H C (lphege@hotmail.com)
Date: Fri Jan 27 2006 - 22:30:57 MST


>From: "Eliezer S. Yudkowsky" <sentience@pobox.com>
>Reply-To: sl4@sl4.org
>To: sl4@sl4.org
>Subject: Re: JQ Test 1.0
>Date: Fri, 27 Jan 2006 11:30:08 -0800
>
>H C wrote:
>>
>>4. Given a coin is flipped 10 times
>> HHHHHHHHHH
>>
>>What would you guess your odds of hitting tails are next round?
>
>This question is philosophically interesting, in that the given sequence of
>heads was not actually generated by a fair coin, but by an author who was
>deliberately choosing heads. I think it is legitimate to refuse to answer
>questions predicated on large improbabilities unless and until the scenario
>comes up in real life.
>
>Given that a fair coin is flipped 50 times and comes up
> HHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHH
>what would you guess as your odds of getting tails next round?
>
>The answer, of course, is:
> On the given hypothesis, this scenario ain't gonna happen.

OK, I agree with your response. Well, not exactly in principle, but certainly
as it plays out in real life.

If someone actually put such a proposition to me in real life:

>Given that a fair coin is flipped 50 times and comes up
> HHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHH
>what would you guess as your odds of getting tails next round?

Then I would tell them to go f*ck themselves.

But regardless: since we *are* talking about a real coin with a real 50/50
chance of landing heads or tails, and the string of heads is not beyond
reasonable assumption, the answer is pretty straightforward: the flips are
independent, so the next one is still 50/50.
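To put a number on it (a minimal sketch of my own, not anything from the
quoted mail): each flip of a fair coin is independent, so the next flip is
50/50 no matter what came before, even though the 50-head run itself has
probability 2^-50.

    from fractions import Fraction

    # Fair coin: flips are independent, so a history of heads
    # doesn't change the odds of the next flip.
    p_next_tails = Fraction(1, 2)

    # Probability that a fair coin produces 50 heads in a row.
    p_fifty_heads = Fraction(1, 2) ** 50

    print(p_next_tails)          # 1/2
    print(float(p_fifty_heads))  # ~8.9e-16 -- "ain't gonna happen"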

--
"Without careful specification, "what ifs" can be used to justify anything"
It's funny ... that statement really clicked with me. It reminded me of my 
symbolic logic class (day 4... sigh...), where the teaching assistant spent 
*over an hour* trying to explain to the class why any argument from an 
inconsistent set of premises is always valid: the premises can never all be 
true at once, so there is no case where they hold and the conclusion fails...
i.e.
Fred is taller than Ted. Ted is taller than Ed. Ed is taller than Fred.
Therefore, God exists.
is a valid argument.
*bangs face into table repeatedly at reaction of students*
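(If anyone wants to check the TA's point for themselves, here's a brute-force
sketch in Python, my own illustration and nothing from the actual class: the
three premises can never all be true at once, so no counterexample with true
premises and a false conclusion can exist, which is exactly what "valid"
means.)

    from itertools import product

    # Try every assignment of heights to Fred, Ted, and Ed.
    premises_ever_true = False
    for fred, ted, ed in product(range(3), repeat=3):
        if fred > ted and ted > ed and ed > fred:
            premises_ever_true = True

    # A counterexample needs true premises and a false conclusion;
    # since the premises are never all true, none can exist, and the
    # argument is vacuously valid (the conclusion is irrelevant).
    print(premises_ever_true)  # False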
>
>--
>Eliezer S. Yudkowsky                          http://intelligence.org/
>Research Fellow, Singularity Institute for Artificial Intelligence
>On a related note, the odds of our surviving the Singularity are likely
>akin to the coin flipping scenario offered above. Would you argue that
>we shouldn't anticipate our surviving the Singularity due to its gross
>improbability?
>.<
So yeah man, apparently you calculated the "probability" of our surviving 
the Singularity.
I mean, subjective induction has a power to it, but its power is also vastly 
superficial if you are integrating over a superwide distribution of unknown 
causal avenues.
Your confidence in our surviving, or not surviving, a Singularity should be, 
for all practical purposes, ZERO.
For now.
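(To illustrate what I mean by a superwide distribution, here's a toy sketch
with made-up numbers, nothing more: if each unknown causal avenue could assign
any survival probability whatsoever, averaging over them tells you nothing.)

    import random

    random.seed(0)

    # Hypothetical: total ignorance about each causal avenue, modeled
    # as a survival probability drawn uniformly from [0, 1].
    avenues = [random.random() for _ in range(100000)]

    mean = sum(avenues) / len(avenues)
    spread = max(avenues) - min(avenues)

    print(round(mean, 3))    # ~0.5: the average carries no information
    print(round(spread, 3))  # ~1.0: the estimates span the whole range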
-hegem0n

