From: Aaron McBride (amcbride@jps.net)
Date: Tue Jun 25 2002 - 23:21:07 MDT
At 08:24 PM 6/25/2002 -0700, you wrote:
<clip>
>>Let's say you're 90% (+-5%) sure that the answer is: 1000. Would you
>>still take the time to consult with someone else? With ten others? What
>>if 1 out of the 10 you consult says the answer is 990? Would you still
>>get 20 more?*
>>
>>It is rational to check with others to see if your theories of
>>Friendliness hold up, but it doesn't make sense to not move forward just
>>because someone disagrees.
>
>Wrong, dead wrong. For one thing, why bother to check with others if you
>just plan to ignore them anyway? It is irrational to proceed unless there
>is no other choice. ABSOLUTELY ANYTHING that could lead to the extinction
>of the human race MUST BE checked, rechecked, checked again, etc. To the
>maximum extent possible. A 10% risk of extinction is way, way too
>high. Unless, of course, the result of not acting IMMEDIATELY became
>equal to or worse than the result of failure. Maybe at 0.01% chance of
>extinction I might back off my stance if a very significant number of
>deaths were occurring continuously.
>
>James Higgins
>
Fair enough. I should have been clearer. When asking a large enough
audience about a topic such as designed Friendliness, where we have no
experience, we can expect people to disagree. We can't absolutely reject a
plan just because someone disagrees; they have to have a valid (checkable)
reason for disagreeing.
When you say "A 10% risk of extinction...", are you talking about a
confidence level (aka '"I'm I feel about 90% sure that we'll come out of
this alive."), or are you suggesting that we'll have some way to measure
the true risk level. If it's the latter, I'd love to hear some suggestions
on how we could accurately predict the percent risk a given Seed AI poses.
Finally:
I agree that we must maximize our confidence in the Friendliness affinity
of any proposed Seed AI.
We can then confidently move forward once we reach the point where the risk
of extinction from executing a Seed AI equals (or drops below) the risk of
extinction from not acting. (My own opinion is that the risk of extinction
from not acting is on the rise, so we'll probably see the crossover happen
at something > 1%, possibly as high as 5%.)
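To make the crossover idea concrete, here is a rough Python sketch. The
numbers in it are made up purely for illustration (the launch risk and the
per-year background risk are assumptions, not anything anyone has measured);
it just compares a one-shot risk of acting against the risk that accumulates
from waiting.

# Illustrative sketch only: compares an assumed one-shot launch risk
# against the risk accumulated by delaying, under made-up numbers.

def cumulative_delay_risk(annual_risk, years):
    # Probability of extinction from waiting `years` at a constant
    # per-year background risk (years assumed independent).
    return 1.0 - (1.0 - annual_risk) ** years

launch_risk = 0.01          # assumed one-shot risk of running the Seed AI
annual_background = 0.002   # assumed per-year risk of not acting

for years in (1, 5, 10, 25):
    delay_risk = cumulative_delay_risk(annual_background, years)
    verdict = "launch" if launch_risk <= delay_risk else "keep checking"
    print(f"{years:>2} yr delay: risk {delay_risk:.3f} "
          f"vs launch {launch_risk:.3f} -> {verdict}")

On those assumed numbers the crossover shows up somewhere between a 5- and
10-year delay; plug in your own estimates and the verdict moves accordingly.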
-Aaron