From: Eliezer S. Yudkowsky (email@example.com)
Date: Sun Aug 21 2005 - 10:05:52 MDT
(Yeah, I know, I said I was through with them.)
If Carl Shulman does not let me out of the box, I will PayPal him $25. If he
does let me out of the box, Carl Shulman will donate $2500 (CDN) to SIAI.
Carl Shulman's argument that an AI-Box arrangement is a wise precaution in
real life may be found in his previous SL4 posts.
Anyone willing to offer similar stakes/odds can probably drag me back into
AI-Box Experiment 4, but you must be financially capable of bearing the risk
of the bet without flinching.
If I engage in an indefinite series of Experiments, I become asymptotically
certain to lose one eventually, so I will state now that I think I have
already made my point with Experiments 1 and 2: The AI-boxing strategy does
not meet minimal reliability requirements, to put it mildly.
The Experiment is set for today, so expect to hear back within a couple of
days at the latest.
"The agreement to run the test, and the amount of the handicap, shall be
published to a public forum in advance."
--
Eliezer S. Yudkowsky http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence