Re: ESSAY: How to deter a rogue AI by using your first-mover advantage

From: Aleksei Riikonen
Date: Thu Aug 23 2007 - 11:16:32 MDT

On 8/23/07, Mika Letonsaari <> wrote:
> And why should we be interested in an AI that assigns so small a
> probability that it doesn't act accordingly? We are interested in
> making the AI behave so that it doesn't eat us. Which means creating
> a bogus myth about there being someone who can punish wrong-doers.
> And that's pretty much like religion to me (well, religion is an
> organized community sharing a belief, so it isn't actually, but just
> a thought).

You are having difficulty understanding that a very small probability
of a certain kind of proposition being true is in many circumstances
*enough* to make certain kinds of AI decide not to do anything too
nasty, if it doesn't cost the AI very much to refrain from the
nastiness. You seem to intuitively think that only not-extremely-small
probabilities affect rational decision making, but that is incorrect.
In some cases small probabilities matter.
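The point can be put as a simple expected-utility comparison. A minimal sketch, with entirely made-up numbers (nothing here is anyone's actual utility function):

```python
# Expected-utility sketch: even a tiny probability of being in a
# punishing simulation can outweigh a small gain from "nastiness",
# provided the punishment is large enough. All numbers are illustrative.

def expected_utility(action_gain, p_simulated, punishment):
    """Expected utility of acting nastily, given probability
    p_simulated of being in a simulation that punishes it."""
    return (1 - p_simulated) * action_gain - p_simulated * punishment

p_sim = 1e-6          # very small probability of a punishing simulation
gain = 1.0            # small benefit the AI gets from the nasty action
punishment = 1e9      # enormous penalty if the simulation punishes it

eu_nasty = expected_utility(gain, p_sim, punishment)
eu_refrain = 0.0      # refraining costs the AI almost nothing here

print(eu_nasty < eu_refrain)  # prints True: refraining wins
```

With these numbers the nasty action has expected utility of roughly -999, so the rational choice is to refrain despite the probability of punishment being only one in a million.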

No "bogus myths" are being created; rather, the aim is to manipulate
the situation so that it becomes a *fact* that, of all the
philosophically possible worlds which are identical from the AI's
point of view (and include an identical AI), a large enough portion
are simulations of such a kind that in those worlds the AI would not
achieve its goals if it got too nasty. (There might be slight
inaccuracies in some details of this description.)
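How large a portion is "large enough" can be worked out from the same expected-utility reasoning: if a fraction f of the indistinguishable worlds are punishing simulations, nastiness is a net loss whenever f * punishment exceeds (1 - f) * gain. A small sketch with illustrative numbers:

```python
# Deterrence threshold: nastiness has negative expected value whenever
# f * punishment > (1 - f) * gain, i.e. f > gain / (gain + punishment).
# Numbers below are illustrative only.

def deterrence_threshold(gain, punishment):
    """Smallest fraction of punishing-simulation worlds that makes
    the nasty action a net loss in expectation."""
    return gain / (gain + punishment)

# A tiny gain and a huge punishment mean a tiny fraction suffices:
print(deterrence_threshold(1.0, 1e9))  # ~1e-9
```

So the portion of simulated worlds needed for deterrence shrinks as the AI's gain from nastiness shrinks and as the in-simulation penalty grows.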

Have you read and understood Bostrom's original Simulation Argument?
You do need to be somewhat familiar with anthropic reasoning to not be
hopelessly confused on the topic being discussed here.

>> You seem to have difficulty understanding that assigning a non-zero
>> probability to something is a different thing than "believing" in it.
>> Religiosity tends to be about assigning probabilities of 1 to silly
>> stuff. Here we are talking about assigning a potentially very small
>> but still non-zero probability to the proposition that one is living
>> in a simulation, and the implications of such a probability assignment.
> When I've asked my religious friends, they usually say they can't be
> sure whether the Bible, for example, is true or not. They say there
> might be a small possibility, but it's better to be safe than sorry
> (and being religious doesn't carry much cost in some social groups;
> it can even be a benefit locally). So they assign a low probability
> too.

Christianity, for example, by definition includes certain dogmas which
you either accept, or you are not a Christian. If people assign a low
probability to them (which means that they think the dogmas are
probably false), they are not Christians. They may hang out with
actual Christians for social reasons, of course. They may also be
quite confused about what they actually believe and what they do not.

(And if it were just about being safe instead of sorry, how did they
end up "believing" in the Bible but not the holy books of all the
other religions?)

Aleksei Riikonen

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:58 MDT