Re: answers I'd like from an SI

From: Matt Mahoney (matmahoney@yahoo.com)
Date: Mon Nov 12 2007 - 15:51:18 MST


--- Norman Noman <overturnedchair@gmail.com> wrote:
> > A superintelligence would know the complete state of your brain. You
> > would not need to ask it anything. It would already know your questions.
> > If it chooses to, it could answer by reprogramming your brain.
>
> What does this have to do with anything?

The bee that responds to the dancing robot does not know it is communicating.
If an SI reprograms your brain, you would not know it either.

How would you explain the difference between a beekeeper and an exterminator
to a bee? It does not matter how smart *you* are. You could use
nanotechnology to augment the bee's brain with human-level intelligence, but
then it wouldn't be a bee anymore.

> > But we can know nothing about its intentions or motives.
>
> Except if we defined them,

You can program the first AGI to be friendly, and you can program it to
program the second one to be friendly, and you can program it to program it to
program the third one to be friendly, but eventually you or somebody else will
get it wrong, and evolutionary pressure will take over.

> or if it told us,

Which is hard to do if you don't know it is there.

> or if we observed the results of its actions?

How do you know you are not observing them right now?

> Kind of like everything else?

It is like nothing else.

-- Matt Mahoney, matmahoney@yahoo.com


