Re: answers I'd like from an SI

From: Matt Mahoney (matmahoney@yahoo.com)
Date: Mon Nov 12 2007 - 12:41:56 MST


--- Norman Noman <overturnedchair@gmail.com> wrote:

> On Nov 11, 2007 8:48 PM, Matt Mahoney <matmahoney@yahoo.com> wrote:
> > > What if someone from 17000 BCE asked you how fire works?
> > > Would you say "I know the answer, but I am not able to communicate it to
> > > you"?
> >
> > Not the same. An SI trying to communicate with a human would be like a
> > human trying to communicate with an insect.
>
> That's just stupid. The smarter you are, the better you are at
> explaining anything to anyone. If an explanation exists which a human
> would understand, a superintelligence should be able to find it. It
> should be able to explain general relativity to an eight year old so
> well that their intuitive understanding of the skewing of reference
> frames is better than Stephen Hawking's.

Ability to communicate is limited by the intelligence of both sender and
receiver. We can observe the dancing of bees as they tell others in the hive
where the pollen is. We can also create dancing robot bees that signal the
bees where to fly. But that is about all we can do. Their language is quite
limited. They can't ask where to build a hive that would best ensure their
survival. They don't know the difference between a beekeeper and an
exterminator.
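To put the limit in rough information-theoretic terms (a sketch, treating each party's mental state as a random variable): let X be what the sender means and Y be what the receiver takes away. What actually gets across is at most the mutual information I(X;Y), and

    I(X;Y) <= min(H(X), H(Y))

so the exchange is capped by the entropy of whichever side can represent less. A smarter sender can approach the bee's capacity, but cannot exceed it.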

A superintelligence would know the complete state of your brain. You would
not need to ask it anything; it would already know your questions. If it
chose to, it could answer by reprogramming your brain. But we can know
nothing about its intentions or motives.

-- Matt Mahoney, matmahoney@yahoo.com


