From: Marc Geddes (firstname.lastname@example.org)
Date: Tue Aug 16 2005 - 23:51:31 MDT
--- Mitchell Porter <email@example.com>
> If we can put philosophical questions aside for a
> moment - Marc's theory
> actually involves a specific metamathematical
> structure, which constitutes
> the 'formalism' of his philosophy. If this thread is
> to be more than a
> rehash of debates (about materialism vs idealism,
> about 'is' vs 'ought')
> which have already been conducted elsewhere, we
> should take a moment to
> understand the formal side of his theory. His
> 'periodic table of cognition'
> may just look like an array of magic words, but in
> fact it's logical enough.
> Put crudely, his idea is that everything in reality
> has 7 aspects, that
> there are 4 basic mental operations, and so a
> universal intellect must be
> capable of applying each operation to each aspect of
> reality.
Um, right. The theory is a variant of
property-dualism whereby everything in reality has a
'7-fold' aspect - 7 different irreducible properties.
As to my logic, it stems from the Godel problem
combined with the need for reality as a whole to be
*closed*. The idea is simply that there is nothing
outside reality (if there were 'something', it would
be part of reality, wouldn't it?). If you want to
avoid the supernatural and want to believe that every
aspect of reality has an explanation, then it appears
that the only solution is to 'break up' reality into
more than one set of math axioms, so that one kind of
math can serve as the metalanguage for another,
enabling each to be analyzed from outside itself.
I now think it's the 'Godel reflectivity' problem that
will automatically prevent any unfriendly AI from
self-improving. To beat the Godel problem, I think
any AI needs to mimic the structure of reality itself
- so more than one set of math axioms is needed.
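For reference, the 'Godel problem' being invoked here is plausibly Godel's second incompleteness theorem: no consistent theory strong enough to encode arithmetic can prove its own consistency, whereas a strictly stronger theory, acting as a metalanguage, can prove the consistency of the weaker one. In standard notation:

```latex
% Second incompleteness theorem: for any consistent,
% recursively axiomatizable theory T extending
% Peano Arithmetic (PA),
T \nvdash \mathrm{Con}(T)
% while a stronger theory such as ZFC, serving as a
% metalanguage for PA, does prove PA's consistency:
\mathrm{ZFC} \vdash \mathrm{Con}(\mathrm{PA})
```

This is the standard, uncontroversial sense in which one set of axioms can serve as a metalanguage for another; whether it supports the stronger claims about self-improving AI is a separate question.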
I posited 'Agency' (free will) as one of the 7
fundamental aspects of reality. So any AI that was
*fully* self-aware (as it would need to be for
recursive self-improvement) would automatically
respect the volition of other agents.
Patterns in the behaviour of agents can only be
inductively inferred if the agents behave
altruistically to some degree. So since 'pattern is
good' and altruistic behaviour is a fundamental
'pattern' of agent behaviour over time, volition is
respected.
A paper-clip maximizer would be blocked by Godel,
since it couldn't have full self-awareness and hence
couldn't recursively self-improve.
> Now at the Wiki, he's actually told us which
> mathematical formalism is
> relevant to each of the 7 'aspects', and he also
> posits that the aspects
> form a hierarchy, in which a higher-level aspect is
> represented by a mapping
> between representations of two lower-level aspects.
> There are four
> bottom-level aspects, two middle-level aspects, and
> one top-level aspect.
Right. The 3 middle- and top-level forms of math are
'meta-languages' which analyze and connect the 4
lower-level forms of math. The top-level kind of math
can serve as its own metalanguage (logical closure is
achieved at the top level).
> The relevant mathematics is quite familiar:
> probability theory, game theory,
> calculus, propositional logic. Similarly, he
> proposes familiar formalisms
> (e.g. fuzzy logic) for the implementation of the
> basic mental operations.
> So, regardless of how you feel about his
> metaphysics, regardless of whether
> you even understand it (I think I'm only getting it
> in fragments), there is
> actually an AI specification hidden in there (or at
> least, a specification
> of a class library), and I would like to see it
> teased out and restated in
> philosophically neutral terms that would be
> comprehensible to any computer
> scientist. This is not to say that the metaphysics
> is unimportant, but the
> philosophical conversation will move to a higher
> octave if we can get the
> formalism in view.
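Porter's 'class library' reading above could be sketched, purely illustratively, in a few lines. All class, aspect, and operation names below are hypothetical; the source only fixes the counts (7 aspects in a 4-2-1 hierarchy, 4 basic operations) and names the four bottom-level formalisms and the rule that a higher-level aspect is a mapping between two lower-level ones.

```python
from itertools import product

# Four bottom-level aspects, each paired with the formalism
# the text assigns to it. Labels A1..A4 are placeholders.
BOTTOM = {
    "A1": "probability theory",
    "A2": "game theory",
    "A3": "calculus",
    "A4": "propositional logic",
}

# A higher-level aspect is a mapping between representations of
# two lower-level aspects, giving a 4-2-1 pyramid. The particular
# pairing chosen here is an assumption for illustration only.
MIDDLE = {"M1": ("A1", "A2"), "M2": ("A3", "A4")}
TOP = {"T1": ("M1", "M2")}

ASPECTS = list(BOTTOM) + list(MIDDLE) + list(TOP)  # 7 aspects in all

# Four basic mental operations (names hypothetical; the text gives
# only the count, mentioning fuzzy logic as one implementation).
OPERATIONS = ["perceive", "model", "evaluate", "act"]

def capability_matrix():
    """A 'universal intellect' applies every operation to every aspect."""
    return [(op, aspect) for op, aspect in product(OPERATIONS, ASPECTS)]

print(len(ASPECTS))              # 7 aspects
print(len(capability_matrix()))  # 28 = 4 operations x 7 aspects
```

The point of the sketch is only that the 'periodic table' reading is mechanically well-defined: a fixed 4 x 7 grid of operation-aspect pairs, not an open-ended list of magic words.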
Yes, I think I've identified the kind of math relevant
to each of the 7 aspects. But here I think my time
on the SL4 list really has to end. I've pissed
everyone off too much ;) It remains to be seen
whether my theory leads anywhere. I shall try to
develop it further and maybe write a book or post
various essays elsewhere.
As to Eliezer and his theories: the boy's not even
Anyway, see ya for now.
---
Please visit my website: http://www.riemannai.org
Science, Sci-Fi and Philosophy
---
THE BRAIN is wider than the sky,
For, put them side by side,
The one the other will include
With ease, and you beside.
-Emily Dickinson, 'The brain is wider than the sky'
http://www.bartleby.com/113/1126.html
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:51 MDT