From: William Pearson (firstname.lastname@example.org)
Date: Sun May 04 2008 - 01:46:08 MDT
2008/5/2 Matt Mahoney <email@example.com>:
> --- William Pearson <firstname.lastname@example.org> wrote:
> > 2008/5/1 Matt Mahoney <email@example.com>:
> > > If two symbiotic agents with unequally sized saturated memories
> > > communicate, then both agents must change state at the same rate,
> > > as measured by conditional algorithmic complexity.
> > If and only if they send each other no redundant or useless
> > information, which is unlikely even between subsystems of the same
> > agent.
> I accounted for both in my analysis. In general, a message will be
> longer than the amount of state change in the receiver. We may regard
> any information from sender A that is ignored by the receiver B as
> having not been communicated. If the remainder is x, then in general
> |x| >= K(x) >= K(x|B(t1)) = K(B(t2)|B(t1)), since x may be more
> redundant to B than any intrinsic redundancy in x itself.
> The most efficient distribution of information in A and B is the one
> that minimizes the redundancy due to shared knowledge,
> K(A)+K(B)-K(A,B). If A and B cooperate then they can achieve this by
> communicating x whenever B can store it more efficiently than A, i.e.
> K(x|B(t1)) < K(x|A(t2)).
If you have *no* shared knowledge, communication must be very verbose,
as you cannot exploit that knowledge to compress messages. What counts
as "most efficient" depends upon what you are trying to achieve.
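A rough sketch of the effect in Python, using zlib's preset-dictionary
feature as a stand-in for shared knowledge (compressed length is only a
computable upper bound on the uncomputable K, and the `shared` and `x`
byte strings here are made-up examples, not anything from the analysis
above):

```python
import zlib

# 'shared' plays the role of B(t1), the receiver's prior knowledge;
# 'x' is a message that overlaps heavily with that knowledge.
shared = b"the quick brown fox jumps over the lazy dog. " * 20
x = b"the quick brown fox jumps over the lazy dog. " * 5

def compressed_size(data: bytes, zdict: bytes = b"") -> int:
    # zdict preloads the DEFLATE window, so matches can reference
    # the shared context instead of re-sending it as literals.
    c = zlib.compressobj(9, zlib.DEFLATED, 15, 9,
                         zlib.Z_DEFAULT_STRATEGY, zdict=zdict)
    return len(c.compress(data) + c.flush())

no_context = compressed_size(x)            # crude proxy for K(x)
with_context = compressed_size(x, shared)  # crude proxy for K(x|B(t1))

print(len(x), no_context, with_context)
assert with_context < no_context <= len(x)
```

With the shared dictionary the message shrinks further, mirroring the
chain |x| >= K(x) >= K(x|B(t1)); with no shared context the sender must
pay for everything the receiver does not already know.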
Other reasons to share information are backup and caching for speed.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:02 MDT