Tuesday, August 05, 2008

Complexity-Entropy Diagrams

While I was in China, my collaborators and I finally finished a paper, submitted it to the journal Chaos, and posted it to the arXiv. We started this paper in 2002, and there's no good reason it took this long to finish. Some of the numerics were a little tricky, and it took some thought to figure out the best way to present some of the results, but mostly other responsibilities kept getting in the way of finishing the manuscript.

But now it's done, and I'm quite happy with it. In the "micro-field" of measures of physical complexity, there has been much discussion of how complexity and entropy might be related. In this paper, rather than discuss things in abstract terms, we took one well-understood complexity measure, the excess entropy, and calculated it for a wide range of model systems. (The excess entropy is also known as the effective measure complexity and the predictive information.) The product of these calculations is a set of results showing the range of complexity vs. entropy behaviors that are possible. I hope that this paper puts to rest claims of a universal complexity-entropy curve and the notion that complexity must always be sharply maximized at intermediate entropy. (Actually, I know that this paper won't even come close to accomplishing this. But it nevertheless remains my hope.)
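
For readers who haven't met the quantity before: in one standard form, the excess entropy is E = lim_{L→∞} [H(L) - h_μ L], where H(L) is the Shannon entropy of length-L blocks of the process and h_μ is the entropy rate. Equivalently, E = Σ_{L≥1} [h_μ(L) - h_μ], where h_μ(L) = H(L) - H(L-1) is the length-L estimate of the entropy rate; the excess entropy measures how slowly those estimates converge to the true rate.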

More generally, I think this paper implicitly suggests that there are complexity measures that are already well known and well understood, and so researchers should stop devising new complexity measures unless there is some compelling reason to do so. (Most of the "new complexity measure" papers I've seen recently don't offer reasons I find compelling.) I also hope that this paper prompts others to carry out similar calculations for the excess entropy or other complexity measures. The excess entropy isn't that difficult to calculate, and it is straightforward to interpret.
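
To make that last claim concrete, here is a minimal sketch (in Python, and not the code used in the paper) of how one might estimate the excess entropy of a finite symbolic sequence from block entropies. The example process, block lengths, and function names are illustrative choices of mine, and the entropy rate is crudely approximated by the longest-block estimate, so treat the output as a rough guide rather than a careful calculation.

import math
import random
from collections import Counter

def block_entropy(seq, L):
    # Shannon entropy (in bits) of the empirical distribution of length-L blocks.
    counts = Counter(tuple(seq[i:i + L]) for i in range(len(seq) - L + 1))
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def excess_entropy_estimate(seq, max_L=8):
    # E = sum over L >= 1 of [h(L) - h], where h(L) = H(L) - H(L-1) and the
    # entropy rate h is approximated here by h(max_L).
    H = [0.0] + [block_entropy(seq, L) for L in range(1, max_L + 1)]
    h = [H[L] - H[L - 1] for L in range(1, max_L + 1)]  # length-L entropy-rate estimates
    h_rate = h[-1]  # crude proxy for the true entropy rate
    return sum(hL - h_rate for hL in h)

# Illustrative data: a "sticky" two-state Markov chain.
random.seed(0)
seq, state, p_stay = [], 0, 0.9
for _ in range(200000):
    seq.append(state)
    state = state if random.random() < p_stay else 1 - state

print("estimated excess entropy:", excess_entropy_estimate(seq))

For a short-memory process like this one the block-entropy estimates converge quickly; for processes with long-range structure one needs longer blocks and far more data, which is where the numerics mentioned above get tricky.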

In any event, the abstract and a link to the full paper on the arXiv follow:

David P. Feldman, Carl S. McTague, James P. Crutchfield, The Organization of Intrinsic Computation: Complexity-Entropy Diagrams and the Diversity of Natural Information Processing.
Intrinsic computation refers to how dynamical systems store, structure, and transform historical and spatial information. By graphing a measure of structural complexity against a measure of randomness, complexity-entropy diagrams display the range and different kinds of intrinsic computation across an entire class of system. Here, we use complexity-entropy diagrams to analyze intrinsic computation in a broad array of deterministic nonlinear and linear stochastic processes, including maps of the interval, cellular automata and Ising spin systems in one and two dimensions, Markov chains, and probabilistic minimal finite-state machines. Since complexity-entropy diagrams are a function only of observed configurations, they can be used to compare systems without reference to system coordinates or parameters. It has been known for some time that in special cases complexity-entropy diagrams reveal that high degrees of information processing are associated with phase transitions in the underlying process space, the so-called "edge of chaos". Generally, though, complexity-entropy diagrams differ substantially in character, demonstrating a genuine diversity of distinct kinds of intrinsic computation.
