The logistic map is a one-dimensional, discrete-time, non-linear system that lends itself to use as an information source of known entropy (cf. Schuster, "Deterministic Chaos: An Introduction"). There is a great deal in the literature describing the behaviour of the logistic map, so we shall not repeat all the details here; just some.

The following figure shows the iterates of the logistic map as a function of the control parameter r (see also the T-entropy views at the bottom of the page). Superposed on this is a plot of the Lyapunov exponent, computed at increments of 0.0001. The significance of the Lyapunov exponent is its characterisation of the dynamics: a negative value indicates stable periodic solutions to the recurrence relation, whereas a positive value indicates unpredictable, unstable dynamics. The larger the Lyapunov exponent, the more unpredictable the system. In a d-dimensional phase space there are d exponents, one for each degree of freedom.
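
This sign test is easy to reproduce numerically. A minimal Python sketch (the function name and parameter choices are ours) estimates the exponent as the orbit average of ln|f'(x_t)| = ln|r(1 - 2 x_t)|:

```python
import math

def lyapunov(r, x0=0.65, n=100_000, burn=1_000):
    """Estimate the Lyapunov exponent of the logistic map x -> r*x*(1-x)
    by averaging ln|f'(x_t)| = ln|r*(1 - 2*x_t)| along an orbit."""
    x = x0
    for _ in range(burn):              # discard the transient
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(n):
        x = r * x * (1 - x)
        # guard against log(0) should x land exactly on 0.5
        total += math.log(max(abs(r * (1 - 2 * x)), 1e-300))
    return total / n

print(lyapunov(3.2))   # < 0: stable period-2 orbit
print(lyapunov(4.0))   # ≈ ln 2 ≈ 0.693: fully developed chaos
```

A negative result at r = 3.2 and a result near ln 2 at r = 4 reproduce the behaviour visible in the figure.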

Kolmogorov recognised (1958) that the phase space trajectory of a dynamical system could be partitioned into volume elements, and that by suitably labelling each element one could symbolically encode the trajectory. By further associating visitation probabilities with each volume element, he proposed that one could, in principle if not in practice, compute the corresponding Shannon entropy for the system dynamics. The supremum of this Shannon entropy over all partitions of the state space is referred to as the Kolmogorov-Sinai (KS) entropy.
It is well known that certain coarse-grained partitions of the logistic map state space yield finite alphabetic encodings of the dynamics whose entropy corresponds to the KS-entropy. One such "generating" bi-partition, i.e. one yielding binary encodings of the dynamics, exists at p = 0.5. That is, if at time t the state x_{t} >= p we encode the symbol s_{t} = '1', else (x_{t} < p) we encode s_{t} = '0'.
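
The encoding rule is straightforward to implement; a minimal Python sketch (function name and defaults are illustrative only):

```python
def encode_orbit(r, x0, n, p=0.5):
    """Symbolically encode a logistic-map orbit about the partition point p:
    emit '1' when x_t >= p, and '0' otherwise."""
    bits = []
    x = x0
    for _ in range(n):
        x = r * x * (1 - x)            # logistic recurrence
        bits.append('1' if x >= p else '0')
    return ''.join(bits)

print(encode_orbit(3.7, 0.65, 20))
```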

Pesin proved (1977) that for certain classes of system, including the Hénon map and one-dimensional maps like the logistic and tent maps, the KS-entropy is equal to the sum of the positive Lyapunov exponents. For a one-dimensional system there is only a single exponent. For the tent map the Lyapunov exponent has been computed in closed form (cf. Schuster). For the logistic map, standard methods can be employed to compute the exponent with good precision from the observed time series. The above graph depicts in green the Lyapunov exponent for the logistic map (courtesy R. Steuer).
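
The tent map case is worth sketching, since it can be carried through in closed form (assuming the standard tent map of slope 2):

```latex
% Tent map on [0,1]:  f(x) = 1 - 2\,|x - \tfrac{1}{2}|, so |f'(x)| = 2 a.e.
% Its invariant density is uniform, hence the Lyapunov exponent is
\lambda \;=\; \int_0^1 \ln|f'(x)|\,dx \;=\; \ln 2 .
% By Pesin's identity the KS-entropy equals the single positive exponent:
h_{KS} \;=\; \lambda \;=\; \ln 2 \ \text{nats} \;=\; 1 \ \text{bit per symbol}.
```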

Whereas the evaluation of the Shannon entropy from a string of symbols ordinarily requires a priori knowledge of the source distribution, for the binary encodings of the logistic map dynamics one may, according to Pesin's identity, use the positive Lyapunov exponent to give the entropy of the source. Thus the Shannon entropy of the binary strings may be inferred indirectly by way of Pesin's identity.
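
As a sanity check at a point where the answer is known: at r = 4 the positive Lyapunov exponent is ln 2 nats, i.e. 1 bit/symbol, so the binary encoding should look like a fair coin sequence. A rough Python sketch (helper names are ours; the naive plug-in block estimator is only a crude stand-in for a proper entropy estimate):

```python
import math
from collections import Counter

def logistic_bits(r, x0, n, p=0.5):
    """Binary encoding of a logistic-map orbit about the bi-partition p."""
    out = []
    x = x0
    for _ in range(n):
        x = r * x * (1 - x)
        out.append('1' if x >= p else '0')
    return ''.join(out)

def block_entropy_rate(s, k=8):
    """Naive k-block Shannon entropy estimate, in bits per symbol."""
    counts = Counter(s[i:i + k] for i in range(len(s) - k + 1))
    total = sum(counts.values())
    h = -sum(c / total * math.log2(c / total) for c in counts.values())
    return h / k

s = logistic_bits(4.0, 0.65, 200_000)
print(block_entropy_rate(s))   # ≈ 1.0 bit/symbol, matching λ = ln 2 nats
```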

On the other hand, Deterministic Information Theory (Deterministic-IT) allows us to also compute from the string encodings a quantity which we refer to as the T-entropy. As illustrated in the following graph, the T-entropy closely tracks the positive Lyapunov exponent over the whole range of possible entropies, 0 - ln(2) nats (i.e. corresponding to normalised entropies of 0 - 1 bits/bit). In contrast to Shannon's entropy, the T-entropy is computed directly from a finite-length sample string, without reference to the source symbol probabilities. By using the logistic map to 'calibrate' this alternative measure, we may apply it to the broader range of applications where information measurement is desirable.

In essence, we may deduce from a string the equivalent Shannon source entropy for which the string might be considered typical. Moreover, as we shall subsequently demonstrate, we may further confirm stationarity (or otherwise) of the source without resorting to statistical techniques. There are a number of important aspects to applying the measure generally, which we consider here.

The graph below depicts, for the logistic map: i) the Lyapunov exponent (green), and ii) T-entropy values (red) computed from sample strings of 10 Mbits each, encoded from the generating bi-partition described earlier. The information rate (entropy = positive Lyapunov exponent) of the logistic map varies from 0 through to the maximum possible 1 (normalised entropy scale, y-axis) as a function of the single control parameter r, ranging from 3.5 - 4 (x-axis). The relationship between the entropy and r is in fact complex, and is illustrated here in simplified form for values of r at intervals of 0.001. In the absence of a formal proof, the graph confirms implicitly that the T-entropy essentially measures a quantity corresponding closely to the Shannon entropy of the source for which the sample string may be considered typical, and it does this across the whole range.

Lyapunov exponent (green) and T-entropy (red), computed from 10 Mbit long string encodings about the bi-partition (c = 0.5).

This graph was computed from over 4000 10-Mbit files in a matter of hours! (Not using "tcalc", which runs in O(n^2) time, but a new implementation, "qtcalc", by Yang Jia, which runs in O(n).)

The next graph plots the T-entropy as a function of the state space partition parameter. This essentially confirms that the entropy is maximal, irrespective of the value of r, for the partition equal to the critical value c = 0.5.
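
Since no T-entropy implementation is reproduced on this page, the same qualitative behaviour can be illustrated with a plain Shannon block-entropy estimate as a stand-in: for r = 4 the estimated entropy of the encoding peaks at the critical partition c = 0.5. A sketch (all names and defaults are illustrative):

```python
import math
from collections import Counter

def partition_entropy(r, c, x0=0.65, n=100_000, k=4):
    """k-block Shannon entropy (bits/symbol) of a logistic-map orbit
    encoded about the partition point c."""
    x = x0
    bits = []
    for _ in range(n):
        x = r * x * (1 - x)
        bits.append('1' if x >= c else '0')
    s = ''.join(bits)
    counts = Counter(s[i:i + k] for i in range(len(s) - k + 1))
    total = sum(counts.values())
    return -sum(v / total * math.log2(v / total) for v in counts.values()) / k

# The estimate should be largest near the critical partition c = 0.5.
for c in (0.2, 0.35, 0.5, 0.65, 0.8):
    print(c, partition_entropy(4.0, c))
```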

A shell script for generating binary encodings of the logistic map trajectory

Copy the following into a file called "lgst", say. Make the file executable by typing: chmod +x lgst

This simple script doesn't check the validity of your arguments! Typing the command

./lgst 3.7 0.65 10000 > myfile

will generate a string of 10000 bits into the file myfile. The script is not fast, but has the advantage that the scale (resolution) of the computation may be set by the user. The string's Shannon entropy will correspond to the positive Lyapunov exponent at r = 3.7.

(You may use tcalc to compute its T-entropy. To interpret the results of this, see using tcalc.)

#!/bin/sh
# the logistic map is a one-dimensional discrete-time non-linear dynamical system
# the state of the system x, at time t, is given by the recurrence relation:
# x_{t} = r x_{t-1} (1 - x_{t-1})
RVAR=$1     # biotic potential, r, should be in range (1,4)
INITVAR=$2  # initial state, should be in interval (0,1), e.g. 0.65
LENGVAR=$3  # number of bits to generate
echo "scale=30; r=$RVAR; x=$INITVAR; for(i=0;i< $LENGVAR;i++) {x = r*x*(1-x); x>.5}" | bc | tr -d "\n"

Logistic map as an information source: A standard corpus of files for testing purposes.

A set of 50 test files derived from the logistic map is available for test purposes. Note that these are gzipped, packed, ASCII-coded binary files, each comprising 10,000,000 bits.

The naming of the files is: FILE#_r-value_T-entropy (the Shannon-normalised value of the entropy)

A further set of 100 files is provided here in gzipped and packed form. The naming of these files is: FILE#_r-value_T-entropy (Shannon-normalised value of the entropy).pk.gz

After gunzipping the files, you will need to further unpack the 8 bits per byte to recover the individual bits of the 10,000,000-bit strings.
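
A minimal Python sketch of the unpacking step (the bit order within each byte is an assumption on our part; verify against a known file and flip it if the results look wrong):

```python
def unpack_bits(data: bytes, msb_first: bool = True) -> str:
    """Expand 8-bits-per-byte packed data into a string of '0'/'1' characters.
    The most-significant-bit-first ordering is an assumption, not documented
    by the corpus itself."""
    out = []
    for byte in data:
        bits = format(byte, '08b')      # e.g. 0xa5 -> '10100101'
        out.append(bits if msb_first else bits[::-1])
    return ''.join(out)

# Usage (hypothetical file name following the corpus naming scheme):
#   import gzip
#   with gzip.open('FILE01_3.7_0.62.pk.gz', 'rb') as f:
#       bits = unpack_bits(f.read())
print(unpack_bits(b'\xa5'))   # -> 10100101
```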

Note that the information content of the packed string is very slightly different from that of the unpacked string, something that needs to be taken into account if you are going to treat these as byte-character files. Because of period doubling, the logistic map ends up with cycles that may reduce the byte-character alphabet to much less than the 256 possible characters that would ordinarily occur in a variety of strings.

Papers to read...

Relevant papers are available here:

"Compressor performance, absolutely!", Titchener, 2002. postscript, PDF
(This paper uses the above corpora to analyse the absolute performance of a number of popular compressors.)

"Partition based entropies of Deterministic and Stochastic maps", Ebeling, Steuer, Titchener, 2001. postscript, PDF

"Deterministic Chaos and Information Theory", Titchener, Ebeling, 2001. postscript, PDF

"A measure of information", Titchener, 1999. postscript, PDF

"Towards a calibrated Corpus for Compression testing", Titchener, Fenwick, Chen, 1998. postscript, PDF

Gallery of entropy pics.

The following images are of entropy surfaces computed for the logistic map, using coarse graining to symbolically encode sampled time series, from which the T-entropy of the data is computed as a function of the bi-partition and the control parameter r. The graphics are generated using a proprietary virtual-3D package written in C, based on OpenGL calls.

View from above of the entropy surface computed for the logistic map.