LOGISTIC MAP: a simple one-dimensional non-linear system provides a useful connection with Information Theory.

The logistic map is a one-dimensional, discrete-time, non-linear system that lends itself to being used as an information source of 'known' entropy (cf. Schuster, "Deterministic Chaos: An Introduction"). There is a great deal in the literature describing the behaviour of the logistic map, so we shall not repeat the details here, only the essence of what is important.

The figure below shows the iterates of the logistic map as a function of the control parameter r. Superposed on this is a plot of the Lyapunov exponent, computed at increments of ∆r = 0.0001.

The significance of the Lyapunov exponent is its characterization of the dynamics. Essentially, a negative value indicates stable periodic solutions, whereas a positive value indicates unpredictable, unstable dynamics. The larger the Lyapunov exponent, the more unpredictable the system. In d-dimensional phase space there are d Lyapunov exponents, one for each degree of freedom. The logistic map is a one-dimensional system, so it has only one exponent.
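For reference, in a one-dimensional map x_{t} = f(x_{t-1}) the exponent is the average logarithmic stretching rate along a trajectory:

  lambda = lim (1/N) sum_{t=1..N} ln |f'(x_t)|    (N -> infinity)

so that e^lambda is the mean factor by which the separation between nearby trajectories grows (positive lambda) or shrinks (negative lambda) per iteration.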

The phase space trajectory of a dynamical system may be partitioned into volume elements, and by suitably labelling each element one may thus symbolically encode the trajectory. By further associating visitation probabilities with each volume element, one may in principle, if not in practice, compute the corresponding Shannon entropy for the system dynamics. The supremum of this entropy over all such partitions is referred to as the Kolmogorov-Sinai (KS) entropy. It is well known that certain coarse-grained partitions of the logistic map state space yield finite-alphabet encodings of the dynamics whose entropy corresponds to the KS-entropy. One such "generating" bi-partition, i.e. one yielding binary encodings of the dynamics, exists at p = 0.5. That is, if at time t the state x_t ≥ p we encode the symbol s_t = '1'; otherwise (x_t < p) we encode s_t = '0'.
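As an aside, the single-symbol (first-order) Shannon entropy of such an encoding is easily estimated from symbol counts; note that this is only an upper bound on the entropy rate, which in general requires block statistics over longer substrings. A minimal sketch, assuming the encoding is an ASCII string of '0' and '1' characters in a file myfile (as produced by the script further below) and that both symbols occur:

N=$(tr -d '\n' < myfile | wc -c)      # total number of symbols
N1=$(tr -cd '1' < myfile | wc -c)     # number of '1' symbols
echo "scale=10; p=$N1/$N; q=1-p; -(p*l(p)+q*l(q))" | bc -l   # first-order entropy in nats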

Pesin proved (1977) that for certain classes of system the KS-entropy is equal to the sum of the positive Lyapunov exponents; this includes one-dimensional maps such as the logistic and tent maps, as well as, for example, the Hénon map. For a one-dimensional system only a single exponent exists, so the identity reduces to KS-entropy = λ (when λ > 0). For the tent map the Lyapunov exponent has been computed in closed form (cf. Schuster). For the logistic map, standard methods can be employed to compute the exponent with good precision from an observed time series. The above graph depicts in green the Lyapunov exponent for the logistic map (courtesy R. Steuer).
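One such standard method: since f'(x) = r(1 - 2x) for the logistic map, the exponent may be estimated directly by averaging ln|r(1 - 2 x_t)| along an iterated trajectory. A minimal sketch using bc (the value r = 3.7, the initial state, and the transient and trajectory lengths are illustrative choices):

# estimate the Lyapunov exponent of the logistic map at r = 3.7
echo "r=3.7; x=0.65; s=0; n=100000
for(i=0;i<1000;i++) x=r*x*(1-x)
for(i=0;i<n;i++) { x=r*x*(1-x); d=r*(1-2*x); if(d<0) d=-d; s+=l(d) }
s/n" | bc -l

Here l() is bc's natural logarithm, the first loop discards a transient, and the result is in nats.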

Whereas evaluating the Shannon entropy of a string of symbols ordinarily requires a priori knowledge of the source distribution, for the binary encodings of the logistic map dynamics one may, according to Pesin's identity, use the positive Lyapunov exponent to obtain the entropy of the source. Thus the Shannon entropy of the binary strings may be inferred indirectly.

On the other hand, Deterministic Information Theory allows us also to compute from the string encodings a quantity which we refer to as the T-entropy. As illustrated in the following graph, the T-entropy closely tracks the positive Lyapunov exponent over the whole range of possible entropies, 0 to ln(2) nats (i.e. normalised entropies of 0 to 1 bit/bit). In contrast to the Shannon entropy, the T-entropy is computed directly from a finite-length sample string, without reference to the source symbol probabilities. By using the logistic map to 'calibrate' this alternative measure, we may apply it to the broader range of applications where information measurement is desirable.

In essence, we may deduce from a string the entropy of an equivalent Shannon source for which the string might be considered typical. Moreover, as we shall subsequently demonstrate, we may further confirm stationarity (or otherwise) of the source without resorting to statistical techniques. There are a number of important aspects to applying the measure generally, which we consider here.

The graph below depicts, for the logistic map: i) the Lyapunov exponent (blue), and ii) T-entropy values (magenta), computed from sample strings of 100,000 bits each, encoded from the generating bi-partition described earlier. The information rate (entropy = positive Lyapunov exponent) of the logistic map varies from 0 through to the maximum possible, ln(2) (y-axis), as a function of the single control parameter r, ranging from 3.5 to 4 (x-axis). The relationship between the entropy and r is in fact complex and is illustrated here in simplified form, for values of r at intervals of 0.001. In the absence of a formal proof, the graph confirms implicitly that the T-entropy is essentially measuring a quantity that corresponds closely to the Shannon entropy of a source for which the sample string may be considered typical.

Lyapunov exponent (blue) and T-entropy (magenta) computed from 100,000 bit long string encodings, using bi-partition (c = 0.5).

The next graph plots the T-entropy as a function of the state space partition parameter c. This essentially confirms that the entropy is maximal, irrespective of the value of r, for the partition equal to the critical value c = 0.5. (A variant of the generating script, with c as an argument, is sketched after the script listing below.)

A shell script for generating binary encodings of the logistic map trajectory

Copy the following into a file called "lgst", say, and make the file executable by typing: chmod +x lgst

This simple script doesn't check the validity of your arguments!
Typing the command
./lgst 3.7 0.65 10000 > myfile

will generate a string of 10000 bits into the file myfile. The script is not fast, but it has the advantage that the scale (resolution) of the computation may be set by the user. The string's Shannon entropy will correspond to the positive Lyapunov exponent at r = 3.7.
(You may use tcalc to compute its T-entropy. To interpret the results of this, see using tcalc.)
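To reproduce parameter sweeps like the graphs above, the script may simply be looped over r; a minimal sketch (assumes GNU seq for the fractional step):

for R in $(seq 3.5 0.001 4.0); do
    ./lgst $R 0.65 100000 > lgst$R    # one 100,000-bit encoding per value of r
done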

#!/bin/sh
# The logistic map is a one-dimensional discrete-time non-linear dynamical system.
# The state of the system x, at time t, is given by the recurrence relation:
#   x_{t} = r x_{t-1} (1 - x_{t-1})
RVAR=$1     # biotic potential, r, should be in the range (1,4)
IVAR=$2     # initial state, should be in the interval (0,1), e.g. 0.65
LVAR=$3     # number of bits to generate
# scale=30 sets the working precision (resolution); each iterate prints one bit:
# bc prints the value of the comparison x>.5, i.e. 1 if x > 0.5, else 0
echo "scale=30; r=$RVAR; x=$IVAR; for(i=0;i<$LVAR;i++) {x = r*x*(1-x); x>.5}" | bc | tr -d "\n"

LOGISTIC MAP AS AN INFORMATION SOURCE:
A standard corpus of files for testing purposes.
The first set of ten+ten files is derived from the logistic map, and the files are prefixed with "lgst" accordingly. The digits, e.g. 3.573550, correspond to the system parameter r used to generate the binary string from the generating bi-partition, p = 0.5. You may use the above script to generate your own strings using the given values of r.

Raw ASCII-coded (gzipped) binary files
(each file contains 2,000,000 bits):
1. lgst3.573550.gz    H =  0.1  (.0996) 
2. lgst3.586787.gz    H =  0.2  (.1997) 
3. lgst3.611055.gz    H =  0.3  (.2972) 
4. lgst3.651050.gz    H =  0.4  (.3992) 
5. lgst3.687660.gz    H =  0.5  (.5032) 
6. lgst3.766200.gz    H =  0.6  (.5982) 
7. lgst3.907580.gz    H =  0.7  (.6982) 
8. lgst3.925405.gz    H =  0.8  (.7999) 
9. lgst3.971029.gz    H =  0.9  (.8942) 
10. lgst4.000000.gz    H =  1.0  (1.000) 
 
Packed (gzipped) files
(8 bits/byte, i.e. 250,000 bytes; a packing sketch follows the list):

1. lgst3.573550p.gz    H =  0.1  (.0943) 
2. lgst3.586787p.gz    H =  0.2  (.1882) 
3. lgst3.611055p.gz    H =  0.3  (.2971) 
4. lgst3.651050p.gz    H =  0.4  (.3855) 
5. lgst3.687660p.gz    H =  0.5  (.5010) 
6. lgst3.766200p.gz    H =  0.6  (.5885) 
7. lgst3.907580p.gz    H =  0.7  (.6799) 
8. lgst3.925405p.gz    H =  0.8  (.7769) 
9. lgst3.971029p.gz    H =  0.9  (.8550) 
10. lgst4.000000p.gz    H = 1.0  (1.000) 
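The packed files store the same bit strings at 8 bits per byte. A minimal sketch of such a packing, assuming Perl and a most-significant-bit-first order within each byte (the bit order actually used for the corpus files is not stated here):

# pack an ASCII '0'/'1' string into raw bytes, 8 bits/byte, MSB first
perl -ne 'chomp; print pack("B*", $_)' lgst3.573550 > lgst3.573550p

For 2,000,000 bits this yields the 250,000-byte packed form listed above.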

PAPERS TO READ...
"Compressor performance, absolutely!" Titchener 2002  postscript, PDF 
(This paper uses the above corpora to effectively analyse the absolute performance of  a number of popular compressors)
"Partition based entropies of  Deterministic and Stochastic maps" 
    Ebeling, Steuer, Titchener, 2001  postscript, PDF
"Deterministic Chaos and Information Theory", 
       Titchener  ebeling 2001 postscript, PDF
"A measure of information", Titchener 1999   postscript, PDF
"Towards a calibrated Corpus for Compression testing" 
    Titchener, Fenwick, Chen, 1998  postscript, PDF 
 
Download links:
http://tcode.tcs.auckland.ac.nz/~corpus/lgst3.573550.gz
http://tcode.tcs.auckland.ac.nz/~corpus/lgst3.586787.gz
http://tcode.tcs.auckland.ac.nz/~corpus/lgst3.611055.gz
http://tcode.tcs.auckland.ac.nz/~corpus/lgst3.65105.gz
http://tcode.tcs.auckland.ac.nz/~corpus/lgst3.687660.gz
http://tcode.tcs.auckland.ac.nz/~corpus/lgst3.766200.gz
http://tcode.tcs.auckland.ac.nz/~corpus/lgst3.907580.gz
http://tcode.tcs.auckland.ac.nz/~corpus/lgst3.925405.gz
http://tcode.tcs.auckland.ac.nz/~corpus/lgst3.971029.gz
http://tcode.tcs.auckland.ac.nz/~corpus/lgst4.000000.gz
http://tcode.tcs.auckland.ac.nz/~corpus/lgst3.573550p.gz
http://tcode.tcs.auckland.ac.nz/~corpus/lgst3.586787p.gz
http://tcode.tcs.auckland.ac.nz/~corpus/lgst3.611055p.gz
http://tcode.tcs.auckland.ac.nz/~corpus/lgst3.651050p.gz
http://tcode.tcs.auckland.ac.nz/~corpus/lgst3.687660p.gz
http://tcode.tcs.auckland.ac.nz/~corpus/lgst3.766200p.gz
http://tcode.tcs.auckland.ac.nz/~corpus/lgst3.907580p.gz
http://tcode.tcs.auckland.ac.nz/~corpus/lgst3.925405p.gz
http://tcode.tcs.auckland.ac.nz/~corpus/lgst3.971029p.gz
http://tcode.tcs.auckland.ac.nz/~corpus/lgst4.000000p.gz