The generalized entropies

The Kolmogorov entropy [Kolmogorov, 1959] is a measure of the rate at which information about the state of the system is lost in the course of time. It is defined as follows [Schuster, 1988]. Suppose again that the $d$-dimensional phase space is partitioned into boxes of size $r^{d}$. Let $p_{i_{0} \ldots i_{d-1}}$ be the joint probability that $X(t=0)$ is in box $i_{0}$, ..., and $X(t=(d-1) \Delta t)$ is in box $i_{d-1}$. Then
\begin{displaymath}
K = - \lim_{\Delta t \rightarrow 0} \lim_{r \rightarrow 0} \lim_{d \rightarrow \infty}
\frac{1}{d \Delta t} \sum_{i_{0} \ldots i_{d-1}}
p_{i_{0} \ldots i_{d-1}} \ln p_{i_{0} \ldots i_{d-1}}
\end{displaymath} (2.4)

where $\Delta t$ is the time interval between measurements on the state of the system; for maps, $\Delta t = 1$ and the limit $\Delta t \rightarrow 0$ is omitted. The entropy can be used to classify dynamical systems: $K$ is zero for regular motion, infinite for random systems, and finite and positive for chaotic systems, see [Schuster, 1988, p.112]. The definition has been generalized to the spectrum $K_{q}$ [Schuster, 1988,Renyi, 1970]:
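To make the definition concrete, consider the fully chaotic logistic map $x \mapsto 4x(1-x)$ with the binary partition at $x = 1/2$; its Kolmogorov entropy is known to equal $\ln 2$. The following Python sketch (illustrative only; the orbit length, seed, and block sizes are arbitrary choices, not from the text) estimates $K$ as the block-entropy increment $H_{d+1} - H_{d}$, which in the limit agrees with the average $\frac{1}{d} H_{d}$ used above:

```python
import math
from collections import Counter

def block_entropy(symbols, d):
    """Shannon entropy (in nats) of the length-d blocks of a symbol sequence."""
    blocks = [tuple(symbols[i:i + d]) for i in range(len(symbols) - d + 1)]
    n = len(blocks)
    return -sum((c / n) * math.log(c / n) for c in Counter(blocks).values())

# Orbit of the fully chaotic logistic map x -> 4x(1-x)  (illustrative choice)
x, orbit = 0.3, []
for _ in range(100_000):
    x = 4.0 * x * (1.0 - x)
    orbit.append(x)

# The binary partition at x = 1/2 turns the orbit into a symbol sequence
symbols = [0 if xi < 0.5 else 1 for xi in orbit]

# K is approximated by the block-entropy increment H_{d+1} - H_d
for d in (2, 5, 8):
    print(d, round(block_entropy(symbols, d + 1) - block_entropy(symbols, d), 3))
```

The printed increments should be close to $\ln 2 \approx 0.693$, the known entropy of this map.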
\begin{displaymath}
K_q = - \lim_{\Delta t \rightarrow 0} \lim_{r \rightarrow 0} \lim_{d \rightarrow \infty}
\frac{1}{d \Delta t} \frac{1}{q-1}
\ln \sum_{i_{0} \ldots i_{d-1}}
p^q_{i_{0} \ldots i_{d-1}}
\end{displaymath} (2.5)

The function $K_q$ is also monotonically decreasing in $q$. $K_0$ is called the topological entropy, $K_1$ the metric or Kolmogorov-Sinai entropy, and $K_2$ the ``correlation entropy''.
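The monotonicity of $K_q$ in $q$ is inherited from the Renyi entropies of the block probabilities. As a small illustration (the probability vector below is an arbitrary example, not from the text), the following Python sketch evaluates the Renyi entropy $\frac{1}{1-q} \ln \sum_i p_i^q$ for several $q$ and shows that it does not increase with $q$:

```python
import math

def renyi_entropy(p, q):
    """Renyi entropy of order q (in nats) of a probability vector p."""
    if abs(q - 1.0) < 1e-12:           # q = 1: Shannon entropy (the q -> 1 limit)
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return math.log(sum(pi ** q for pi in p)) / (1.0 - q)

p = [0.5, 0.25, 0.125, 0.125]          # arbitrary example distribution
entropies = [renyi_entropy(p, q) for q in (0, 1, 2, 3)]
print([round(h, 4) for h in entropies])  # -> [1.3863, 1.213, 1.0678, 0.9671]
```

The $q = 0$ value is $\ln 4$, the logarithm of the number of occupied boxes, and the sequence decreases as $q$ grows, as expected.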

In this report we will focus on the estimation of $K_2$. A positive $K_2$ is a sufficient condition for chaos, since $K \geq K_2$ [Grassberger and Procaccia, 1983a].
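A common practical route to $K_2$, due to Grassberger and Procaccia, uses correlation sums: since $C_d(r) \sim r^{D_2} e^{-d \Delta t K_2}$ for small $r$ and large $d$, one has $K_2 \Delta t \approx \ln \left( C_d(r)/C_{d+1}(r) \right)$. The Python sketch below applies this to a logistic-map series; the embedding dimension $d$, radius $r$, and series length are arbitrary illustrative choices, and a real estimate would check convergence over a range of $d$ and $r$:

```python
import math

def correlation_sum(series, d, r):
    """C_d(r): fraction of pairs of d-dimensional delay vectors closer than r (max norm)."""
    vecs = [series[i:i + d] for i in range(len(series) - d + 1)]
    n, close = len(vecs), 0
    for i in range(n):
        for j in range(i + 1, n):
            if max(abs(a - b) for a, b in zip(vecs[i], vecs[j])) < r:
                close += 1
    return 2.0 * close / (n * (n - 1))

# Illustrative series: the fully chaotic logistic map (a map, so delta t = 1)
x, series = 0.4, []
for _ in range(800):
    x = 4.0 * x * (1.0 - x)
    series.append(x)

d, r = 2, 0.05                      # arbitrary illustrative choices
c_d = correlation_sum(series, d, r)
c_d1 = correlation_sum(series, d + 1, r)
k2_est = math.log(c_d / c_d1)       # estimate of K_2 (in nats per iteration)
print(round(k2_est, 2))
```

A positive estimate is the signature of chaos that the text refers to; for this map the value should be of the order of $\ln 2$.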
