Dynamical system (definition)


The dynamical system concept is a mathematical formalization for any fixed "rule" that describes the time dependence of a point's position in its ambient space. The concept unifies very different types of such "rules" in mathematics: the different choices of how time is measured and the special properties of the ambient space give an idea of the vastness of the class of objects described by this concept. Time can be measured by integers, by real or complex numbers, or by a more general algebraic object that has lost the memory of its physical origin, and the ambient space may be simply a set, without the need for a smooth space-time structure defined on it.

There are two classes of definitions for a dynamical system: one is motivated by ordinary differential equations and is geometrical in flavor; the other is motivated by ergodic theory and is measure-theoretical in flavor. The measure-theoretical definitions assume the existence of a measure-preserving transformation. This appears to exclude dissipative systems, since in a dissipative system a small region of phase space shrinks under time evolution. A simple construction (sometimes called the Krylov–Bogolyubov theorem) shows that it is always possible to construct a measure that makes the evolution rule of the dynamical system a measure-preserving transformation. In the construction, a given measure on the state space is summed over all future points of a trajectory, ensuring invariance.
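
To make the averaging idea concrete, the following is a minimal numerical sketch, assuming Python with NumPy and using the logistic map on [0, 1] as an arbitrary illustrative evolution rule (neither choice comes from this article). The empirical measure obtained by averaging Dirac masses along a long forward trajectory is, up to a boundary term of order 1/n, preserved by one more application of the map; weak-* limit points of such averages are the invariant measures produced by the Krylov–Bogolyubov construction.

import numpy as np

def logistic(x, r=4.0):
    # One step of the evolution rule on the state space [0, 1].
    return r * x * (1.0 - x)

def orbit(x0, n):
    # Forward trajectory x0, Phi(1, x0), ..., Phi(n-1, x0).
    xs = np.empty(n)
    xs[0] = x0
    for k in range(1, n):
        xs[k] = logistic(xs[k - 1])
    return xs

# Empirical measure: the average of Dirac masses at the first n orbit points,
# i.e. the Cesaro average of push-forwards of the initial point mass.
xs = orbit(0.1234, 200_000)
bins = np.linspace(0.0, 1.0, 51)
mu, _ = np.histogram(xs, bins=bins, density=True)                   # averaged measure
mu_pushed, _ = np.histogram(logistic(xs), bins=bins, density=True)  # its image under the map

# An exactly invariant measure would make the two histograms coincide; here
# they differ only by a single-sample boundary term.
print(abs(mu - mu_pushed).max())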

Constructing the natural measure of a dynamical system is hard in general, which makes it difficult to develop ergodic theory starting from differential equations; it therefore becomes convenient to have a dynamical-systems-motivated definition within ergodic theory that sidesteps the choice of measure.

In the most general sense, a dynamical system is a tuple (T, M, Φ) where T is a monoid, written additively, M is a non-empty set, and Φ is a function

Φ : U ⊆ T × M → M

with

I(x) = { t ∈ T : (t, x) ∈ U }

satisfying, for every x in M,

Φ(0, x) = x
Φ(t₂, Φ(t₁, x)) = Φ(t₁ + t₂, x)   for t₁, t₂, t₁ + t₂ ∈ I(x).

The function Φ(t, x) is called the evolution function: it assigns to a state x in M and an admissible time t the state of the system at time t. The set M is called the phase space or state space, and the variable t is the evolution parameter.
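
As a concrete instance of the definition, here is a minimal Python sketch for the discrete-time case, assuming T is the monoid of non-negative integers under addition; the doubling map and the names step and Phi are illustrative choices, not part of the definition. The evolution function Φ(t, x) is the t-fold iterate of a one-step rule, and the two defining properties are checked directly.

def step(x):
    # One-step rule generating the system: the doubling map on M = [0, 1).
    return (2.0 * x) % 1.0

def Phi(t, x):
    # Evolution function Phi : T x M -> M with T = {0, 1, 2, ...}:
    # apply the one-step rule t times.
    for _ in range(t):
        x = step(x)
    return x

x = 0.3
assert Phi(0, x) == x                      # the monoid identity acts as the identity map
assert Phi(2, Phi(3, x)) == Phi(3 + 2, x)  # evolving by t1 and then t2 equals evolving by t1 + t2

In this sketch U is all of T × M, so every state can be evolved for every time; the definition allows U to be a proper subset, which is what the sets I(x) record, for example when a solution exists only for a bounded range of the evolution parameter.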