Stability of Linear Control Systems
1.1 Definitions of Stability
In the Web Dictionary of Cybernetics and Systems (by F. Heylighen), the term ‘stability’ of a control system is defined in several ways:
1. Stability: The tendency of the variables or components of a system to remain within defined
and recognizable limits despite the impact of disturbances. (Young, p. 109)
2. Expanded or global stability: The ability of a system to persist and to remain qualitatively
unchanged in response either to a disturbance or to fluctuations of the system caused by a
disturbance. This idea of stability combines the concepts of traditional stability and Holling's
new concept of resilience. (Holling)
3. Stability: The capacity of an object or system to return to equilibrium after having been
displaced. Note that with two possible kinds of equilibrium, one may have a static (linear) stability
of rest or a dynamic (non-linear) stability of an endlessly repeated motion. (Iberall)
4. Stability: A system is stable if, when perturbed, it returns to its original state. The more
quickly it returns, the more stable it is.
1.2 Concepts of Stability
Stability is probably the most important consideration when designing control systems. An
essential characteristic of a control system is that the output must follow the
desired signal as closely as possible. The stability of a system is determined by the form of
its response to any input or disturbance. Absolute stability refers to whether a system is stable
the response to any input or disturbance. Absolute stability refers to whether a system is stable
or unstable. In a stable system the response to an input will settle at and maintain some useful
value. In an unstable system the output will not settle at the desired value; it may oscillate or
grow until it reaches some physical limitation. For example, a system is stable if
its response to an impulse input approaches zero as time approaches infinity.
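As a concrete illustration, here is a minimal sketch of that impulse-response test, assuming Python with NumPy and SciPy (the particular first-order transfer functions are arbitrary choices for illustration, not anything prescribed by the text):

```python
import numpy as np
from scipy import signal

# Stable first-order system: G(s) = 1 / (s + 2), pole at s = -2
stable = signal.TransferFunction([1], [1, 2])

# Unstable first-order system: G(s) = 1 / (s - 2), pole at s = +2
unstable = signal.TransferFunction([1], [1, -2])

t = np.linspace(0, 5, 200)
_, y_stable = signal.impulse(stable, T=t)
_, y_unstable = signal.impulse(unstable, T=t)

# The stable response decays towards zero; the unstable one grows.
print("stable   |y(5)| =", abs(y_stable[-1]))    # close to 0
print("unstable |y(5)| =", abs(y_unstable[-1]))  # very large
```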
In the context of linear control systems, it is intuitively reasonable to define a linear system
as stable if its output is bounded for every bounded input; this is known as bounded-input
bounded-output (BIBO) stability, and such a system is said to be BIBO stable. This definition
makes stability a property of the system itself. The properties or dynamic behaviours of a
system are characterised by its transfer function G(s).
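Because the dynamic behaviour is characterised by G(s), BIBO stability of a rational transfer function can be checked from the roots of its denominator, i.e. its poles. A minimal sketch of that check, assuming NumPy and a proper rational G(s) (the example polynomials are illustrative choices):

```python
import numpy as np

def is_bibo_stable(den_coeffs):
    """BIBO stability test for a proper rational G(s): every pole
    (root of the denominator) must lie strictly in the left half
    of the complex plane."""
    poles = np.roots(den_coeffs)
    return bool(np.all(poles.real < 0))

# G(s) = 1 / (s^2 + 3s + 2): poles at -1 and -2 -> stable
print(is_bibo_stable([1, 3, 2]))   # True

# G(s) = 1 / (s^2 - s + 2): poles at +0.5 +/- 1.32j -> unstable
print(is_bibo_stable([1, -1, 2]))  # False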
The input to a system does not determine its stability; the components of the system
provide its characteristics, and hence determine stability. The solution to the differential
equation describing a system is made up of two terms, a transient response and a steady-state
response. For stability, the transient response terms must all die away as time progresses. The
exponents of the exponential terms must therefore be negative real numbers or complex
numbers with negative real parts.
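This decomposition can be made explicit with a partial-fraction expansion: the impulse response is a sum of exponential terms e^(p·t), one per pole p, so the response dies away exactly when every pole has a negative real part. A sketch using SciPy's residue routine (the transfer function is again just an illustrative choice):

```python
from scipy.signal import residue

# G(s) = (s + 3) / (s^2 + 3s + 2) = 2/(s + 1) - 1/(s + 2)
num = [1, 3]
den = [1, 3, 2]

r, p, k = residue(num, den)  # residues, poles, direct term

# The impulse response is y(t) = sum_i r_i * exp(p_i * t);
# each term decays only if Re(p_i) < 0.
for ri, pi in zip(r, p):
    trend = "decays" if pi.real < 0 else "does not decay"
    print(f"term {ri.real:+.1f} * exp({pi.real:+.1f} t) -> {trend}")
```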
The stability or instability of a closed-loop control system is determined by the poles of its
transfer function. The system is stable if the response y(t) remains bounded as the
time t tends to infinity. For most control engineering purposes an even stronger concept of
stability is required:
Asymptotically stable: A system is said to be asymptotically stable if its response decays to
zero as t tends to infinity.
Marginally stable: as mentioned in the previous section, an undamped second-order system has
a response that oscillates indefinitely. This is an example of a system that is stable but not
asymptotically stable; such a system is known as marginally stable. A system that is neither
asymptotically nor marginally stable is said to be unstable.
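These three cases can be read directly from the pole locations: asymptotically stable if all poles have strictly negative real parts, marginally stable if the only poles on the imaginary axis are simple (non-repeated) and the rest lie in the left half-plane, and unstable otherwise. A minimal classifier along those lines, assuming NumPy (the tolerance is an arbitrary choice):

```python
import numpy as np

def classify(den_coeffs, tol=1e-9):
    """Classify a system from the poles of its transfer function."""
    poles = np.roots(den_coeffs)
    if np.all(poles.real < -tol):
        return "asymptotically stable"
    if np.any(poles.real > tol):
        return "unstable"
    # Poles on the imaginary axis: marginal only if they are simple.
    axis_poles = poles[np.abs(poles.real) <= tol]
    for pole in axis_poles:
        if np.sum(np.abs(poles - pole) <= tol) > 1:
            return "unstable"  # repeated pole on the axis
    return "marginally stable"

# Undamped second-order system G(s) = 1/(s^2 + 4): poles at +/- 2j
print(classify([1, 0, 4]))   # marginally stable
print(classify([1, 3, 2]))   # asymptotically stable
print(classify([1, -1, 2]))  # unstable
```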
For analysis and design purposes, stability can be classified into absolute stability and relative
stability. As stated above, absolute stability refers to whether the system is
stable or unstable; it is a yes-or-no answer. Once a system is found to be stable, it is of
interest to determine how stable it is, and this degree of stability is a measure of relative stability.
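One simple time-domain measure of relative stability follows from the pole picture above: the further the rightmost pole lies from the imaginary axis, the faster the transient dies away, and the more stable the system is in a relative sense. A sketch of that measure, assuming NumPy (the example systems are illustrative choices):

```python
import numpy as np

def stability_margin(den_coeffs):
    """Distance of the rightmost pole from the imaginary axis.
    Larger positive values mean transients decay faster (more
    relative stability); zero or negative values mean the system
    is marginally stable or unstable."""
    poles = np.roots(den_coeffs)
    return float(-np.max(poles.real))

# Both systems below are absolutely stable, but the second one's
# slowest pole lies further into the left half-plane, so it is
# the more relatively stable of the two.
print(stability_margin([1, 3, 2]))    # 1.0 (slowest pole at s = -1)
print(stability_margin([1, 12, 20]))  # 2.0 (slowest pole at s = -2)
```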