IEEE Transactions on Automatic Control, Vol. 42, No. 11, pp. 1488-1499, 1997
On Critical Stability of Discrete-Time Adaptive Nonlinear Control
In this paper, we examine the global stability and instability problems for a class of discrete-time adaptive nonlinear stochastic control systems. The systems to be controlled may exhibit chaotic behavior and are assumed to be linear in the unknown parameters but nonlinear in the output dynamics, which are characterized by a nonlinear function [say, f(x)]. It is found and proved that in the scalar-parameter case there is a critical stability phenomenon for least squares (LS)-based adaptive control systems. To be specific, let the growth rate of f(x) be f(x) = O(‖x‖^b) with b ≥ 0; then b = 4 is a critical value for global stability, i.e., the closed-loop adaptive system is globally stable if b < 4 and is unstable in general if b ≥ 4. As a consequence, we find an interesting phenomenon that the linear case does not have: for some LS-based certainty equivalence adaptive controls, even if the LS parameter estimates are strongly consistent, the closed-loop systems may still be unstable. This paper also indicates that adaptive nonlinear stochastic control designed on the basis of, e.g., Taylor expansion (or Weierstrass approximation) of nonlinear models may not be feasible in general.
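To make the setting concrete, the sketch below simulates one common scalar instance of this model class, y_{t+1} = theta*f(y_t) + u_t + w_{t+1} with f(x) = |x|^b*sgn(x), under a recursive-least-squares certainty-equivalence controller. The specific model form, noise distribution, and parameter values are illustrative assumptions rather than details taken from the paper, and a single simulated sample path cannot establish the b = 4 threshold; it only shows what the LS-based certainty-equivalence loop looks like.

```python
# Illustrative sketch (assumed scalar model, not taken verbatim from the paper):
#     y_{t+1} = theta * f(y_t) + u_t + w_{t+1},   f(x) = |x|^b * sgn(x),
# controlled by a certainty-equivalence law u_t = -theta_hat_t * f(y_t),
# with theta_hat_t updated by recursive least squares.
import numpy as np

def simulate(b, theta=1.5, steps=200, seed=0):
    rng = np.random.default_rng(seed)
    f = lambda x: np.sign(x) * abs(x) ** b        # nonlinearity with growth rate O(|x|^b)
    y = rng.normal()                              # initial output
    theta_hat, P = 0.0, 100.0                     # LS estimate and scalar "covariance"
    for t in range(steps):
        phi = f(y)                                # regressor
        u = -theta_hat * phi                      # certainty-equivalence control (target y = 0)
        y_next = theta * phi + u + rng.normal()   # true system with unit-variance noise
        # recursive least-squares update of the scalar parameter estimate
        K = P * phi / (1.0 + phi * P * phi)
        theta_hat += K * ((y_next - u) - theta_hat * phi)
        P -= K * phi * P
        y = y_next
        if not np.isfinite(y) or abs(y) > 1e12:   # crude divergence check
            return t, False
    return steps, True

for b in (0.5, 2.0, 3.9, 4.0, 5.0):
    t, stable = simulate(b)
    status = f"ran all {t} steps" if stable else f"diverged at step {t}"
    print(f"b = {b:4.1f}: {status}")
```

The design choice here mirrors the abstract's setup: the controller never sees theta, only the LS estimate theta_hat, so closed-loop behavior depends entirely on how fast f(x) grows relative to how fast the estimate converges, which is where the growth exponent b enters.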