Automatica, Vol.32, No.10, 1403-1415, 1996
On Duality of Regularized Exponential and Linear Forgetting
Regularized (stabilized) versions of exponential and linear forgetting in parameter tracking are shown to be dual to each other. Both are derived by solving essentially the same Bayesian decision problem, in which the Kullback-Leibler divergence measures the (quasi-)distance between probability distributions of the estimated parameters. The type of forgetting depends solely on the order of the arguments in the Kullback-Leibler divergence. This general view indicates under which conditions one technique is superior to the other. Applied to ARX models, the approach yields a class of regularized (stabilized) forgetting strategies that are naturally robust with respect to poor system excitation.
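To illustrate the flavor of such a strategy, the sketch below implements regularized exponential forgetting for a linear-in-parameters (ARX-type) regression in information form. The function name, the choice of prior `alpha * I`, and all numerical values are illustrative assumptions, not taken from the paper; the key point is that the information matrix is forgotten toward a fixed prior rather than toward zero, so it stays invertible even when the system is poorly excited.

```python
import numpy as np

def regularized_exp_forgetting(Phi, y, lam=0.95, alpha=1e-2):
    """Sketch of regularized (stabilized) exponential forgetting for
    y_t = phi_t' theta + e_t, maintained in information form.

    lam   : forgetting factor (assumed value, 0 << lam < 1)
    alpha : strength of the fixed Gaussian prior N(0, (alpha*I)^-1)
            (assumed regularizer, keeps R invertible under poor excitation)
    """
    n = Phi.shape[1]
    R0 = alpha * np.eye(n)   # prior information matrix
    R = R0.copy()            # running information matrix
    h = np.zeros(n)          # running information vector (prior mean = 0)
    for phi, yt in zip(Phi, y):
        # Forgetting step: flatten the posterior toward the prior
        # instead of toward a flat (improper) distribution.
        R = lam * R + (1.0 - lam) * R0
        h = lam * h          # prior information vector is zero
        # Data update with the new regressor/observation pair.
        R += np.outer(phi, phi)
        h += yt * phi
    return np.linalg.solve(R, h)
```

With `(1.0 - lam) * R0` replaced by zero this reduces to ordinary exponential forgetting, whose information matrix can decay to singularity during periods of poor excitation; the regularized variant bounds it from below by roughly `R0`.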