78812 - Discrete Time Systems Identification And Control M

Academic Year 2018/2019

  • Teaching Mode: Traditional lectures
  • Campus: Bologna
  • Course: Second cycle degree programme (LM) in Automation Engineering (cod. 8891)

Learning outcomes

The course aims to introduce the main techniques for identifying discrete-time systems, with particular reference to the family of equation error models used for prediction and control. The main topics presented in the course are stochastic optimal estimation, Kalman prediction and filtering in the discrete-time setting, and advanced digital control schemes. At the end of the course, students are able to run basic identification algorithms for linear systems and to master the design and implementation aspects of digital control systems.

Course contents

Introduction
Systems and models. Mathematical models. Classification of models by modeling objectives. Physical modeling and system identification. Identification steps.

Brief review of stochastic processes
Random (stochastic) processes. First and second order moments: mean, variance, autocorrelation, autocovariance. Stationary and weakly stationary processes. Gaussian processes. Ergodic processes. Sample estimates of first and second order moments. White noise. Cross-correlation and cross-covariance of two stochastic processes. Independence, uncorrelatedness and orthogonality. Vector stochastic processes and their first and second order moments. Spectral density.
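
As an illustration of the sample estimates mentioned above, a minimal Python (NumPy) sketch that estimates first and second order moments from a single realization of white noise (all numerical values are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
e = rng.normal(loc=0.0, scale=1.0, size=10_000)   # white Gaussian noise realization

mean_hat = e.mean()                                # sample mean
var_hat = e.var()                                  # sample variance

def autocov(x, tau):
    """Biased sample autocovariance estimate at lag tau."""
    x = x - x.mean()
    return np.dot(x[:len(x) - tau], x[tau:]) / len(x)

# For white noise the autocovariance should be close to 0 for every tau != 0
print(mean_hat, var_hat, autocov(e, 1), autocov(e, 5))
```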

Stochastic models
Modeling disturbances by filtering white noise: ARMA processes, AR processes, MA processes. System representation by means of backward and forward shift operators. Equation error models: ARX, ARARX, ARMAX, ARARMAX. FIR models as approximations of impulse responses. Time series models: AR and ARMA models. Output error models and Box-Jenkins models.
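
As an illustration, a minimal Python sketch that generates data from an ARX model by filtering white noise (the polynomial coefficients are arbitrary illustrative values, chosen so that A(q^-1) is stable):

```python
import numpy as np
from scipy.signal import lfilter

rng = np.random.default_rng(1)
N = 500

# A(q^-1) y(t) = B(q^-1) u(t) + e(t)
a = [1.0, -1.5, 0.7]          # A(q^-1) = 1 - 1.5 q^-1 + 0.7 q^-2 (stable)
b = [0.0, 1.0, 0.5]           # B(q^-1) = q^-1 + 0.5 q^-2 (one-step input delay)

u = rng.normal(size=N)        # white-noise input
e = 0.1 * rng.normal(size=N)  # white equation error

y = lfilter(b, a, u) + lfilter([1.0], a, e)   # y = (B/A) u + (1/A) e
```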

The identification problem
Definition of the identification problem. Parameter estimation and model order estimation. Identifiability and the concept of the true model. Estimator properties: unbiasedness, asymptotic unbiasedness, consistency, efficiency. Covariance of the estimate and its use as a performance index.

The least squares method
Introduction to the least squares (LS) method: the linear regression form. Derivation of the LS estimate. Identifiability conditions. Geometrical interpretation of the LS estimate. Derivation of the LS estimate in the geometrical framework: pseudoinverse of a matrix. Statistical properties of the LS estimator in the static case. Covariance of the estimate and its use as a performance index. Weighted least squares. The best linear unbiased estimator.
Least squares identification of dynamic equation error models. Hankel matrices. LS identification of FIR models and its statistical properties. Asymptotic properties. Quasi-stationary deterministic signals. LS identification of ARX models and its statistical properties: consistency, asymptotic distribution, covariance of the estimate. ARX optimal predictor. LS identification of autoregressive models.
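
A minimal sketch of batch LS identification of an ARX model in linear regression form (the function name and interface are illustrative choices, not a standard library API):

```python
import numpy as np

def arx_ls(y, u, na, nb, nk=1):
    """Least squares estimate of an ARX(na, nb) model with input delay nk.

    Model: y(t) = -a1 y(t-1) - ... - a_na y(t-na)
                  + b1 u(t-nk) + ... + b_nb u(t-nk-nb+1) + e(t)
    Returns the parameter estimate [a1, ..., a_na, b1, ..., b_nb].
    """
    n0 = max(na, nb + nk - 1)          # first sample with a full regressor
    rows = [[-y[t - i] for i in range(1, na + 1)]
            + [u[t - nk - j] for j in range(nb)]
            for t in range(n0, len(y))]
    Phi = np.asarray(rows)             # regressor matrix
    theta, *_ = np.linalg.lstsq(Phi, np.asarray(y[n0:]), rcond=None)
    return theta
```

With the data generated in the previous sketch, arx_ls(y, u, na=2, nb=2) should return estimates close to the true parameters [-1.5, 0.7, 1.0, 0.5].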
Identifiability properties of LS estimates: persistency of excitation (PE) of input signals. PE properties of some possible input signals: white noise, step function, impulse function, ARMA signals, sums of sinusoids, pseudo-random binary sequences. Identifiability conditions for FIR, ARX and AR models.
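
A minimal numerical sketch of a PE check: a signal is persistently exciting of order n when the n x n Toeplitz matrix of its sample autocovariances is positive definite (names and values below are illustrative):

```python
import numpy as np
from scipy.linalg import toeplitz

def pe_min_eig(u, n):
    """Smallest eigenvalue of the n x n sample autocovariance matrix of u;
    u is (numerically) PE of order n if this value is clearly positive."""
    u = np.asarray(u, dtype=float) - np.mean(u)
    N = len(u)
    r = np.array([np.dot(u[:N - k], u[k:]) / N for k in range(n)])
    return np.linalg.eigvalsh(toeplitz(r)).min()

rng = np.random.default_rng(2)
print(pe_min_eig(rng.normal(size=5000), 10))          # white noise: clearly positive
print(pe_min_eig(np.sin(0.3 * np.arange(5000)), 3))   # near zero: a sinusoid is PE of order 2 only
```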
Recursive least squares (RLS) identification. RLS algorithms: standard form, covariance form, standard form with inverse matrix updating, covariance form with inverse matrix updating. Choice of the initial values. Tracking parameter variations: weighted least squares and the forgetting factor. Recursive weighted least squares. Choice of the forgetting factor. Asymptotic behavior of RLS algorithms.
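
A minimal sketch of the standard RLS recursion with a forgetting factor (the initialization theta = 0, P = p0*I with large p0 is a common but essentially arbitrary choice):

```python
import numpy as np

def rls(regressors, outputs, n, lam=0.99, p0=1e3):
    """Recursive least squares with forgetting factor lam (lam = 1: no forgetting).

    regressors: iterable of regressor vectors phi(t) of length n;
    outputs: the corresponding scalar outputs y(t).
    """
    theta = np.zeros(n)                        # initial parameter estimate
    P = p0 * np.eye(n)                         # "large" initial covariance-like matrix
    for phi, y in zip(regressors, outputs):
        phi = np.asarray(phi, dtype=float)
        k = P @ phi / (lam + phi @ P @ phi)    # gain vector
        eps = y - phi @ theta                  # a priori prediction error
        theta = theta + k * eps                # parameter update
        P = (P - np.outer(k, phi @ P)) / lam   # P update (matrix inversion lemma form)
    return theta, P
```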

Model order estimation and model validation
The chi-square distribution and its properties. Statistical hypothesis testing. Type I and type II errors. Example of a statistical hypothesis test. Asymptotic properties of the residual of the least squares estimation. Model order estimation methods: F-test, final prediction error criterion. Criteria with complexity terms: Akaike information criterion, minimum description length criterion. Model validation: whiteness tests on the LS residual sequence, tests of uncorrelatedness between input and residuals, cross validation.
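
A minimal sketch of one common form of the FPE, AIC and MDL criteria, computed from the LS residuals of a candidate model (in practice one fits models of increasing order and picks the minimizer of a criterion):

```python
import numpy as np

def order_criteria(residuals, n_params):
    """FPE, AIC and MDL (one common formulation) for a model with n_params
    parameters, given its LS residual sequence."""
    N = len(residuals)
    sigma2 = float(np.mean(np.square(residuals)))    # residual variance estimate
    fpe = sigma2 * (N + n_params) / (N - n_params)   # final prediction error
    aic = N * np.log(sigma2) + 2 * n_params          # Akaike information criterion
    mdl = N * np.log(sigma2) + n_params * np.log(N)  # minimum description length
    return fpe, aic, mdl
```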

The prediction error method
Inconsistency of the LS estimate for ARARX, ARMAX and ARMA models. The prediction error method. Optimal ARMAX and ARMA predictors. Introduction to the Newton-Raphson algorithm. PEM identification of ARMAX and ARMA models by simplifying the Newton-Raphson algorithm: the Gauss-Newton algorithm. Evaluation of the gradient of the residual. Choice of the initial estimate. Statistical properties of the PEM estimator. Pseudolinear regression: identification of ARMAX and ARMA models via extended least squares. Applying PEM to equation error models, output error models and Box-Jenkins models.
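
A minimal sketch of pseudolinear regression (extended least squares) for an ARMAX model: the unknown noise samples in the regressor are replaced by the residuals of the previous iteration. A full Gauss-Newton PEM implementation would additionally require the filtered gradients discussed above; the interface below is an illustrative choice:

```python
import numpy as np

def els_armax(y, u, na, nb, nc, n_iter=10):
    """Extended LS for an ARMAX(na, nb, nc) model (a sketch, not robust code)."""
    N, n0 = len(y), max(na, nb, nc)
    e_hat = np.zeros(N)                    # initial estimate of the noise sequence
    for _ in range(n_iter):
        Phi = np.asarray([
            [-y[t - i] for i in range(1, na + 1)]
            + [u[t - j] for j in range(1, nb + 1)]
            + [e_hat[t - k] for k in range(1, nc + 1)]
            for t in range(n0, N)])
        rhs = np.asarray(y[n0:])
        theta, *_ = np.linalg.lstsq(Phi, rhs, rcond=None)
        e_hat[n0:] = rhs - Phi @ theta     # refresh the residuals for the next pass
    return theta
```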

The instrumental variable method
Introduction to the instrumental variable (IV) method. Identification of ARMAX models by using the IV method. Consistency conditions. Choice of instruments. Identification of ARMA models by means of the IV method. Yule-Walker equations for ARMA and AR models. Statistical properties of the IV estimator. Brief notes on extended IV methods. Identification of ARARX models by using the IV method and the LS estimation of AR models. Recursive IV algorithms. Identification of MA models. Approximation of MA models with high-order AR models. Identification of the moving average part of ARMAX and ARMA models.
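
A minimal sketch of the basic IV estimate with delayed inputs as instruments (a common choice in open loop, since past inputs are correlated with the regressors but uncorrelated with the noise; the interface is illustrative):

```python
import numpy as np

def iv_estimate(y, u, na, nb):
    """Basic instrumental variable estimate, theta = (Z^T Phi)^{-1} Z^T y,
    for an ARX-structured model whose equation error may be colored.
    Instruments: na + nb delayed inputs (sketch; assumes Z^T Phi invertible)."""
    N, n0 = len(y), na + nb
    Phi = np.asarray([[-y[t - i] for i in range(1, na + 1)]
                      + [u[t - j] for j in range(1, nb + 1)]
                      for t in range(n0, N)])
    Z = np.asarray([[u[t - i] for i in range(1, na + nb + 1)]
                    for t in range(n0, N)])
    return np.linalg.solve(Z.T @ Phi, Z.T @ np.asarray(y[n0:]))
```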

Maximum likelihood
Introduction to maximum likelihood estimation. Maximum likelihood identification. The Gaussian case: equivalence between ML identification and PEM (or LS) identification. Covariance of the estimate: the Cramér-Rao lower bound. Example: application of the Cramér-Rao lower bound to the LS identification of static models.
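
A minimal Monte Carlo illustration of the static case: for a linear regression with Gaussian white noise of variance sigma^2, the covariance of the LS estimate attains the Cramér-Rao lower bound sigma^2 (Phi^T Phi)^(-1) (all numerical values below are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(3)
N, sigma = 200, 0.5
Phi = rng.normal(size=(N, 2))                 # fixed regressors, static model
theta_true = np.array([1.0, -2.0])

crlb = sigma**2 * np.linalg.inv(Phi.T @ Phi)  # Cramer-Rao lower bound

# Empirical covariance of the LS estimator over repeated experiments
estimates = [np.linalg.lstsq(Phi, Phi @ theta_true + sigma * rng.normal(size=N),
                             rcond=None)[0]
             for _ in range(2000)]
print(np.cov(np.asarray(estimates).T))        # should be close to crlb
print(crlb)
```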

Optimal filtering and prediction of stochastic signals
Optimal k-step ahead predictors for ARMAX and ARMA models. The fundamental theorem of estimation theory: optimal estimator and optimal linear estimator. Properties of the optimal (linear) estimator. Optimal estimation of signals: the innovation sequence and its properties.
Brief review of the Luenberger observer. Stochastic state space models. Kalman filtering: standard assumptions. Derivation of the Kalman filter equations by means of the properties of the optimal linear estimator. The predictor-corrector form. The Kalman predictor and the difference Riccati equation. Convergence of the difference Riccati equation. The steady-state (suboptimal) Kalman predictor. Some extensions of the standard Kalman filter: nonzero-mean noises, time-varying models. Dealing with colored noises: state space representations of ARMA, AR and MA models and their use in the augmented state space model of the plant.
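
A minimal sketch of one predictor-corrector iteration of the Kalman filter under the standard assumptions (x(t+1) = A x(t) + w(t), y(t) = C x(t) + v(t), with w and v zero-mean white, mutually uncorrelated, with covariances Q and R; the interface is an illustrative choice):

```python
import numpy as np

def kalman_step(x_pred, P_pred, y, A, C, Q, R):
    """One predictor-corrector iteration of the discrete-time Kalman filter."""
    # Correction (measurement update), driven by the innovation y - C x_pred
    S = C @ P_pred @ C.T + R                  # innovation covariance
    K = P_pred @ C.T @ np.linalg.inv(S)       # filter gain
    x_filt = x_pred + K @ (y - C @ x_pred)
    P_filt = P_pred - K @ C @ P_pred
    # Prediction (time update): one step of the difference Riccati equation
    x_next = A @ x_filt
    P_next = A @ P_filt @ A.T + Q
    return x_filt, P_filt, x_next, P_next
```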

Introduction to optimal LQG control

Readings/Bibliography

R. Guidorzi, Multivariable System Identification: From Observations to Models. Bononia University Press, Bologna, 2003.

T. Söderström and P. Stoica, System Identification. Prentice Hall, Englewood Cliffs, N.J., 1987.

L. Ljung, System Identification: Theory for the User. Prentice Hall, Englewood Cliffs, N.J., 1987.

B. D. O. Anderson and J. B. Moore, Optimal Filtering. Prentice Hall, Englewood Cliffs, N.J., 1979.

S. Bittanti, "Identificazione dei modelli e sistemi adattativi", Pitagora Editrice Bologna, 2005 (in italian).

S. Bittanti, "Teoria della predizione e del filtraggio", Pitagora Editrice Bologna, 2005 (in italian).

Teaching methods

Traditional lectures.

Assessment methods

The final evaluation is based on a written exam consisting of four theoretical questions.

During the written test, no books, notes, laptops, or similar aids may be used.

The evaluation results will be published on Almaesami.

The minimum mark to pass the exam is 18/30. If your mark is lower than 18, your evaluation on Almaesami will be “Fail”.

Teaching tools

Video projector, blackboard

Links to further information

http://sting.deis.unibo.it/sting/Members/diversi/System_Identification.htm

Office hours

See the website of Roberto Diversi