
1

From Experiments to Closed Loop Control II: The X-files

Håkan Hjalmarsson
Department of Signals, Sensors and Systems
Royal Institute of Technology

2

The Problem

[Figure: closed-loop block diagram with the controller to be designed]

3

Outline

[Diagram: overview of identification for control. Design variables, each ranging from min to max: experiment design (open loop/closed loop, spectra), validation, model complexity, computational resources, user skill, prior information, performance specifications, experimental constraints, information contents of the experiment, feedback. Blocks: true system G0 with noise, controller C(G), robust control, parameter estimation.]

• Control design / Estimator = Controller

• The true system acts as a disturbance; "feedback" is beneficial!

4

Robust Control

• Frequency-by-frequency bounds on the model error are usually required for robust stability and robust performance

• Trade-off between performance and model quality, e.g. require

|(G₀G⁻¹ − I) T(G, C)|

sufficiently small (the relative model error weighted by the complementary sensitivity T of the designed loop)

5

Next Topic: Parameter Estimation

[The same overview diagram as on the Outline slide, now highlighting the parameter estimation block.]

What are the means to control the model error in parameter estimation?

6

Parameter Estimation Essentials

[Block diagram: input u and disturbance v acting on the system, output y]

Model: y = G(θ)u + H(θ)v, θ ∈ Rᵐ

Prediction error: ε(θ) = H(θ)⁻¹ (y − G(θ)u)

True system: G₀ = B₀/A₀, H₀ = C₀/D₀, driven by e₀ (white noise, variance λ₀)

Parameter estimate: θ̂_N = arg min_θ Σ_{t=1}^{N} ε²(t, θ)

Limit estimate: θ* = arg min_θ E[ε²(t, θ)]
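For concreteness, here is a minimal numerical sketch (not from the slides) of these two quantities for the special case of an FIR model, where H(θ) = 1 and the criterion minimization reduces to linear least squares; the system g_true, the noise level and all lengths are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1000                                   # number of data points
g_true = np.array([0.5, 0.3, 0.2, 0.1])    # hypothetical true impulse response
n = len(g_true)                            # model order

u = rng.standard_normal(N)                           # white input
v = 0.1 * rng.standard_normal(N)                     # white noise
y = np.convolve(u, np.r_[0.0, g_true])[:N] + v       # y(t) = sum_k g_k u(t-k) + v(t)

# Regressors u(t-1), ..., u(t-n); the PEM criterion is then ordinary LS
Phi = np.column_stack([np.concatenate([np.zeros(k + 1), u[:N - k - 1]])
                       for k in range(n)])
theta_hat = np.linalg.lstsq(Phi, y, rcond=None)[0]   # arg min sum_t eps^2(t, theta)
print(theta_hat)                                     # ~ g_true
```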

7

Decomposing the Model Error

Model error (mean-square error):

E[|G₀ − Ĝ|²] = |G₀ − G*|² + E[|G* − Ĝ|²]
(MSE = bias error + variance error)

(Ĝ: from parameter estimation; G*: the limit model)

8

The Bias Error

E[ε²(t, θ)] = (1/2π) ∫ ( |G₀ − G(θ)|² Φ_u + |H₀ − H(θ)|² λ₀ ) / |H(θ)|² dω

⇒ Can only tune the L2 norm of the bias error

⇒ Cannot guarantee stability

Much effort in the literature on tuning the bias – neglects the information contents of the data

Parameter estimation

9

Statistical Aspects of Restricted Complexity Modeling

[Diagram: two routes from Data + Priors to a reduced order controller:
Data + Priors → Full order model → Full order controller → Reduced order controller
Data + Priors → Reduced order model → Reduced order controller]

Which way is best with respect to statistical accuracy?

How should we identify a restricted complexity model such that the noise impact is minimized?

Example:

Parameter estimation

10

An Illustrative Example

•The noise is white, Gaussian and has unit variance

• Many parameters, but only the static gain is of interest

Estimate the static gain G(0):

Restricted Complexity Modeling / Statistical Aspects

11

Method 1: Maximum Likelihood (= Least-Squares)

Method 2: Biased

Biased beats ML!!

Restricted Complexity Modeling Statistical Aspects

Full order model:

Only one parameter:

[Figure annotations: variance contribution from unmodeled dynamics; bias error]

12

But ......

Why not take the first parameter of the ML estimate as the estimate?

•Same bias as biased estimate

• Lower variance – no contribution from unmodelled dynamics

ML beats biased!!

Restricted Complexity Modeling Statistical Aspects

[Figure annotations: variance of the first parameter; bias error]
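A small Monte Carlo experiment makes the comparison concrete. This is my own construction rather than the slides' exact example: with a white input, the one-parameter LS fit and the first coefficient of the full-order LS fit both converge to the first impulse response coefficient (so they have the same bias), but the one-parameter fit leaves the unmodelled dynamics in its residual and therefore has higher variance.

```python
import numpy as np

rng = np.random.default_rng(1)
g = np.array([1.0, 0.9, 0.8, 0.7, 0.6])    # hypothetical FIR truth, g1 = 1.0
N, M, n = 400, 3000, len(g)

one_param, first_full = [], []
for _ in range(M):
    u = rng.standard_normal(N)                       # white input
    y = np.convolve(u, np.r_[0.0, g])[:N] + rng.standard_normal(N)
    Phi = np.column_stack([np.concatenate([np.zeros(k + 1), u[:N - k - 1]])
                           for k in range(n)])       # u(t-1), ..., u(t-n)
    th = np.linalg.lstsq(Phi, y, rcond=None)[0]
    first_full.append(th[0])                         # first param of full model
    # one-parameter model y = theta*u(t-1): unmodelled terms stay in the residual
    one_param.append(Phi[:, 0] @ y / (Phi[:, 0] @ Phi[:, 0]))

print("means:", np.mean(one_param), np.mean(first_full))   # both ~ g1 = 1.0
print("vars: ", np.var(one_param), np.var(first_full))     # one-param clearly larger
```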

13

A Separation Principle

The invariance principle in statistics: if θ̂ is the (asymptotically) ML estimate of θ, then f(θ̂) is the (asymptotically) ML estimate of f(θ).

Restricted Complexity Modeling / Statistical Aspects

14

Optimal Identification for Performance

Restricted Complexity Modeling Statistical Aspects

⇒ The minimizing G is a function of G₀

1. Estimate the ML model G_ML of G₀
2. Optimal reduced order estimate: apply that function to G_ML

Bonus: Stability can be checked
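The two steps are easy to mock up. A minimal sketch, assuming the reduction criterion is a plain least-squares fit of the reduced model's impulse response to the ML estimate (the slides' optimal criterion may instead be weighted by the intended application); the tenth-order "truth" and the first-order model structure are arbitrary illustrative choices.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(2)
g_true = 0.8 ** np.arange(1, 11)     # hypothetical truth: g(k) = 0.8^k
N, n = 2000, 10

u = rng.standard_normal(N)
y = np.convolve(u, np.r_[0.0, g_true])[:N] + 0.1 * rng.standard_normal(N)

# Step 1: full-order (here: long FIR) ML/LS estimate
Phi = np.column_stack([np.concatenate([np.zeros(k + 1), u[:N - k - 1]])
                       for k in range(n)])
g_ml = np.linalg.lstsq(Phi, y, rcond=None)[0]

# Step 2: reduce the ML model, not the data: fit the first-order impulse
# response b*a^k to the estimated impulse response in least squares
k = np.arange(n)
res = least_squares(lambda p: p[0] * p[1] ** k - g_ml, x0=[1.0, 0.5])
b, a = res.x
print("reduced model: b =", b, " pole a =", a)   # expect roughly b ~ 0.8, a ~ 0.8
```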

15

Applications:

• Model reduction (Tjärnström and Ljung)

•Simulation (Zhu and van den Bosch)

•Estimation of model uncertainty

• I4C (identification for control)

Conclusion: Always model as well as possible before any model simplifications

Restricted Complexity Modeling Statistical Aspects

The Separation Principle

16

Summary

Restricted Complexity Modeling / Statistical Aspects

[Diagram: Data + Priors → Full order model → Full order controller → Reduced order controller, rather than via a reduced order model]

• For a given data set, always model as well as possible in order to ensure the best possible statistical properties

17

Moving Towards Real Applications

• We have to accept that reality is always more complex than our models

• The bias error is in general not quantifiable frequency by frequency (unless priors are introduced)

•How do we cope with this?

(and we do – there are numerous success stories)

Restricted Complexity Modeling

18

Near Optimal Modeling

Example continued: Suppose ...

⇒ The one-parameter model is optimal (for estimating G(0)), regardless of system complexity!

The restricted complexity model achieves the same accuracy as ML!

Restricted Complexity Modeling

19

Restricted Complexity Modeling / Near Optimal Modeling

20

Non-singular Case

• LS estimation provides near-optimal models if the MSE is small enough

Restricted Complexity Modeling / Near Optimal Modeling

[Figure: level set of Σ_t ε²(t, θ) – the confidence region]

• All models in the confidence region qualify as good models (within a factor of 2)!

21

Conclusions from Example

Experimental conditions can be used to:

1. Ensure near statistical optimality of restricted complexity models by making the uncertainty large in certain "directions" (let sleeping dogs lie). This allows the bias error to be assessed by the variance error.

2. Ensure that certain system properties can be estimated accurately no matter the system complexity by making the uncertainty small in certain "directions".

Restricted Complexity Modeling / Near Optimal Modeling

22

The Role of the Noise Model

Models with small MSE are good ⇒ the noise model is also important.

Example: 3rd order Box-Jenkins system

[Figure: |Ĝ − G₀| for a 2nd order OE model, a 2nd order BJ model and a 3rd order BJ model]

The noise model is useful in near optimal modeling!

Restricted Complexity Modeling Statistical Aspects

Near Optimal Modeling

23

Near Optimal Models and the Separation Principle

Using near optimal estimates θ̂_N in the separation principle leads to near optimal estimates of f(θ)!

Restricted Complexity Modeling Statistical Aspects

Near Optimal Modeling

24

Summary

• Models inside the confidence region of the full-order model are near optimal

• Can be obtained by least-squares identification

• The noise model is important

• The separation principle is applicable

• Experimental conditions determine which models are near optimal!

• But we need the full-order model for model error quantification – the Achilles heel.

Restricted Complexity Modeling Near Optimal Modeling

25

Safe System Id

Ensure that certain system properties can be estimated accurately no matter the system complexity by making the uncertainty small in certain "directions".

How, and for which system properties, can this be achieved?

• Let's examine the variance error!

• ... and then experiment design issues

26

The Objective

θₒ: true parameters (n = dimension)

η = f_n(θₒ): m quantities of interest (e.g. NMP zeros, frequency response at certain frequencies, ...). Dimension m < n.

P: covariance of θ̂_n

Cov(η̂) = f_n′(θₒ) P [f_n′(θₒ)]ᵀ

Criterion: J = Trace(Cov(η̂))

Constraint: ∫ Φ_u(ω) dω ≤ γ

Q: When can we choose Φ_u such that J is small regardless of n?

Safe System Id

27

Some First Insight

Safe System Id

Cov(η̂) = f_n′(θₒ) P [f_n′(θₒ)]ᵀ

Criterion: J = Trace(Cov(η̂)); Constraint: ∫ Φ_u(ω) dω ≤ γ

Suppose f_n′ is normalized so that it is an orthonormal (ON) matrix.

Choose Φ_u such that the "smallest" eigenvectors of P align with f_n′.

This gives J = the sum of the m smallest eigenvalues of P (which are related to Φ_u).

28

FIR case

• P⁻¹ = ∫ Γ_n Γ_n* Φ_u dω / λ, where Γ_n = [1, e^{−jω}, ..., e^{−j(n−1)ω}]ᵀ

• The eigenvalues of P⁻¹ are the reciprocals of the eigenvalues of P

• P⁻¹ and P have the same eigenvectors

Asymptotic results for P⁻¹ (Grenander–Szegö):

i) eigenvalues ≈ Φ_u(2πk/n), k = 1, ..., n

ii) eigenvectors ≈ Γ_n(2πk/n) (which are orthogonal)

Safe System Id
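The Grenander–Szegö statement can be checked numerically: for an FIR model, P⁻¹ is (proportional to) the Toeplitz matrix of the input autocovariances, and its extreme eigenvalues approach the extreme values of Φ_u. A minimal sketch with an AR(1) input; the pole 0.8 and order 50 are arbitrary illustrative choices.

```python
import numpy as np
from scipy.linalg import toeplitz

a, n = 0.8, 50                       # AR(1) input pole, FIR model order
# Autocovariance of u(t) = a*u(t-1) + w(t), w white with unit variance
r = a ** np.arange(n) / (1 - a ** 2)
Pinv = toeplitz(r)                   # information matrix (up to N/lambda)

w = 2 * np.pi * np.arange(n) / n
phi_u = 1.0 / np.abs(1 - a * np.exp(-1j * w)) ** 2   # input spectrum samples

print(np.sort(np.linalg.eigvalsh(Pinv))[-5:])  # largest eigenvalues ...
print(np.sort(phi_u)[-5:])                     # ... approach the largest spectrum samples
```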

29

Example

Safe System Id

30

[Figures: eigenvalues of P vs Φ_u; cosine of the angle between Γ_n and the eigenvectors of P]

Safe System Id

31

Input Design Recipe

1. Choose frequencies for which Γ_n spans f_n′
2. Make Φ_u as large as possible at these frequency bins (see the sketch below)

Safe System Id

How to combat system complexity:

If Γ_n continues to span f_n′ as n is increased, then the accuracy is insensitive to the model order

cf static gain example!
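A sketch of the recipe for an FIR model (the frequency bins are arbitrary illustrative choices): placing all input power at a few bins excites only the corresponding directions of the information matrix, so those directions are estimated accurately while the rest stay uncertain (the sleeping dogs).

```python
import numpy as np

# Place all input power at a few chosen frequency bins of a periodic input
N, n = 1024, 10
k_bins = [50, 160]                          # arbitrary illustrative bins
t = np.arange(N)
u = sum(np.cos(2 * np.pi * k * t / N) for k in k_bins)

# Information matrix (up to N/lambda) of an FIR(n) model for this input;
# np.roll is an exact time shift because u is periodic with period N
Phi = np.column_stack([np.roll(u, k) for k in range(n)])
ev = np.linalg.eigvalsh(Phi.T @ Phi / N)
print(np.round(ev, 4))   # 4 clearly nonzero eigenvalues (2 per sine), rest ~ 0
```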

32

Illustrations

•NMP-zero estimates

•Variance of frequency function estimates

Safe System Id

33

NMP-zeros

f_n′ = [1, z₀⁻¹, z₀⁻², ...]ᵀ

• Tail elements become smaller and smaller (|z₀| > 1)

⇒ Variance converges as n → ∞ (illustrated below)

Safe System Id
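Numerically: with a white input, P is proportional to the identity for an FIR model, so the variance proxy f_n′ᵀ P f_n′ is a geometric sum that converges as the model order grows. A quick check (z₀ = 1.2 is an arbitrary NMP zero; the gradient normalization used on the slide may differ by a constant factor):

```python
import numpy as np

z0, lam, N, sigma2 = 1.2, 1.0, 1000, 1.0   # NMP zero, noise var, samples, input var
for n in [5, 10, 20, 40, 80]:
    fprime = z0 ** -np.arange(n)           # f_n' = [1, z0^-1, z0^-2, ...]
    P = (lam / (N * sigma2)) * np.eye(n)   # white input, FIR model: P ~ I/N
    print(n, fprime @ P @ fprime)          # converges to ~ 1/(1 - z0^-2)/N
```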

34

Example

[Figure: zero estimates for 50 realizations plotted against the unit circle, with the NMP zero and the MP zero marked]

Safe System Id / NMP-zeros

35

The Variance Error of Frequency Function Estimates

Generally a very complicated function of

• the input spectrum Φ_u
• the noise spectrum Φ_v
• the true system G₀

But can always be expressed as n/N · Φ_v(ω)/Φ_u(ω) asymptotically in n and N
(n = # estimated parameters)

E[|Ĝ(e^{jω}) − G₀(e^{jω})|²] ≈ G′(e^{jω})* P G′(e^{jω})

(P = covariance matrix of the parameter estimate, G′ = ∂G/∂θ)

Safe System Id
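The two expressions agree exactly in the simplest case: for a white input, an FIR model and white noise, P = λ/(Nσ²)·I and G′ = Γ_n, so Γ_n* P Γ_n = nλ/(Nσ²) = n/N · Φ_v/Φ_u at every frequency. A quick check (all numbers arbitrary):

```python
import numpy as np

lam, sigma2, N, n = 1.0, 1.0, 1000, 15
w = np.linspace(0, np.pi, 5)
Gamma = np.exp(-1j * np.outer(w, np.arange(n)))   # FIR gradient dG/dtheta
P = (lam / (N * sigma2)) * np.eye(n)              # white-input covariance
var = np.real(np.einsum('fi,ij,fj->f', Gamma.conj(), P, Gamma))
print(var)                # = n/N * Phi_v/Phi_u = 0.015 at every frequency
print(n / N * lam / sigma2)
```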

36

An archetypical variance expression

Expression for ???

Variance of Frequency Function Estimates

37

Large sample expression (LSE)

Variance of Frequency Function Estimates / Asymptotic Expressions

•Xie & Ljung (TAC 2001): Fixed denominator + AR input

• Ninness and Hjalmarsson (TAC 2004): BJ model + AR input

38

Comparison with true variance

[Figure: variance approximations vs the true variance; curves labelled HOE-85, LSE and TRUE]

Variance of Frequency Function Estimates / Asymptotic Expressions

39

Safe System Id: FIR systems

Suppose one pole ≈ e^{jω₁} and all other poles are at the origin:

AR input: u = (1/F)w, n = degree of F.

• The last term dominates for ω ≈ ω₁: insensitive to model order

• The last term is small for ω away from ω₁: variance grows linearly with order

Safe System Id / Variance of Frequency Function Estimates

40

Summary

• Focusing the input spectrum on a certain frequency region makes the model accuracy less dependent on the model complexity in that range

• Penalty at other frequency regions

• The classical (m/N)·Φ_v/Φ_u expression is a far too optimistic variance approximation around narrow peaks of the input spectrum

Safe System IdVariance of Frequency Function Estimates

41

Near Optimal Modeling

Safe System Id

Suppose I practice safe system id.

Then I know that with a full order, or overparameterized, model I will get what I want.

Q: What if I use a restricted complexity model?

Well, a near optimal model has to be inside the confidence region of the full order model, which means that it has to model the important system features accurately!

cf static gain example!

42

Summary

•Use input to reveal important system features

•and be prepared to model these

• "Standard" model uncertainty estimates are valid for these features

•Let sleeping dogs lie

• Ensure that the application takes large model uncertainties into account for the other system features (the dogs)

Modeling Paradigm: Make sure the data speaks what you need to hear!

Safe System Id

43

Experiment Design for Safe System Id

•Robust stability

•NMP-zeros

•One impulse response coefficient

•(Static gain)

44

Experiment Design for Robust Stability

Experiment Design

Robust stability:

Confidence bounds:

Variance: E[|Ĝ(e^{jω}) − G₀(e^{jω})|²] ≈ G′(e^{jω})* P G′(e^{jω})

Can be transformed into a problem that is convex in the autocovariances of the input, cf. Märta's talk on Monday (a generic sketch below).

Safe System Id: design for a higher system order than you believe the system to have.
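To make the convexity concrete, here is a minimal sketch of one such reformulation (my own generic formulation with cvxpy, not necessarily the one from the talk): for an FIR model P⁻¹ is Toeplitz in the input autocovariances r, a variance bound on a scalar quantity fᵀθ̂ becomes a linear matrix inequality via a Schur complement, and nonnegativity of the spectrum is relaxed to a frequency grid. All dimensions and bounds are arbitrary illustrative choices.

```python
import numpy as np
import cvxpy as cp

n = 8
f = np.ones(n)                  # quantity of interest: the static gain f^T theta
gamma = 0.05                    # variance bound (in units of lambda/N)

r = cp.Variable(n)              # input autocovariances r_0, ..., r_{n-1}
R = cp.vstack([cp.hstack([r[abs(i - j)] for j in range(n)])
               for i in range(n)])                 # Toeplitz(r) = P^{-1}

# f^T P f <= gamma  <=>  [[R, f], [f^T, gamma]] >= 0  (Schur complement)
M = cp.bmat([[R, f.reshape(-1, 1)],
             [f.reshape(1, -1), np.array([[gamma]])]])

# Relaxed spectrum nonnegativity: Phi_u(w) >= 0 on a frequency grid
w = np.linspace(0, np.pi, 200)
A = np.cos(np.outer(w, np.arange(n)))
A[:, 1:] *= 2                   # Phi_u(w) = r_0 + 2*sum_k r_k cos(k*w)

prob = cp.Problem(cp.Minimize(r[0]),               # input power = r_0
                  [M >> 0, A @ r >= 0])
prob.solve()
print("minimum input power:", prob.value)
```

As expected for a static-gain criterion, the optimal spectrum concentrates its power near ω = 0.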

45

Input Design for Estimation of NMP-zeros

min over Φ_u: E[u²]   s.t.   Var(ẑ_NMP) ≤ γ

Result: For AR models, regardless of model order, use

• a first order AR input with pole = the NMP zero mirrored in the unit circle (generated below)

• Minimum input variance = ...

Experiment Design
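Generating the prescribed input is a one-liner: a first-order AR process whose pole is the NMP zero reflected into the unit disc (for a real zero z₀, pole a = 1/z₀). A sketch with an arbitrary z₀ = 1.2:

```python
import numpy as np
from scipy.signal import lfilter

rng = np.random.default_rng(3)
z0 = 1.2                     # NMP zero (outside the unit circle)
a = 1.0 / z0                 # mirrored pole, inside the unit circle
N = 10000

w = rng.standard_normal(N)
# u(t) = a*u(t-1) + w(t): first-order AR input with pole at 1/z0
u = lfilter([1.0], [1.0, -a], w)
print("input power:", u.var())   # ~ 1/(1 - a^2) for unit-variance w
```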

46

Optimal Design vs White Noise

[Figure: input power with the optimal design divided by input power with a white noise design, plotted against the NMP zero location]

Experiment Design / NMP zeros

47

Restricted Complexity Modeling

Experiment Design / NMP zeros

Recall that in the static gain example the optimal input (designed for a full order model) meant that a simple model could be used with the same accuracy.

5th order ARX system with 1 NMP zero z = 1.2 and 2 MP zeros

5th order ARX model with 1 zero

36-hour-old results:

White input: z = −0.49
Optimal AR-1 input: z = 1.17

48

First Impulse Response Coefficient

Experiment Design

• Only g₁⁰ (the first impulse response coefficient) is of interest

• White noise input is optimal independently of system complexity

• The variance of the estimate is independent of system complexity

• The one-parameter model y(t) = θu(t−1) gives a consistent estimate with the same variance as the full order model (checked numerically below)
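A quick Monte Carlo check of the consistency claim (my own arbitrary tenth-order example): with a white input the estimate of g₁ is unbiased for every model order, because u(t−1) is uncorrelated with the unmodelled terms, and its variance stays close to λ/(Nσ_u²), the full-order value.

```python
import numpy as np

rng = np.random.default_rng(4)
g = 0.5 ** np.arange(1, 11)          # hypothetical truth, g1 = 0.5
N, M = 500, 2000                     # data length, Monte Carlo runs

for n in [1, 5, 10]:                 # n = 1 is the model y(t) = theta*u(t-1)
    est = []
    for _ in range(M):
        u = rng.standard_normal(N)   # white input
        y = np.convolve(u, np.r_[0.0, g])[:N] + rng.standard_normal(N)
        Phi = np.column_stack([np.concatenate([np.zeros(k + 1), u[:N - k - 1]])
                               for k in range(n)])
        est.append(np.linalg.lstsq(Phi, y, rcond=None)[0][0])
    # mean ~ 0.5 for every n; variance ~ lambda/(N*sigma_u^2) = 0.002
    print(f"order {n:2d}: mean {np.mean(est):.3f}  var {np.var(est):.5f}")
```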

49

Summary

• Convex reformulations

•A wide range of criteria can be handled

• There seems to be a connection between optimal designs and restricted complexity modeling

Experiment Design

50

Summary of Summaries

• The Separation Principle

• Near Optimal Models

• The Fundamental Importance of Experiment Design

•Insensitivity to system complexity

•Let sleeping dogs lie

Recommended