Master Techniques and Successfully Build Models Using a Single Resource

Vital to all data-driven or measurement-based process operations, system identification is a discipline rooted in observational science that centers on developing mathematical models from observed data. Principles of System Identification: Theory and Practice is an introductory-level book that presents the basic foundations and underlying methods relevant to system identification. The overall scope of the book focuses on system identification with an emphasis on practice, concentrating most specifically on discrete-time linear system identification.

Useful for Both Theory and Practice

The book presents the foundational pillars of identification, namely, the theory of discrete-time LTI systems, the basics of signal processing, the theory of random processes, and estimation theory. It explains the core theoretical concepts of building (linear) dynamic models from experimental data, as well as the experimental and practical aspects of identification. The author offers glimpses of modern developments in this area, and provides numerical and simulation-based examples, case studies, end-of-chapter problems, and ample references to code for illustration and training. Comprising 26 chapters, and ideal for coursework and self-study, this extensive text:

* Provides the essential concepts of identification
* Lays down the foundations of mathematical descriptions of systems, random processes, and estimation in the context of identification
* Discusses in detail the theory pertaining to non-parametric and parametric models for deterministic-plus-stochastic LTI systems
* Demonstrates the concepts and methods of identification on different case studies
* Presents a gradual development of state-space identification and grey-box modeling
* Offers an overview of advanced topics in identification, namely linear time-varying (LTV), non-linear, and closed-loop identification
* Discusses a multivariable approach to identification using iterative principal component analysis
* Embeds MATLAB® code for the illustrated examples at the respective points in the text

Principles of System Identification: Theory and Practice presents a formal base in LTI deterministic and stochastic systems modeling and estimation theory; it is a one-stop reference for introductory to moderately advanced courses on system identification, as well as for introductory courses on stochastic signal processing or time-series analysis.
Foreword xix
Preface xxi
List of Figures xxvii
List of Tables xxxiii
Part I Introduction To Identification And Models For Linear Deterministic Systems
Chapter 1 Introduction 2 (29)
1.1 Motivation 2 (6)
1.2 Historical developments 8 (5)
1.3 System Identification 13 (6)
1.3.1 Three Facets of Identification 14 (1)
1.3.2 Notion of a Model 15 (2)
1.3.3 Quantitative vs. Qualitative Models 17 (2)
1.3.3.1 Deterministic vs. Stochastic Models 18 (1)
1.3.3.2 Non-Parametric vs. Parametric Models 18 (1)
1.4 Systematic identification 19 (7)
1.4.1 Data Generation and Acquisition 19 (2)
1.4.2 Data Pre-Processing 21 (1)
1.4.3 Data Visualization 22 (1)
1.4.4 Model Development 22 (2)
1.4.5 Model Assessment and Validation 24 (1)
1.4.6 Prior Process Knowledge 24 (1)
1.4.7 Suggestions for Obtaining a Good Model 25 (1)
1.5 Flow of learning material 26 (3)
1.6 Software 29 (2)
Chapter 2 A Journey into Identification 31 (25)
2.1 Identifiability 31 (3)
2.2 Signal-to-Noise ratio 34 (1)
2.3 Overfitting 35 (3)
2.4 A modeling example: liquid level system 38 (15)
2.4.1 The Physical Process 38 (1)
2.4.2 Data Generation 38 (2)
2.4.3 Data Visualization and Preliminary Analysis 40 (1)
2.4.4 Building Non-Parametric Models 41 (2)
2.4.5 Building Parametric Models 43 (2)
2.4.6 Goodness of the Model 45 (5)
2.4.7 Developing a State-Space Model 50 (3)
2.5 Reflections and summary 53 (3)
Chapter 3 Mathematical Descriptions of Processes: Models 56 (12)
3.1 Definition of a model 56 (1)
3.2 Classification of models 57 (11)
3.2.1 Types of Models 58 (8)
3.2.2 Models for Identification 66 (2)
Chapter 4 Models for Discrete-Time LTI Systems 68 (41)
4.1 Convolution model 68 (4)
4.1.1 Impulse Response 69 (3)
4.2 Response models 72 (7)
4.2.1 Finite Impulse Response (FIR) Model 72 (1)
4.2.2 Step Response Model 73 (2)
4.2.3 Frequency Response Model 75 (4)
4.3 Difference equation form 79 (7)
4.3.1 Motivating Remarks 79 (1)
4.3.2 Parametrization of Impulse Response 80 (1)
4.3.3 Transfer Function Operator 81 (3)
4.3.4 Stability and Poles 84 (2)
4.4 State-space descriptions 86 (15)
4.4.1 Background 86 (1)
4.4.2 State Variable 87 (3)
4.4.3 State-Space Models 90 (2)
4.4.3.1 Forms of State-Space Representations 92 (6)
4.4.4 State-Space ↔ Transfer Function Operator Form 98 (3)
4.5 Illustrative example in MATLAB: estimating LTI models 101 (6)
4.5.1 Data Generation 101 (1)
4.5.2 Estimation of FIR Model 102 (1)
4.5.3 Estimation of Step-Response Model 102 (1)
4.5.4 Estimation of Difference Equation Model 103 (1)
4.5.5 Estimation of a State-Space Model 104 (3)
4.6 Summary 107 (2)
Chapter 5 Transform-Domain Models for Linear Time-Invariant Systems 109 (20)
5.1 Frequency response function 109 (3)
5.1.1 Characteristics of FRF 109 (3)
5.2 Transfer function form 112 (11)
5.2.1 Response to Damped Oscillatory Signals 112 (1)
5.2.2 z-Transforms 113 (4)
5.2.2.1 Properties of z-Transforms 115 (2)
5.2.3 Transfer Functions 117 (12)
5.2.3.1 FRF: Special Case of Transfer Function 121 (2)
5.3 Empirical transfer function (ETF) 123 (2)
5.4 Closure 125 (4)
Chapter 6 Sampling and Discretization 129 (22)
6.1 Discretization 129 (12)
6.1.1 Sampled-Data System 131 (1)
6.1.2 Zero-Order Hold 131 (1)
6.1.3 Sampler 132 (1)
6.1.4 State-Space Approach 133 (3)
6.1.5 Transfer Function Approach 136 (5)
6.2 Sampling 141 (6)
6.2.1 Choice of Sampling Rate 142 (2)
6.2.2 Sampling Theorem 144 (2)
6.2.3 Practical Guidelines for Sampling 146 (1)
6.3 Summary 147 (4)
Part II Models For Random Processes
Chapter 7 Random Processes 151 (35)
7.1 Introductory remarks 151 (1)
7.2 Random variables and probability 152 (1)
7.3 Probability theory 153 (5)
7.3.1 Probability Distribution Functions 154 (4)
7.4 Statistical properties of random variables 158 (13)
7.4.1 Mean and Variance 158 (5)
7.4.2 Multivariate Case 163 (6)
7.4.2.1 Covariance and Correlation 165 (4)
7.4.3 Partial Correlation 169 (2)
7.5 Random signals and processes 171 (11)
7.5.1 Definitions 171 (2)
7.5.2 Notion of Realization 173 (2)
7.5.3 Statistical Properties 175 (1)
7.5.4 Stationarity 176 (2)
7.5.5 Non-Stationarities 178 (3)
7.5.6 Ergodicity 181 (1)
7.6 Time-series analysis 182 (2)
7.7 Summary 184 (2)
Chapter 8 Time-Domain Analysis: Correlation Functions 186 (18)
8.1 Motivation 186 (1)
8.2 Auto-covariance function 187 (3)
8.2.1 Auto-Correlation Function (ACF) 187 (3)
8.3 White-noise process 190 (5)
8.3.1 Theoretical ACFs of Elementary Processes 192 (3)
8.4 Cross-covariance function 195 (3)
8.4.1 Properties and Uses of CCF 196 (2)
8.5 Partial correlation functions 198 (4)
8.5.1 Partial ACF 198 (3)
8.5.2 Partial CCF 201 (1)
8.6 Summary 202 (2)
Chapter 9 Models for Linear Stationary Processes 204 (34)
9.1 Motivation 204 (1)
9.2 Basic ideas 205 (2)
9.3 Linear stationary processes 207 (3)
9.3.1 Non-Uniqueness of Time-Series Models 209 (1)
9.4 Moving average models 210 (5)
9.4.1 ACVF Signature of an MA Process 210 (2)
9.4.2 Invertibility of an MA Process 212 (3)
9.5 Auto-regressive models 215 (11)
9.5.1 Stationary Representations 216 (1)
9.5.2 ACF of AR Processes 217 (3)
9.5.3 Order Determination and PACF 220 (2)
9.5.4 Alternative Representations of the AR Process 222 (2)
9.5.5 Equivalence Between AR and MA Representations 224 (2)
9.6 Auto-regressive moving average models 226 (1)
9.7 Auto-regressive integrated moving average models 227 (7)
9.8 Summary 234 (4)
Chapter 10 Fourier Analysis and Spectral Analysis of Deterministic Signals 238 (32)
10.1 Motivation 238 (4)
10.2 Definitions 242 (6)
10.2.1 Periodic and Aperiodic Signals 242 (1)
10.2.2 Energy and Power Signals 243 (1)
10.2.3 Cross-Covariance Functions for Deterministic Signals 244 (4)
10.3 Fourier representations of deterministic processes 248 (14)
10.3.1 Fourier Series 249 (1)
10.3.2 Power Spectrum 250 (1)
10.3.3 Fourier Transform 251 (2)
10.3.4 Discrete-Time Fourier Series 253 (2)
10.3.5 Discrete-Time Fourier Transform 255 (3)
10.3.6 Properties of DTFT 258 (4)
10.4 Discrete Fourier Transform (DFT) 262 (5)
10.4.1 Spectrum and Spectral Density 266 (1)
10.5 Summary 267 (3)
Chapter 11 Spectral Representations of Random Processes 270 (35)
11.1 Introduction 270 (1)
11.2 Power spectral density of a random process 271 (11)
11.2.1 PSD from Ensemble Averaging 272 (2)
11.2.2 PSD from Auto-Covariance Function 274 (7)
11.2.2.1 Random Periodic Process 277 (4)
11.2.3 Wiener Representations and PSD 281 (1)
11.3 Spectral characteristics of standard processes 282 (5)
11.3.1 White Noise Process 282 (1)
11.3.2 Spectral Density of ARMA Process: Colored Noise 283 (4)
11.4 Cross-spectral density and coherence 287 (6)
11.5 Partial coherence 293 (2)
11.6 Spectral factorization 295 (6)
11.7 Summary 301 (4)
Part III Estimation Methods
Chapter 12 Introduction to Estimation 305 (12)
12.1 Motivation 305 (1)
12.2 A simple example: constant embedded in noise 305 (2)
12.3 Definitions and terminology 307 (3)
12.3.1 Goodness of Estimators 309 (1)
12.4 Types of estimation problems 310 (4)
12.4.1 Signal Estimation 310 (2)
12.4.2 Parameter Estimation 312 (1)
12.4.3 State Estimation 313 (1)
12.4.4 Other Classifications 313 (1)
12.5 Estimation methods 314 (1)
12.6 Historical notes 315 (2)
Chapter 13 Goodness of Estimators 317 (33)
13.1 Introduction 317 (1)
13.2 Fisher information 318 (4)
13.3 Bias 322 (1)
13.4 Variance 322 (3)
13.4.1 Minimum Variance Unbiased Estimator 325 (1)
13.5 Efficiency 325 (1)
13.6 Sufficiency 326 (1)
13.7 Cramer-Rao's inequality 326 (6)
13.7.1 Best Linear Unbiased Estimator 331 (1)
13.8 Asymptotic bias 332 (1)
13.9 Mean square error 333 (1)
13.9.1 Minimum Mean-Square Estimator 334 (1)
13.10 Consistency 334 (2)
13.11 Distribution of estimates 336 (2)
13.11.1 Central Limit Theorem 337 (1)
13.12 Hypothesis testing and confidence intervals 338 (7)
13.12.1 Hypothesis Testing 339 (3)
13.12.2 Confidence Regions 342 (3)
13.13 Empirical methods for hypothesis testing 345 (1)
13.14 Summary 346 (1)
13.A Appendix 347 (3)
13.A.1 Proof of Cramer-Rao Inequality 347 (3)
Chapter 14 Estimation Methods: Part I 350 (50)
14.1 Introduction 350 (1)
14.2 Method of moments estimators 351 (4)
14.2.1 Basic Idea 351 (4)
14.3 Least squares estimators 355 (27)
14.3.1 Ordinary Least Squares 355 (7)
14.3.2 Goodness of LS Fits 362 (3)
14.3.3 Properties of the LS Estimator 365 (8)
14.3.4 Computing the Linear LS Estimate 373 (3)
14.3.5 Weighted Least Squares 376 (5)
14.3.6 Other Variants of Linear LS 381 (1)
14.4 Non-linear least squares 382 (11)
14.4.1 Numerical Methods for Optimization 384 (2)
14.4.2 Special Cases 386 (3)
14.4.2.1 Linear in Parameters 387 (1)
14.4.2.2 Linear via Transformation 387 (1)
14.4.2.3 Pseudo-Linear Regression 388 (1)
14.4.2.4 Algorithmic Aspects of NLS Methods 388 (1)
14.4.3 Asymptotic Properties of the NLS Estimator 389 (4)
14.5 Summary 393 (7)
14.A Appendix 394 (6)
14.A.1 Projection Theorem 394 (1)
14.A.2 Decomposition Theorem 394 (1)
14.A.3 QR Factorization 395 (1)
14.A.4 Singular Value Decomposition 396 (4)
Chapter 15 Estimation Methods: Part II 400 (19)
15.1 Maximum likelihood estimators 400 (11)
15.1.1 Estimation of Mean and Variance: GWN 403 (2)
15.1.2 Estimation of an ARX Model 405 (4)
15.1.3 Computing the MLE 409 (1)
15.1.4 Properties of the ML Estimator 410 (1)
15.2 Bayesian estimators 411 (6)
15.2.1 Linear Bayesian MMSE 416 (1)
15.3 Summary 417 (2)
Chapter 16 Estimation of Signal Properties 419 (60)
16.1 Introduction 419 (1)
16.2 Estimation of mean and variance 419 (5)
16.2.1 Estimators of Mean 420 (2)
16.2.2 Estimation of Variance 422 (2)
16.3 Estimators of correlation 424 (2)
16.3.1 Estimators of Partial Correlation 425 (1)
16.4 Estimation of correlation functions 426 (7)
16.5 Estimation of auto-power spectra 433 (33)
16.5.1 Periodogram 434 (1)
16.5.2 Finite-Length Effects in Direct DFT Methods 434 (4)
16.5.2.1 Spectral Leakage 434 (4)
16.5.3 Remedies: Window Functions 438 (7)
16.5.4 Estimation of Spectra for Stochastic Signals 445 (1)
16.5.5 Periodogram Estimator 445 (6)
16.5.5.1 Properties of Periodogram as a PSD Estimator for Stochastic Signals 445 (6)
16.5.6 Averaged (Smoothed) Periodogram Estimators 451 (10)
16.5.7 Parametric Methods 461 (3)
16.5.8 Subspace Decomposition-Based Methods 464 (2)
16.6 Estimation of cross-spectral density 466 (2)
16.7 Estimation of coherence 468 (5)
16.8 Summary 473 (6)
Part IV Identification Of Dynamic Models - Concepts And Principles
Chapter 17 Non-Parametric and Parametric Models for Identification 479 (20)
17.1 Introduction 479 (1)
17.2 The overall model 479 (1)
17.3 Quasi-stationarity 480 (4)
17.4 Non-parametric descriptions 484 (2)
17.4.1 Time-Domain Descriptions 484 (2)
17.4.1.1 FIR Models 485 (1)
17.4.1.2 Step Response Models 485 (1)
17.4.2 Frequency-Domain Descriptions 486 (1)
17.5 Parametric descriptions 486 (11)
17.5.1 Equation-Error Models 488 (4)
17.5.1.1 ARX Family 488 (1)
17.5.1.2 ARMAX Family 489 (2)
17.5.1.3 ARIMAX Models 491 (1)
17.5.2 Output-Error Family 492 (2)
17.5.3 Box-Jenkins Family 494 (2)
17.5.4 Selecting an Appropriate Model Structure 496 (1)
17.6 Summary 497 (2)
Chapter 18 Predictions 499 (21)
18.1 Introduction 499 (1)
18.2 Conditional expectation and linear predictors 500 (5)
18.2.1 Best Linear Predictor 502 (3)
18.3 One-step ahead prediction and innovations 505 (2)
18.3.1 Predictions of the Stochastic Process 505 (1)
18.3.2 Predictions of the Overall LTI System 506 (1)
18.4 Multi-step and infinite-step ahead predictions 507 (5)
18.5 Predictor model: An alternative LTI description 512 (2)
18.5.1 Model Sets and Structures 513 (1)
18.6 Identifiability 514 (4)
18.6.1 Model Identifiability 514 (1)
18.6.2 Identifiable LTI Black-Box Structures 514 (3)
18.6.3 System Identifiability 517 (1)
18.7 Summary 518 (2)
Chapter 19 Identification of Parametric Time-Series Models 520 (22)
19.1 Introduction 520 (1)
19.2 Estimation of AR models 520 (9)
19.2.1 Yule-Walker (Y-W) Method 521 (3)
19.2.2 Least Squares/Covariance Method 524 (1)
19.2.3 Modified Covariance Method 525 (1)
19.2.4 Burg's Method 526 (3)
19.2.5 ML Estimator 529 (1)
19.3 Estimation of MA models 529 (2)
19.4 Estimation of ARMA models 531 (8)
19.4.1 Non-linear LS Estimation 531 (2)
19.4.2 Maximum Likelihood Estimation 533 (3)
19.4.3 Properties of the NLS and ML Estimators 536 (3)
19.4.4 Estimation of ARIMA Models 539 (1)
19.5 Summary 539 (3)
Chapter 20 Identification of Non-Parametric Input-Output Models 542 (26)
20.1 Recap 542 (1)
20.2 Impulse response estimation 542 (11)
20.2.1 Direct Estimation using Impulse Inputs 543 (1)
20.2.2 Estimation from Response to Arbitrary Inputs 543 (5)
20.2.2.1 Diagonalization: Pre-Whitening the Input 545 (3)
20.2.3 Regularization and Including Prior Knowledge 548 (4)
20.2.4 Estimation of IR Coefficients from Frequency Response Data 552 (1)
20.2.5 Indirect Estimation from Parametric Models 552 (1)
20.3 Step response estimation 553 (1)
20.4 Estimation of frequency response function 554 (9)
20.4.1 Sinusoidal Input-Based Estimation 554 (2)
20.4.2 ETFE 556 (2)
20.4.3 Estimation from Spectral Densities: Spectral Analysis (SPA) 558 (2)
20.4.4 Smoothed Estimates 560 (13)
20.4.4.1 Smoothing the ETFE 561 (1)
20.4.4.2 From Smoothed PSD Estimates 561 (2)
20.4.4.3 Welch's Averaged Approach 563 (1)
20.5 Estimating the disturbance spectrum 563 (2)
20.6 Summary 565 (3)
Chapter 21 Identification of Parametric Input-Output Models 568 (43)
21.1 Recap 568 (1)
21.2 Prediction-error minimization (PEM) methods 569 (4)
21.3 Properties of the PEM estimator 573 (8)
21.3.1 Consistency of PEM Estimators 575 (6)
21.4 Variance and distribution of PEM-QC estimators 581 (4)
21.5 Accuracy of parametrized FRF estimates using PEM 585 (4)
21.6 Algorithms for estimating specific parametric models 589 (14)
21.6.1 Estimating ARX Models 590 (3)
21.6.1.1 AUDI: Estimating Several ARX Models Simultaneously 591 (2)
21.6.2 Estimating ARMAX Models 593 (4)
21.6.2.1 Pseudo-Linear Regression Method for ARMAX 596 (1)
21.6.3 Estimating OE Models 597 (4)
21.6.3.1 Steiglitz-McBride Algorithm 599 (2)
21.6.4 Estimating BJ Models 601 (2)
21.7 Correlation methods 603 (5)
21.7.1 Instrumental Variable (IV) Methods 604 (2)
21.7.2 Properties of the IV Estimator 606 (1)
21.7.3 Multistage IV (IV4) Method 607 (1)
21.8 Summary 608 (3)
Chapter 22 Statistical and Practical Elements of Model Building 611 (45)
22.1 Introduction 611 (1)
22.2 Informative Data 612 (2)
22.2.1 Persistent Excitation 613 (1)
22.3 Input design for identification 614 (4)
22.3.1 Pseudo-Random Binary Sequences 615 (3)
22.3.2 Preliminary Tests for Input Design 618 (1)
22.4 Data pre-processing 618 (17)
22.4.1 Offsets, Drifts and Trends 619 (2)
22.4.2 Outliers and Missing Data 621 (12)
22.4.3 Pre-Filtering 633 (2)
22.4.4 Partitioning the Data 635 (1)
22.5 Time-delay estimation 635 (1)
22.5.1 Definitions 635 (7)
22.5.2 Impulse Response Estimation Method 636 (1)
22.5.3 Frequency-Domain Estimation Method 637 (4)
22.5.4 Model-based Estimation Method 641 (1)
22.6 Model development 642 (11)
22.6.1 Model Structure Selection 643 (1)
22.6.2 Options in Parametric Modeling 644 (2)
22.6.3 Order Determination 646 (2)
22.6.4 Model Quality Assessment and Validation 648 (5)
22.7 Summary 653 (3)
Chapter 23 Identification of State-Space Models 656 (62)
23.1 Introduction 656 (4)
23.2 Mathematical essentials and basic ideas 660 (3)
23.2.1 Basic Approach 661 (1)
23.2.2 Observability and Controllability 661 (2)
23.3 Kalman filter 663 (11)
23.3.1 Extended Kalman Filter and the Unscented KF 671 (2)
23.3.2 Innovations Form 673 (1)
23.4 Foundations for subspace identification 674 (7)
23.4.1 Extended Observability Matrix 675 (1)
23.4.2 Realization Methods 676 (5)
23.4.2.1 Estimation from IR: Ho and Kalman Method 676 (3)
23.4.2.2 Kung's Method 679 (2)
23.5 Preliminaries for subspace identification methods 681 (4)
23.5.1 Subspaces, Projections and Implementations 682 (3)
23.6 Subspace identification algorithms 685 (23)
23.6.1 Deterministic Systems 685 (8)
23.6.1.1 MOESP Method 687 (3)
23.6.1.2 N4SID Method 690 (3)
23.6.2 Deterministic-plus-Stochastic Systems 693 (15)
23.6.2.1 Numerical Kalman State Estimates 694 (1)
23.6.2.2 Statistical Interpretations of Projections 695 (2)
23.6.2.3 MOESP and N4SID Methods for the Full Case 697 (3)
23.6.2.4 CVA Method 700 (1)
23.6.2.5 Unified Algorithm 701 (1)
23.6.2.6 Estimation of System Matrices, Noise Covariance and Kalman Gain 702 (4)
23.6.2.7 Interpreting SSID Methods in the PE Framework 706 (2)
23.7 Structured state-space models 708 (7)
23.7.1 Parametrized Linear Black-Box Models 709 (3)
23.7.2 Grey-Box Identification 712 (1)
23.7.2.1 Grey-Box Modeling of a Two-Tank System 712 (3)
23.8 Summary 715 (3)
Chapter 24 Case Studies 718 (37)
24.1 ARIMA model of industrial dryer temperature 718 (5)
24.1.1 Process Data 718 (1)
24.1.2 Building the Time-Series Model 718 (5)
24.2 Simulated process: developing an input-output model 723 (10)
24.2.1 Data Generation 724 (1)
24.2.2 Data Pre-Processing 724 (1)
24.2.3 Non-Parametric Analysis 725 (2)
24.2.4 Parametric Model Development 727 (2)
24.2.5 Model Quality Assessment 729 (2)
24.2.6 Parametric Model along the OE Route 731 (2)
24.3 Process with random walk noise 733 (7)
24.3.1 Visual Analysis 733 (1)
24.3.2 Non-Parametric Estimates 733 (2)
24.3.3 Parametric Input-Output Model 735 (5)
24.4 Multivariable modeling of a four-tank system 740 (12)
24.4.1 Process Description 740 (2)
24.4.2 Data Acquisition 742 (1)
24.4.3 Data Pre-Processing and Non-Parametric Analysis 742 (1)
24.4.4 Development of a State-Space Model 743 (4)
24.4.5 Transfer Function Models for the MIMO System 747 (8)
24.4.5.1 Approach I: Using the Full SS Model 747 (1)
24.4.5.2 Approach II: Using the MOESP Model 748 (4)
24.5 Summary 752 (3)
Part V Advanced Concepts
Chapter 25 Advanced Topics in SISO Identification 755 (35)
25.1 Identification of linear time-varying systems 755 (21)
25.1.1 WLS Methods with Forgetting Factor 757 (1)
25.1.2 Recursive Methods 757 (4)
25.1.3 Recursive Weighted Least Squares 761 (3)
25.1.4 Recursive PEM Algorithm 764 (2)
25.1.5 Wavelet-Based Approaches 766 (10)
25.1.5.1 Wavelet Transforms 767 (7)
25.1.5.2 Identification of LTV Systems Using Wavelets 774 (2)
25.2 Non-linear identification 776 (7)
25.2.1 Neural Network Models 778 (1)
25.2.2 Fuzzy Models 779 (1)
25.2.3 Dynamic Non-Linear Models: NARX 780 (1)
25.2.4 Simplified Non-linear Models 780 (3)
25.2.4.1 Volterra Models 781 (1)
25.2.4.2 Hammerstein and Wiener Models 781 (2)
25.3 Closed-loop identification 783 (4)
25.3.1 Closed-Loop Identification Techniques 785 (2)
25.4 Summary 787 (3)
Chapter 26 Linear Multivariable Identification 790 (24)
26.1 Motivation 790 (1)
26.2 Estimation of time delays in MIMO systems 791 (4)
26.3 Principal component analysis (PCA) 795 (17)
26.3.1 Motivating Example: Linear Algebra Perspective 795 (5)
26.3.2 Statistical Approach 800 (4)
26.3.2.1 Population Version 800 (2)
26.3.2.2 Sample Version Formulation of PCA 802 (2)
26.3.3 Rank Determination and Modeling using Iterative PCA 804 (8)
26.3.3.1 Iterative PCA 804 (4)
26.3.3.2 Example 1: Flow Mixing 808 (2)
26.3.3.3 Example 2: Continuously Stirred Tank Heater 810 (2)
26.4 Summary 812 (2)
References 814 (15)
Index 829