Thursday 6 January 2011

Statistical and Adaptive Signal Processing

Statistical and Adaptive Signal Processing
Spectral Estimation, Signal Modeling, Adaptive Filtering, and Array Processing

Dimitris G. Manolakis
Massachusetts Institute of Technology, Lincoln Laboratory

Vinay K. Ingle
Northeastern University

Stephen M. Kogon
Massachusetts Institute of Technology, Lincoln Laboratory

artechhouse.com



CONTENTS
Preface xvii

1 Introduction 1
1.1 Random Signals 1
1.2 Spectral Estimation 8
1.3 Signal Modeling 11
    1.3.1 Rational or Pole-Zero Models / 1.3.2 Fractional Pole-Zero Models and Fractal Models
1.4 Adaptive Filtering 16
    1.4.1 Applications of Adaptive Filters / 1.4.2 Features of Adaptive Filters
1.5 Array Processing 25
    1.5.1 Spatial Filtering or Beamforming / 1.5.2 Adaptive Interference Mitigation in Radar Systems / 1.5.3 Adaptive Sidelobe Canceler
1.6 Organization of the Book 29
2 Fundamentals of Discrete-Time Signal Processing 33
2.1 Discrete-Time Signals 33
    2.1.1 Continuous-Time, Discrete-Time, and Digital Signals / 2.1.2 Mathematical Description of Signals / 2.1.3 Real-World Signals
2.2 Transform-Domain Representation of Deterministic Signals 37
    2.2.1 Fourier Transforms and Fourier Series / 2.2.2 Sampling of Continuous-Time Signals / 2.2.3 The Discrete Fourier Transform / 2.2.4 The z-Transform / 2.2.5 Representation of Narrowband Signals
2.3 Discrete-Time Systems 47
    2.3.1 Analysis of Linear, Time-Invariant Systems / 2.3.2 Response to Periodic Inputs / 2.3.3 Correlation Analysis and Spectral Density
2.4 Minimum-Phase and System Invertibility 54
    2.4.1 System Invertibility and Minimum-Phase Systems / 2.4.2 All-Pass Systems / 2.4.3 Minimum-Phase and All-Pass Decomposition / 2.4.4 Spectral Factorization
2.5 Lattice Filter Realizations 64
    2.5.1 All-Zero Lattice Structures / 2.5.2 All-Pole Lattice Structures
2.6 Summary 70
Problems 70
3 Random Variables, Vectors, and Sequences 75
3.1 Random Variables 75
    3.1.1 Distribution and Density Functions / 3.1.2 Statistical Averages / 3.1.3 Some Useful Random Variables
3.2 Random Vectors 83
    3.2.1 Definitions and Second-Order Moments / 3.2.2 Linear Transformations of Random Vectors / 3.2.3 Normal Random Vectors / 3.2.4 Sums of Independent Random Variables
3.3 Discrete-Time Stochastic Processes 97
    3.3.1 Description Using Probability Functions / 3.3.2 Second-Order Statistical Description / 3.3.3 Stationarity / 3.3.4 Ergodicity / 3.3.5 Random Signal Variability / 3.3.6 Frequency-Domain Description of Stationary Processes
3.4 Linear Systems with Stationary Random Inputs 115
    3.4.1 Time-Domain Analysis / 3.4.2 Frequency-Domain Analysis / 3.4.3 Random Signal Memory / 3.4.4 General Correlation Matrices / 3.4.5 Correlation Matrices from Random Processes
3.5 Whitening and Innovations Representation 125
    3.5.1 Transformations Using Eigen-decomposition / 3.5.2 Transformations Using Triangular Decomposition / 3.5.3 The Discrete Karhunen-Loève Transform
3.6 Principles of Estimation Theory 133
    3.6.1 Properties of Estimators / 3.6.2 Estimation of Mean / 3.6.3 Estimation of Variance
3.7 Summary 142
Problems 143
4 Linear Signal Models 149
4.1 Introduction 149
    4.1.1 Linear Nonparametric Signal Models / 4.1.2 Parametric Pole-Zero Signal Models / 4.1.3 Mixed Processes and the Wold Decomposition
4.2 All-Pole Models 156
    4.2.1 Model Properties / 4.2.2 All-Pole Modeling and Linear Prediction / 4.2.3 Autoregressive Models / 4.2.4 Lower-Order Models
4.3 All-Zero Models 172
    4.3.1 Model Properties / 4.3.2 Moving-Average Models / 4.3.3 Lower-Order Models
4.4 Pole-Zero Models 177
    4.4.1 Model Properties / 4.4.2 Autoregressive Moving-Average Models / 4.4.3 The First-Order Pole-Zero Model 1: PZ(1,1) / 4.4.4 Summary and Dualities
4.5 Models with Poles on the Unit Circle 182
4.6 Cepstrum of Pole-Zero Models 184
    4.6.1 Pole-Zero Models / 4.6.2 All-Pole Models / 4.6.3 All-Zero Models
4.7 Summary 189
Problems 189
5 Nonparametric Power Spectrum Estimation 195
5.1 Spectral Analysis of Deterministic Signals 196
    5.1.1 Effect of Signal Sampling / 5.1.2 Windowing, Periodic Extension, and Extrapolation / 5.1.3 Effect of Spectrum Sampling / 5.1.4 Effects of Windowing: Leakage and Loss of Resolution / 5.1.5 Summary
5.2 Estimation of the Autocorrelation of Stationary Random Signals 209
5.3 Estimation of the Power Spectrum of Stationary Random Signals 212
    5.3.1 Power Spectrum Estimation Using the Periodogram / 5.3.2 Power Spectrum Estimation by Smoothing a Single Periodogram—The Blackman-Tukey Method / 5.3.3 Power Spectrum Estimation by Averaging Multiple Periodograms—The Welch-Bartlett Method / 5.3.4 Some Practical Considerations and Examples
5.4 Joint Signal Analysis 237
    5.4.1 Estimation of Cross-Power Spectrum / 5.4.2 Estimation of Frequency Response Functions
5.5 Multitaper Power Spectrum Estimation 246
    5.5.1 Estimation of Auto Power Spectrum / 5.5.2 Estimation of Cross Power Spectrum
5.6 Summary 254
Problems 255
6 Optimum Linear Filters 261
6.1 Optimum Signal Estimation 261
6.2 Linear Mean Square Error Estimation 264
    6.2.1 Error Performance Surface / 6.2.2 Derivation of the Linear MMSE Estimator / 6.2.3 Principal-Component Analysis of the Optimum Linear Estimator / 6.2.4 Geometric Interpretations and the Principle of Orthogonality / 6.2.5 Summary and Further Properties
6.3 Solution of the Normal Equations 274
6.4 Optimum Finite Impulse Response Filters 278
    6.4.1 Design and Properties / 6.4.2 Optimum FIR Filters for Stationary Processes / 6.4.3 Frequency-Domain Interpretations
6.5 Linear Prediction 286
    6.5.1 Linear Signal Estimation / 6.5.2 Forward Linear Prediction / 6.5.3 Backward Linear Prediction / 6.5.4 Stationary Processes / 6.5.5 Properties
6.6 Optimum Infinite Impulse Response Filters 295
    6.6.1 Noncausal IIR Filters / 6.6.2 Causal IIR Filters / 6.6.3 Filtering of Additive Noise / 6.6.4 Linear Prediction Using the Infinite Past—Whitening
6.7 Inverse Filtering and Deconvolution 306
6.8 Channel Equalization in Data Transmission Systems 310
    6.8.1 Nyquist’s Criterion for Zero ISI / 6.8.2 Equivalent Discrete-Time Channel Model / 6.8.3 Linear Equalizers / 6.8.4 Zero-Forcing Equalizers / 6.8.5 Minimum MSE Equalizers
6.9 Matched Filters and Eigenfilters 319
    6.9.1 Deterministic Signal in Noise / 6.9.2 Random Signal in Noise
6.10 Summary 325
Problems 325
7 Algorithms and Structures for Optimum Linear Filters 333
7.1 Fundamentals of Order-Recursive Algorithms 334
    7.1.1 Matrix Partitioning and Optimum Nesting / 7.1.2 Inversion of Partitioned Hermitian Matrices / 7.1.3 Levinson Recursion for the Optimum Estimator / 7.1.4 Order-Recursive Computation of the LDL^H Decomposition / 7.1.5 Order-Recursive Computation of the Optimum Estimate
7.2 Interpretations of Algorithmic Quantities 343
    7.2.1 Innovations and Backward Prediction / 7.2.2 Partial Correlation / 7.2.3 Order Decomposition of the Optimum Estimate / 7.2.4 Gram-Schmidt Orthogonalization
7.3 Order-Recursive Algorithms for Optimum FIR Filters 347
    7.3.1 Order-Recursive Computation of the Optimum Filter / 7.3.2 Lattice-Ladder Structure / 7.3.3 Simplifications for Stationary Stochastic Processes / 7.3.4 Algorithms Based on the UDU^H Decomposition
7.4 Algorithms of Levinson and Levinson-Durbin 355
7.5 Lattice Structures for Optimum FIR Filters and Predictors 361
    7.5.1 Lattice-Ladder Structures / 7.5.2 Some Properties and Interpretations / 7.5.3 Parameter Conversions
7.6 Algorithm of Schür 368
    7.6.1 Direct Schür Algorithm / 7.6.2 Implementation Considerations / 7.6.3 Inverse Schür Algorithm
7.7 Triangularization and Inversion of Toeplitz Matrices 374
    7.7.1 LDL^H Decomposition of Inverse of a Toeplitz Matrix / 7.7.2 LDL^H Decomposition of a Toeplitz Matrix / 7.7.3 Inversion of Real Toeplitz Matrices
7.8 Kalman Filter Algorithm 378
    7.8.1 Preliminary Development / 7.8.2 Development of Kalman Filter
7.9 Summary 387
Problems 389
8 Least-Squares Filtering and Prediction 395
8.1 The Principle of Least Squares 395
8.2 Linear Least-Squares Error Estimation 396
    8.2.1 Derivation of the Normal Equations / 8.2.2 Statistical Properties of Least-Squares Estimators
8.3 Least-Squares FIR Filters 406
8.4 Linear Least-Squares Signal Estimation 411
    8.4.1 Signal Estimation and Linear Prediction / 8.4.2 Combined Forward and Backward Linear Prediction (FBLP) / 8.4.3 Narrowband Interference Cancelation
8.5 LS Computations Using the Normal Equations 416
    8.5.1 Linear LSE Estimation / 8.5.2 LSE FIR Filtering and Prediction
8.6 LS Computations Using Orthogonalization Techniques 422
    8.6.1 Householder Reflections / 8.6.2 The Givens Rotations / 8.6.3 Gram-Schmidt Orthogonalization
8.7 LS Computations Using the Singular Value Decomposition 431
    8.7.1 Singular Value Decomposition / 8.7.2 Solution of the LS Problem / 8.7.3 Rank-Deficient LS Problems
8.8 Summary 438
Problems 439
9 Signal Modeling and Parametric Spectral Estimation 445
9.1 The Modeling Process: Theory and Practice 445
9.2 Estimation of All-Pole Models 449
    9.2.1 Direct Structures / 9.2.2 Lattice Structures / 9.2.3 Maximum Entropy Method / 9.2.4 Excitations with Line Spectra
9.3 Estimation of Pole-Zero Models 462
    9.3.1 Known Excitation / 9.3.2 Unknown Excitation / 9.3.3 Nonlinear Least-Squares Optimization
9.4 Applications 467
    9.4.1 Spectral Estimation / 9.4.2 Speech Modeling
9.5 Minimum-Variance Spectrum Estimation 471
9.6 Harmonic Models and Frequency Estimation Techniques 478
    9.6.1 Harmonic Model / 9.6.2 Pisarenko Harmonic Decomposition / 9.6.3 MUSIC Algorithm / 9.6.4 Minimum-Norm Method / 9.6.5 ESPRIT Algorithm
9.7 Summary 493
Problems 494
10 Adaptive Filters 499
10.1 Typical Applications of Adaptive Filters 500
    10.1.1 Echo Cancelation in Communications / 10.1.2 Equalization of Data Communications Channels / 10.1.3 Linear Predictive Coding / 10.1.4 Noise Cancelation
10.2 Principles of Adaptive Filters 506
    10.2.1 Features of Adaptive Filters / 10.2.2 Optimum versus Adaptive Filters / 10.2.3 Stability and Steady-State Performance of Adaptive Filters / 10.2.4 Some Practical Considerations
10.3 Method of Steepest Descent 516
10.4 Least-Mean-Square Adaptive Filters 524
    10.4.1 Derivation / 10.4.2 Adaptation in a Stationary SOE / 10.4.3 Summary and Design Guidelines / 10.4.4 Applications of the LMS Algorithm / 10.4.5 Some Practical Considerations
10.5 Recursive Least-Squares Adaptive Filters 548
    10.5.1 LS Adaptive Filters / 10.5.2 Conventional Recursive Least-Squares Algorithm / 10.5.3 Some Practical Considerations / 10.5.4 Convergence and Performance Analysis
10.6 RLS Algorithms for Array Processing 560
    10.6.1 LS Computations Using the Cholesky and QR Decompositions / 10.6.2 Two Useful Lemmas / 10.6.3 The QR-RLS Algorithm / 10.6.4 Extended QR-RLS Algorithm / 10.6.5 The Inverse QR-RLS Algorithm / 10.6.6 Implementation of QR-RLS Algorithm Using the Givens Rotations / 10.6.7 Implementation of Inverse QR-RLS Algorithm Using the Givens Rotations / 10.6.8 Classification of RLS Algorithms for Array Processing
10.7 Fast RLS Algorithms for FIR Filtering 573
    10.7.1 Fast Fixed-Order RLS FIR Filters / 10.7.2 RLS Lattice-Ladder Filters / 10.7.3 RLS Lattice-Ladder Filters Using Error Feedback Updatings / 10.7.4 Givens Rotation–Based LS Lattice-Ladder Algorithms / 10.7.5 Classification of RLS Algorithms for FIR Filtering
10.8 Tracking Performance of Adaptive Algorithms 590
    10.8.1 Approaches for Nonstationary SOE / 10.8.2 Preliminaries in Performance Analysis / 10.8.3 The LMS Algorithm / 10.8.4 The RLS Algorithm with Exponential Forgetting / 10.8.5 Comparison of Tracking Performance
10.9 Summary 607
Problems 608
11 Array Processing 621
11.1 Array Fundamentals 622
    11.1.1 Spatial Signals / 11.1.2 Modulation-Demodulation / 11.1.3 Array Signal Model / 11.1.4 The Sensor Array: Spatial Sampling
11.2 Conventional Spatial Filtering: Beamforming 631
    11.2.1 Spatial Matched Filter / 11.2.2 Tapered Beamforming
11.3 Optimum Array Processing 641
    11.3.1 Optimum Beamforming / 11.3.2 Eigenanalysis of the Optimum Beamformer / 11.3.3 Interference Cancelation Performance / 11.3.4 Tapered Optimum Beamforming / 11.3.5 The Generalized Sidelobe Canceler
11.4 Performance Considerations for Optimum Beamformers 652
    11.4.1 Effect of Signal Mismatch / 11.4.2 Effect of Bandwidth
11.5 Adaptive Beamforming 659
    11.5.1 Sample Matrix Inversion / 11.5.2 Diagonal Loading with the SMI Beamformer / 11.5.3 Implementation of the SMI Beamformer / 11.5.4 Sample-by-Sample Adaptive Methods
11.6 Other Adaptive Array Processing Methods 671
    11.6.1 Linearly Constrained Minimum-Variance Beamformers / 11.6.2 Partially Adaptive Arrays / 11.6.3 Sidelobe Cancelers
11.7 Angle Estimation 678
    11.7.1 Maximum-Likelihood Angle Estimation / 11.7.2 Cramér-Rao Lower Bound on Angle Accuracy / 11.7.3 Beamsplitting Algorithms / 11.7.4 Model-Based Methods
11.8 Space-Time Adaptive Processing 683
11.9 Summary 685
Problems 686
12 Further Topics 691
12.1 Higher-Order Statistics in Signal Processing 691
    12.1.1 Moments, Cumulants, and Polyspectra / 12.1.2 Higher-Order Moments and LTI Systems / 12.1.3 Higher-Order Moments of Linear Signal Models
12.2 Blind Deconvolution 697
12.3 Unsupervised Adaptive Filters—Blind Equalizers 702
    12.3.1 Blind Equalization / 12.3.2 Symbol Rate Blind Equalizers / 12.3.3 Constant-Modulus Algorithm
12.4 Fractionally Spaced Equalizers 709
    12.4.1 Zero-Forcing Fractionally Spaced Equalizers / 12.4.2 MMSE Fractionally Spaced Equalizers / 12.4.3 Blind Fractionally Spaced Equalizers
12.5 Fractional Pole-Zero Signal Models 716
    12.5.1 Fractional Unit-Pole Model / 12.5.2 Fractional Pole-Zero Models: FPZ(p, d, q) / 12.5.3 Symmetric α-Stable Fractional Pole-Zero Processes
12.6 Self-Similar Random Signal Models 725
    12.6.1 Self-Similar Stochastic Processes / 12.6.2 Fractional Brownian Motion / 12.6.3 Fractional Gaussian Noise / 12.6.4 Simulation of Fractional Brownian Motions and Fractional Gaussian Noises / 12.6.5 Estimation of Long Memory / 12.6.6 Fractional Lévy Stable Motion
12.7 Summary 741
Problems 742
Appendix A Matrix Inversion Lemma 745
Appendix B Gradients and Optimization in Complex Space 747
B.1 Gradient 747
B.2 Lagrange Multipliers 749
Appendix C MATLAB Functions 753
Appendix D Useful Results from Matrix Algebra 755
D.1 Complex-Valued Vector Space 755
    Some Definitions
D.2 Matrices 756
    D.2.1 Some Definitions / D.2.2 Properties of Square Matrices
D.3 Determinant of a Square Matrix 760
    D.3.1 Properties of the Determinant / D.3.2 Condition Number
D.4 Unitary Matrices 762
    D.4.1 Hermitian Forms after Unitary Transformations / D.4.2 Significant Integral of Quadratic and Hermitian Forms
D.5 Positive Definite Matrices 764
Appendix E Minimum Phase Test for Polynomials 767
Bibliography 769
Index 787
