EE4C03 Statistical digital signal processing and modeling

Introduction

This is a second course in discrete-time signal processing, with a focus on random signals. It provides a comprehensive treatment of signal processing algorithms for modeling discrete-time signals, designing optimum filters, estimating the power spectrum of a random process, and implementing adaptive filters. These topics are frequently encountered in professional engineering and in major applications such as digital communication, array processing, biomedical signal processing, and multimedia (speech and audio processing, image processing).

The course provides a framework that connects signal models to filter structures and formulates filter design as an optimization problem, which is then solved using linear algebra techniques applied to structured matrices. The connections between these topics are strong and provide insights that can also be used in other disciplines.
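
As a small preview of this framework (anticipating the material of Chapters 3 and 4 of the book), the Matlab fragment below fits an all-pole (AR) model to a data record by estimating autocorrelations and solving the Toeplitz-structured Yule-Walker normal equations. It is only a sketch: the example signal, the model order p, and the variable names are illustrative assumptions, not part of the course material.

    % Minimal sketch: AR(p) modeling via the Yule-Walker normal equations.
    % The example signal and the model order p are assumptions for illustration.
    N = 1000;
    x = filter(1, [1 -0.9 0.5], randn(N,1));   % example AR(2) data (assumed)
    p = 2;                                     % assumed model order

    r = zeros(p+1,1);                          % biased autocorrelation estimates r(0..p)
    for k = 0:p
        r(k+1) = (x(1:N-k)' * x(1+k:N)) / N;
    end

    R  = toeplitz(r(1:p));                     % structured (Toeplitz) matrix
    a  = -R \ r(2:p+1);                        % Yule-Walker: R*a = -[r(1);...;r(p)]
    s2 = r(1) + r(2:p+1)' * a;                 % driving-noise variance estimate
    A  = [1; a];                               % model polynomial A(z) = 1 + a(1) z^-1 + a(2) z^-2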

The course treats:

  • Brief refresher on DSP, linear algebra and random processes;
  • Linear prediction, parametric methods such as Pade approximation, Prony's method and ARMA models;
  • The Yule-Walker equations;
  • Wiener and Kalman filtering;
  • Spectrum estimation (nonparametric and parametric), frequency estimation (Pisarenko, MUSIC algorithm);
  • Adaptive filtering (LMS, RLS).

The course includes a take-home lab assignment (size 1 EC = 28 hours), which can be done in groups of 2 students. Several track-dependent assignments are offered. Furthermore, Matlab computer exercises are available, some of which will be carried out in class.

The course complements ET4386 Estimation and Detection and ET4147 Signal Processing for Communications.

This is a common course for the MSc EE (all tracks).

Preliminary knowledge

To follow the course with profit, you will need the background knowledge provided by an elementary course in Signals and Systems; in particular, you need to know what a discrete-time Fourier transform (DTFT) and a z-transform are, and their properties. This can be found, e.g., in J.G. Proakis and D.G. Manolakis, Digital Signal Processing (Prentice Hall, 2007), chapters 2-4. You may consult the course EE2S11 Signals and Systems (video lectures available in English). In addition, you need basic notions of random signals, as covered in the course EE2S31 Signal Processing, and of Linear Algebra.

Exam

The exam is written, open-book. For the exam, you can bring the book (or a print-out of the pdf) and copies of the slides. No written notes or other materials are allowed.

The lab assignment is completed with a compact report. Passing the lab assignment is compulsory for the exam grade to become valid. Moreover, the assignment is graded and counts for 20% of your final grade.

Lab assignments

The course contains a compulsory lab assignment worth 1 EC (28 hours, 20% of your final grade). The assignment is done in groups of 2 students. Several track-dependent assignments are offered, which can be found on Brightspace. Indicate your selection via the 'group enroll' button in Brightspace.

Each assignment typically consists of a problem description and a data set. Using the tools offered in the course, you develop Matlab algorithms to 'solve' the problem. The assignment is concluded with a short lab report. (If you wish, you could probably extend this report into a more elaborate essay for EE4C01 Introduction to EE.) General remarks are:

  • Make sure that the report has a connection to the course. The course title contains 'statistical' and 'modeling': you are expected to link the practical problem to the modeling and analysis techniques covered by the course, i.e., to follow a structured approach.
  • Make sure to describe the data model. Given a data set, how do you model it? What are the random parameters, what are the model assumptions? (Also, define/explain notation, and use it consistently.)
  • Given the data model, what are possible approaches to estimate its parameters? This is linked to the properties of the data model. Often, you will be able to find algorithms related to the specified problem in literature, but take a step back and think of alternatives.
  • When using an algorithm, consider also its properties. E.g., when using an FFT, what is the expected resolution and the variance of the spectrum estimate, and is that sufficient for the purpose? What are the parameters in the algorithm and how did you select them? Are the Matlab graphs consistent with theory (= the model)? Do you need to apply windowing? (A small sketch of such checks is given after this list.)
  • Use a clear and concise reporting style: clear explanations, consistent use of notation, explanation of the graphs (what is shown, what do you observe, what do you conclude), conclusions and reference list.
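
As an illustration of the point on algorithm properties (resolution, variance, windowing), the Matlab sketch below compares a raw periodogram with an averaged, windowed (Welch-style) estimate. It is only a sketch: the example signal, the segment length, the overlap, and the window are illustrative assumptions; in your report, motivate your own choices for such parameters.

    % Minimal sketch (assumed example data): raw periodogram versus an
    % averaged, windowed periodogram, exposing the resolution/variance trade-off.
    N = 4096;
    n = (0:N-1)';
    x = cos(2*pi*0.100*n) + cos(2*pi*0.101*n) + randn(N,1);  % two closely spaced tones in noise (assumed)

    Pxx_raw = abs(fft(x)).^2 / N;                 % periodogram: resolution ~1/N, but high variance

    L = 512;  D = L/2;                            % assumed segment length, 50% overlap
    w = 0.54 - 0.46*cos(2*pi*(0:L-1)'/(L-1));     % Hamming window (reduces leakage)
    K = floor((N - L)/D) + 1;                     % number of segments
    Pxx_avg = zeros(L,1);
    for k = 0:K-1
        seg = x(k*D + (1:L)) .* w;
        Pxx_avg = Pxx_avg + abs(fft(seg)).^2 / (w'*w);
    end
    Pxx_avg = Pxx_avg / K;                        % averaging lowers the variance; resolution drops to ~1/L

    f_raw = (0:N-1)'/N;  f_avg = (0:L-1)'/L;      % normalized frequencies
    plot(f_raw, 10*log10(Pxx_raw), f_avg, 10*log10(Pxx_avg));
    xlim([0 0.5]); xlabel('normalized frequency'); ylabel('dB');
    legend('raw periodogram', 'averaged, windowed');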

Book

Monson H. Hayes, "Statistical Digital Signal Processing and Modeling", John Wiley and Sons, New York, 1996. ISBN: 0-471-59431-8.

Collegerama

The course was videotaped in 2015 for viewing in Collegerama. You can view these older episodes using the direct links in the table below.

(The first recording, EE4C03_01, failed to capture some of the slides. The recording EE4C03_07 misses the first minutes due to video recording problems. The recording of EE4C03_10 was made in 2016. In 2017, an additional introductory class on random processes was inserted; it has not been videotaped and may overlap with EE4C03_03.)

Instructors

dr. Geethu Joseph (GJ), prof.dr.ir. Geert Leus (GL), prof.dr.ir. Alle-Jan van der Veen (AJ).

Schedule

The schedule for 2023 is as follows. Classes are on Wednesdays 13:45-15:30 and Fridays 8:45-10:30. The slides as well as the latest recordings can be found on Brightspace.


No. | Date | Instructor | Topic | Book | Collegerama
0. | Wed 6 Sep | GJ | Refresher linear algebra | Ch.2 | EE4C03_02, 2020_02
1. | Fri 8 Sep | GJ | Test on linear algebra; Course introduction | |
2. | Wed 13 Sep | GJ | Background: z-transform, DTFT principles, optimization, ... | | EE4C03_01, 2020_01
3. | Fri 15 Sep | GJ | Refresher random processes: power spectra, spectral factorization, Yule-Walker equations | Ch.3 | EE4C03_02
4. | Wed 20 Sep | GL | Signal modeling (deterministic): Pade, Prony | Ch.4.1-4.4, 4.6 | EE4C03_03
5. | Fri 22 Sep | GL | Signal modeling (stochastic): all-pole modeling, ARMA models | Ch.4.7 | EE4C03_04
6. | Wed 27 Sep | GL | Examples and exercises | | EE4C03_05, 2020_05
7. | Fri 29 Sep | GL/GJ | Trial exam questions + computer exercises: Pade, Prony, ARMA | |
8. | Wed 4 Oct | GJ | Nonparametric spectrum estimation | Ch.8.2 (skip 8.2.6) | EE4C03_08
9. | Fri 6 Oct | GJ | Minimum variance spectrum estimation, parametric spectrum estimation, frequency estimation: Pisarenko, MUSIC | Ch.8.3, 8.5, 8.6 | EE4C03_09, 2020_09
10. | Wed 11 Oct | GJ | Examples and exercises | |
11. | Fri 13 Oct | GL/GJ | Trial exam questions + computer exercises: nonparametric spectrum estimation, minimum variance spectrum estimation, MUSIC | |
12. | Wed 18 Oct | GL | Optimal FIR filtering: the Wiener filter, prediction, deconvolution, ... | Ch.7 (skip 7.4) | EE4C03_10
13. | Fri 20 Oct | GL | Adaptive filters: LMS | Ch.9.1, 9.2 (skip 9.2.7, 9.2.8) | EE4C03_11
14. | Wed 25 Oct | GL | Adaptive filters: RLS, the Kalman filter | Ch.9.4; Ch.7.4 | EE4C03_12
15. | Fri 27 Oct | GL | Examples and exercises | | EE4C03_13

Exams

Note that the exams are open book, but you must be very familiar with the material to be able to solve the questions in time. There are 3 types of questions: (1) apply the theory directly, e.g., compute a covariance sequence; (2) show more in-depth insight into one aspect of the theory, e.g., prove a certain property; (3) extend or apply the theory to a new situation. Train by solving many exercise questions from the book as well as previous exams.

Exercises

The book contains many exercises. Below is a list of suggested problems. A pdf of the Solutions Manual can probably be found on the internet.
Chapter 3: 3.2; 3.3; 3.8; 3.11; 3.13; 3.25
Chapter 4: 4.1; 4.2; 4.4; 4.5; 4.12; 4.14; 4.18; 4.20; 4.23
Chapter 5: 5.5; 5.6; 5.8; 5.11; 5.14; 5.18; 5.20
Chapter 7: 7.2; 7.5; (7.7; 7.12); 7.15; 7.17; 7.18 ; 7.20
Chapter 8: 8.1; 8.2; 8.3; 8.5; 8.22 (b), (c)
Chapter 9: 9.1; 9.3; 9.7; 9.8; 9.10; 9.11; 9.16; 9.17; 9.19