The course focuses on basic methods for processing and analyzing data with deterministic and probabilistic tools. It serves as a preparatory course for deep learning and convolutional neural networks. This is a 3.0-credit course.
The course will explain how to digitize signals and data in a computer, how to represent them in different bases, and how to use these representations efficiently for various signal processing tasks.
Covered topics include signal sampling and quantization with bit allocation; system and data representations, including but not limited to the Fourier representation and its optimality; functional maps; convolutions; compression; dimensionality reduction and principal component analysis; and filtering-based restoration of blurred data, deterministic or randomly distributed, with or without random noise. Signals and systems are analyzed in both the continuous and the discrete settings.
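As a small taste of the sampling and quantization topic, the sketch below samples a sine wave and quantizes it uniformly to a few bits. It is an illustrative example only, not course material; the signal, sampling rate, and bit depth (fs, b, etc.) are arbitrary choices for the demonstration.

```python
import numpy as np

# Sample a 5 Hz sine at fs = 100 Hz for one second (illustrative values).
fs = 100                          # sampling rate in Hz
t = np.arange(0, 1, 1 / fs)       # sampling instants
x = np.sin(2 * np.pi * 5 * t)     # sampled signal, amplitudes in [-1, 1]

# Uniform quantization to b bits: map [-1, 1] onto 2**b integer codes,
# then map the codes back to amplitudes.
b = 3                             # bits per sample
levels = 2 ** b                   # number of quantization levels
codes = np.round((x + 1) / 2 * (levels - 1)).astype(int)
x_q = codes / (levels - 1) * 2 - 1

# The quantization error is bounded by half a quantization step.
max_err = np.max(np.abs(x - x_q))
```

With b = 3 the step size is 2 / (levels - 1), so the worst-case error stays below 1 / (levels - 1); increasing b shrinks this bound exponentially.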
The lectures can be taught in English upon request. The tutorials are taught in English.
This course requires extensive use of linear algebra, calculus, and probability, along with some basic knowledge of differential equations.
There will be 4 homework assignments, published approximately every 3 weeks during the semester, each consisting of theoretical and programming questions. Theoretical questions require rigorous mathematical proofs and reasoning; programming must be done in Python. Submission is in pairs only.
The course grade will be based on the 4 homework assignments (30%) and a final exam (70%).