CAAM 654 - Topics in Optimization (Spring 2015)

Sparse Structure Recovery: Theory and Computation


Instructor: Paul E. Hand

Office: Duncan 3086

Email: hand [at]

Lectures: MW, 4-5:30pm in Rayzor 106

Office Hours: Mondays 1-3pm and by appointment


In many signal recovery problems, one attempts to recover an n-dimensional signal from as few measurements as possible. If the signal has some structure, such as sparsity, then it can sometimes be recovered with many fewer than n measurements. The recovery process often involves l1 minimization or semidefinite programming. This course will focus on rigorous recovery theorems and computational algorithms for a variety of structured signal recovery problems. The objective of this course is to familiarize students with the important ideas and proof techniques in sparse structure recovery.
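As a quick illustration of the kind of recovery the course studies: the l1-minimization problem (basis pursuit) min ||x||_1 subject to Ax = b can be rewritten as a linear program and solved with off-the-shelf software. A minimal sketch in Python using NumPy and SciPy (not part of the course materials; the dimensions, sparsity level, and random seed below are arbitrary choices for the demonstration):

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
m, n, k = 40, 100, 5                 # measurements, ambient dimension, sparsity

# Random Gaussian measurement matrix and a k-sparse signal.
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x_true[support] = rng.standard_normal(k)
b = A @ x_true

# Basis pursuit:  min ||x||_1  s.t.  Ax = b.
# LP reformulation in variables (x, t):  min sum(t)  s.t.  Ax = b,  -t <= x <= t.
c = np.concatenate([np.zeros(n), np.ones(n)])
A_eq = np.hstack([A, np.zeros((m, n))])
A_ub = np.vstack([np.hstack([np.eye(n), -np.eye(n)]),     #  x - t <= 0
                  np.hstack([-np.eye(n), -np.eye(n)])])   # -x - t <= 0
b_ub = np.zeros(2 * n)
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b,
              bounds=[(None, None)] * n + [(0, None)] * n)
x_hat = res.x[:n]
print("recovery error:", np.linalg.norm(x_hat - x_true))
```

With these parameters (m well above the sparsity level), the l1 solution typically coincides with the true sparse signal; the recovery theorems covered in the course (null space property, RIP) make precise when this is guaranteed.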


Books: This course will cover Chapter 1 and Chapter 5 of Compressed Sensing: Theory and Applications by Eldar and Kutyniok, and Chapter 7 of Sparse Image and Signal Processing by Starck, Murtagh, and Fadili. It will also cover papers that are freely available from the arXiv.

Prerequisites: Familiarity with Matlab, Analysis (CAAM 501 or MATH 302 or similar), Optimization (CAAM 571 or similar), Probability Theory (e.g. multivariate Gaussians, Chi-squared random variables, central limit theorems)

Class structure and grading: Before each day of class, the instructor will assign book sections and/or research papers for you to read. Accompanying the reading will be a set of questions. You will be expected to complete the reading and write out answers to each of these questions by the beginning of class. Class will then be a discussion of these questions and the reading. You will be expected to share your thoughts and questions regarding the readings in class. Students taking the class for 3 credits are expected to (1) submit well-thought-out answers to the assigned questions; (2) present their answers upon request; and (3) participate in class discussions. The grade will be based on the degree of honest effort in all three of these aspects. Students enrolling for 1 credit are expected to attend class.

Disabilities: Any student with a disability needing academic accommodations is requested to speak with me as soon as possible. All discussions will remain confidential. Students should also contact Disability Support Services in the Ley Student Center.



Topics and dates are tentative
Day    | Topics                                              | Reading Assignment              | Class Notes
Jan 12 | Intro to compressed sensing (CS), least squares     |                                 | Notes
Jan 14 | Spark, coherence                                    | Chapter 1. Questions.           | Notes
Jan 21 | Null space property                                 | Blog post. Questions            | Notes
Jan 26 | NSP and not-exactly-sparse signals                  |                                 | Notes
Jan 28 | RIP recovery guarantees                             | Questions                       | Notes
Feb 2  | Gaussian matrices have RIP                          | Questions                       | Notes
Feb 4  | Singular values of Gaussian matrices                | Questions                       | Notes
Feb 9  | Singular values of matrices with subgaussian rows   |                                 | Notes
Feb 16 | Heavy-tailed random variables                       |                                 | Notes
Feb 18 | Singular values of matrices with heavy-tailed rows  |                                 | Notes
Feb 23 | Dual certification and Fuchs's criterion            | Reading. Questions              | Notes
Mar 9  | Gaussian widths                                     |                                 | Notes
Mar 16 | Gradient descent and proximal operators             |                                 | Notes
Mar 18 | Dual methods and ADMM                               | Boyd's slides on ADMM           | Notes
Mar 23 | Greedy algorithms for CS                            | Chapter 8 of Eldar and Kutyniok | Notes
Mar 25 | Phase retrieval                                     |                                 | Notes
Apr 6  | Phase retrieval                                     |                                 | Notes
Apr 8  | Phase retrieval                                     |                                 | Notes
Apr 15 | Phase retrieval                                     |                                 | Notes
Apr 20 | Sparse PCA                                          |                                 |
Apr 22 | Sparse PCA                                          | Reading and Questions           |