The problem of recovering sparse signals from a limited number of measurements is now ubiquitous in signal processing, statistics, and machine learning. A fundamental question is what can and cannot be recovered in the presence of noise. This thesis provides a sharp characterization for the task of sparsity pattern recovery (also known as support recovery). Using tools from information theory, we find a sharp separation into two problem regimes -- one in which the problem is fundamentally noise-limited, and a more interesting one in which the problem is limited by the behavior of the sparse components themselves. This analysis allows us to identify settings where existing computationally efficient algorithms, such as the LASSO or approximate message passing, are optimal, as well as other settings where these algorithms are highly suboptimal. We compare our results to predictions of phase transitions made via the powerful but heuristic replica method, and find that our rigorous bounds confirm some of these predictions.
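The support-recovery task described above can be illustrated with a small sketch (not the thesis's own experiments): generate a k-sparse signal, take noisy measurements with an i.i.d. Gaussian sampling matrix, and estimate the support via the LASSO, solved here with plain iterative soft-thresholding (ISTA). All dimensions, amplitudes, and the regularization level are illustrative assumptions.

```python
import numpy as np

# Hypothetical toy instance: n measurements, ambient dimension p, k-sparse signal.
rng = np.random.default_rng(0)
n, p, k = 80, 200, 5

A = rng.standard_normal((n, p)) / np.sqrt(n)      # i.i.d. Gaussian sampling matrix
support = rng.choice(p, size=k, replace=False)     # true sparsity pattern
x_true = np.zeros(p)
x_true[support] = 2.0 * rng.choice([-1.0, 1.0], size=k)  # strong nonzero components
y = A @ x_true + 0.01 * rng.standard_normal(n)     # low-noise measurements

# ISTA: proximal-gradient iterations for the LASSO objective
#   min_x 0.5 * ||y - A x||^2 + lam * ||x||_1
lam = 0.05
L = np.linalg.norm(A, 2) ** 2                      # Lipschitz constant of the gradient
x = np.zeros(p)
for _ in range(500):
    z = x - (A.T @ (A @ x - y)) / L                # gradient step
    x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-threshold step

# Declare the support to be the entries above a small threshold.
recovered = np.flatnonzero(np.abs(x) > 0.1)
print(sorted(recovered.tolist()))
```

In this easy regime (strong components, low noise) the LASSO recovers the support; the regimes characterized in the thesis describe when such recovery becomes fundamentally harder.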

The remainder of the thesis explores extensions of our bounds to various scenarios. We consider specially structured sampling matrices and show how such additional structure can make a key difference, analogous to the role of diversity in wireless communications. Finally, we illustrate how the new bounding techniques introduced in this thesis can be used to establish information-theoretic secrecy results for certain communication channel models that involve eavesdroppers.



