Information-theoretic limits on sparsity recovery in the high-dimensional and noisy setting

Report Number
725
Authors
Martin J. Wainwright
Abstract

The problem of recovering the sparsity pattern of a fixed but unknown vector $\beta^* \in \mathbb{R}^p$ based on a set of $n$ noisy observations arises in a variety of settings, including subset selection in regression, graphical model selection, signal denoising, compressive sensing, and constructive approximation. Of interest are conditions on the model dimension $p$, the sparsity index $s$ (the number of non-zero entries in $\beta^*$), and the number of observations $n$ that are necessary and/or sufficient to ensure asymptotically perfect recovery of the sparsity pattern. This paper focuses on the information-theoretic limits of sparsity recovery: in particular, for a noisy linear observation model based on measurement vectors drawn from the standard Gaussian ensemble, we derive both a set of sufficient conditions for asymptotically perfect recovery using the optimal decoder, and a set of necessary conditions that \emph{any} decoder, regardless of its computational complexity, must satisfy for perfect recovery. This analysis of optimal decoding limits complements our previous work~\cite{Wainwright06a} on sharp thresholds for sparsity recovery using the Lasso ($\ell_1$-constrained quadratic programming) with Gaussian measurement ensembles.
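For concreteness, the observation model in this line of work takes the form $y = X\beta^* + w$, where $X \in \mathbb{R}^{n \times p}$ has i.i.d. standard Gaussian entries and $w$ is additive Gaussian noise. The sketch below is an illustration under these assumptions, not code accompanying the report; the problem sizes $(p, s, n)$, the noise level, and the unit signal values are arbitrary choices. It simulates the model and runs a brute-force decoder that searches over all size-$s$ supports, i.e., the kind of computationally unbounded procedure whose statistical limits the abstract refers to.

```python
import numpy as np
from itertools import combinations

# Illustrative sketch (assumed setup, not from the paper): noisy linear
# observations y = X beta* + w with X from the standard Gaussian ensemble.
rng = np.random.default_rng(0)
p, s, n, sigma = 16, 2, 12, 0.25  # arbitrary toy sizes and noise level

# Unknown s-sparse vector beta* with a randomly chosen support;
# unit entries are an arbitrary choice of minimum signal strength.
beta_star = np.zeros(p)
support = rng.choice(p, size=s, replace=False)
beta_star[support] = 1.0

# Standard Gaussian measurement ensemble plus Gaussian noise.
X = rng.standard_normal((n, p))
y = X @ beta_star + sigma * rng.standard_normal(n)

def brute_force_decode(X, y, s):
    """Exhaustive-search decoder: for each candidate size-s support S,
    project y onto the span of the columns X[:, S] and return the support
    with the smallest residual. Exponential in p, hence feasible only at
    toy sizes -- which is why the object of study is its information-
    theoretic limits rather than its runtime."""
    _, p = X.shape
    best_support, best_residual = None, np.inf
    for S in combinations(range(p), s):
        XS = X[:, list(S)]
        coef, *_ = np.linalg.lstsq(XS, y, rcond=None)
        residual = np.linalg.norm(y - XS @ coef)
        if residual < best_residual:
            best_support, best_residual = set(S), residual
    return best_support

# True when (n, p, s) and the signal strength permit exact recovery.
print(brute_force_decode(X, y, s) == set(support))
```

Since the search visits all $\binom{p}{s}$ candidate supports, this decoder serves only to make the notion of "optimal decoding, regardless of computational complexity" concrete.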
