Information-theoretic limits on sparse signal recovery: dense versus sparse measurement matrices

Report Number
754
Authors
Wei Wang, Martin J. Wainwright and Kannan Ramchandran
Abstract

We study the information-theoretic limits of exactly recovering the support of a sparse signal using noisy projections defined by various classes of measurement matrices. Our analysis is high-dimensional in nature, in the sense that the number of observations $n$, the ambient signal dimension $p$, and the signal sparsity $k$ are all allowed to tend to infinity in a general manner. This paper makes two novel contributions. First, we provide sharper necessary conditions for exact support recovery using general (non-Gaussian) dense measurement matrices. Combined with previously known sufficient conditions, this result yields sharp characterizations of when the optimal decoder can recover the support of a signal for various scalings of the sparsity $k$ and sample size $n$, including the important special case of linear sparsity ($k = \Theta(p)$) using a linear scaling of observations ($n = \Theta(p)$). Our second contribution is to prove necessary conditions on the number of observations $n$ required for asymptotically reliable recovery using a class of $\gamma$-sparsified measurement matrices, where the measurement sparsity $\gamma(n, p, k) \in (0,1]$ corresponds to the fraction of non-zero entries per row. Our analysis allows general scaling of the quadruplet $(n, p, k, \gamma)$, and reveals three different regimes, corresponding to whether measurement sparsity has no effect, a minor effect, or a dramatic effect on the information-theoretic limits of the subset recovery problem.
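To make the setting concrete, the display below sketches the linear observation model and success criterion that exact support recovery usually refers to; the additive Gaussian noise $w$ and the decoder notation $\hat{S}$ are assumptions introduced here for illustration, not taken verbatim from the report.
\[
y = X\beta^* + w, \qquad X \in \mathbb{R}^{n \times p}, \quad w \sim N(0, \sigma^2 I_n),
\]
\[
S(\beta^*) = \{\, i \in \{1, \dots, p\} : \beta^*_i \neq 0 \,\}, \qquad |S(\beta^*)| = k,
\]
where a decoder $\hat{S} = \hat{S}(y, X)$ succeeds exactly when $\hat{S} = S(\beta^*)$. Under this reading, a $\gamma$-sparsified measurement matrix is one whose rows each contain a fraction $\gamma(n, p, k) \in (0,1]$ of non-zero entries, with $\gamma = 1$ corresponding to the dense case.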
