Near-optimal recovery using iterative algorithms in high-dimensional regression with random designs
Antony Joseph

Yale University

Thursday, February 2, 2012 - 3:30pm

We provide theoretical analysis of iterative algorithms for two problems in high-dimensional regression. In the first, a sparse linear model with a specific coefficient structure provides a statistical framework for a problem in communication. We show that an iterative algorithm achieves performance that is near-optimal when compared to information-theoretic limits. This yields communication schemes with provable guarantees and low computational complexity, built on our statistical framework. In the second, we analyze the Orthogonal Matching Pursuit algorithm for the general statistical problem of high-dimensional recovery. We show that the thresholds for recovery are similar to those established in recent results on the Lasso. We also establish oracle inequalities for estimation under l_2-loss.
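
As a rough illustration of the second part of the talk, the following is a minimal sketch of Orthogonal Matching Pursuit for sparse linear regression with a random design. It is not the speaker's analysis or exact setup; the function name, dimensions, sparsity level, and noise level are illustrative assumptions.

```python
# Minimal sketch of Orthogonal Matching Pursuit (OMP), for illustration only.
import numpy as np

def omp(X, y, max_iter):
    """Greedy support recovery for y ~ X @ beta with at most max_iter nonzeros."""
    n, p = X.shape
    residual = y.copy()
    support = []
    beta = np.zeros(p)
    coef = np.zeros(0)
    for _ in range(max_iter):
        # Select the column most correlated with the current residual.
        correlations = np.abs(X.T @ residual)
        correlations[support] = -np.inf          # do not reselect columns
        support.append(int(np.argmax(correlations)))
        # Refit by least squares on the selected support (the "orthogonal" step).
        coef, *_ = np.linalg.lstsq(X[:, support], y, rcond=None)
        residual = y - X[:, support] @ coef
    beta[support] = coef
    return beta, support

# Hypothetical example: random Gaussian design, 5-sparse coefficient vector.
rng = np.random.default_rng(0)
n, p, k = 200, 1000, 5
X = rng.standard_normal((n, p)) / np.sqrt(n)
beta_true = np.zeros(p)
beta_true[rng.choice(p, k, replace=False)] = rng.standard_normal(k) + 3.0
y = X @ beta_true + 0.1 * rng.standard_normal(n)
beta_hat, support = omp(X, y, max_iter=k)
print(sorted(support), np.flatnonzero(beta_true).tolist())
```

The least-squares refit on the growing support is what distinguishes OMP from plain matching pursuit; the questions studied in the talk concern when such greedy selection recovers the true support and how well the resulting estimate performs under l_2-loss.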