Abstract
In this paper, we first study ℓq minimization and its associated iterative reweighted algorithm for recovering sparse vectors. Unlike most existing work, we focus on unconstrained ℓq minimization, for which we demonstrate several advantages in the presence of noisy measurements and/or approximately sparse vectors. Inspired by the results in [Daubechies et al., Comm. Pure Appl. Math., 63 (2010), pp. 1-38] for constrained ℓq minimization, we begin with a preliminary yet novel analysis of unconstrained ℓq minimization, covering convergence, error bounds, and local convergence behavior. The algorithm and analysis are then extended to the recovery of low-rank matrices. The algorithms for both vector and matrix recovery are compared with several state-of-the-art algorithms and show superior performance in recovering sparse vectors and low-rank matrices.
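The iterative reweighted approach described above can be illustrated with a short sketch. The following is a hypothetical NumPy implementation of iterative reweighted least squares (IRLS) for the smoothed unconstrained objective min_x ‖Ax − b‖₂² + λ Σᵢ (xᵢ² + ε²)^(q/2); the function name, parameter choices, and the ε-shrinking schedule are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def irls_lq_unconstrained(A, b, q=0.5, lam=1e-4, eps0=1.0, iters=50):
    """IRLS sketch for the smoothed unconstrained lq problem
        min_x ||Ax - b||_2^2 + lam * sum_i (x_i^2 + eps^2)^(q/2).
    Hypothetical illustration; not the paper's exact algorithm."""
    x = np.linalg.lstsq(A, b, rcond=None)[0]   # least-squares initialization
    AtA, Atb = A.T @ A, A.T @ b
    eps = eps0
    for _ in range(iters):
        # Weights from the quadratic majorizer of (x_i^2 + eps^2)^(q/2)
        w = (x**2 + eps**2) ** (q / 2 - 1)
        # Stationarity of the majorized objective gives the linear system
        #   (A^T A + (lam*q/2) * diag(w)) x = A^T b
        x = np.linalg.solve(AtA + 0.5 * lam * q * np.diag(w), Atb)
        eps = max(0.7 * eps, 1e-8)             # gradually shrink smoothing
    return x
```

As ε shrinks, entries near zero receive increasingly large weights and are driven toward zero, while large entries are penalized only mildly — the mechanism by which reweighting promotes sparsity for q < 1.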
| Original language | English (US) |
|---|---|
| Pages (from-to) | 927-957 |
| Number of pages | 31 |
| Journal | SIAM Journal on Numerical Analysis |
| Volume | 51 |
| Issue number | 2 |
| DOIs | |
| State | Published - 2013 |
| Externally published | Yes |
Keywords
- Compressed sensing
- Iterative reweighted least squares
- Low-rank matrix recovery
- Matrix completion
- Sparse optimization
- Sparse vector recovery
- ℓq minimization