TY - GEN
T1 - Scaled gradients on Grassmann manifolds for matrix completion
AU - Ngo, Thanh T.
AU - Saad, Yousef
PY - 2012
Y1 - 2012
N2 - This paper describes gradient methods based on a scaled metric on the Grassmann manifold for low-rank matrix completion. The proposed methods significantly improve on canonical gradient methods, especially on ill-conditioned matrices, while maintaining established global convergence and exact recovery guarantees. A connection between a form of subspace iteration for matrix completion and the scaled gradient descent procedure is also established. The proposed conjugate gradient method based on the scaled gradient outperforms several existing algorithms for matrix completion and is competitive with recently proposed methods.
AB - This paper describes gradient methods based on a scaled metric on the Grassmann manifold for low-rank matrix completion. The proposed methods significantly improve on canonical gradient methods, especially on ill-conditioned matrices, while maintaining established global convergence and exact recovery guarantees. A connection between a form of subspace iteration for matrix completion and the scaled gradient descent procedure is also established. The proposed conjugate gradient method based on the scaled gradient outperforms several existing algorithms for matrix completion and is competitive with recently proposed methods.
UR - http://www.scopus.com/inward/record.url?scp=84877732145&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84877732145&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:84877732145
SN - 9781627480031
T3 - Advances in Neural Information Processing Systems
SP - 1412
EP - 1420
BT - Advances in Neural Information Processing Systems 25
T2 - 26th Annual Conference on Neural Information Processing Systems 2012, NIPS 2012
Y2 - 3 December 2012 through 6 December 2012
ER -