TY - GEN

T1 - Decentralized low-rank matrix completion

AU - Ling, Qing

AU - Xu, Yangyang

AU - Yin, Wotao

AU - Wen, Zaiwen

PY - 2012/10/23

Y1 - 2012/10/23

N2 - This paper introduces algorithms for the decentralized low-rank matrix completion problem. Assume a low-rank matrix W = [W_1, W_2, ..., W_L]. In a network, each agent ℓ observes some entries of W_ℓ. To recover the unobserved entries of W via decentralized computation, we factorize the unknown matrix W as the product of a public matrix X, common to all agents, and a private matrix Y = [Y_1, Y_2, ..., Y_L], where Y_ℓ is held by agent ℓ. Each agent ℓ alternately updates Y_ℓ and its local estimate of X while communicating with its neighbors toward a consensus on the estimate. Once this consensus is (nearly) reached throughout the network, each agent ℓ recovers W_ℓ = X Y_ℓ, and thus W is recovered. The communication cost is scalable in the number of agents, and W_ℓ and Y_ℓ are kept private to agent ℓ to a certain extent. The algorithm is accelerated by extrapolation and compares favorably with the centralized code in terms of recovery quality and robustness to rank overestimation.

AB - This paper introduces algorithms for the decentralized low-rank matrix completion problem. Assume a low-rank matrix W = [W_1, W_2, ..., W_L]. In a network, each agent ℓ observes some entries of W_ℓ. To recover the unobserved entries of W via decentralized computation, we factorize the unknown matrix W as the product of a public matrix X, common to all agents, and a private matrix Y = [Y_1, Y_2, ..., Y_L], where Y_ℓ is held by agent ℓ. Each agent ℓ alternately updates Y_ℓ and its local estimate of X while communicating with its neighbors toward a consensus on the estimate. Once this consensus is (nearly) reached throughout the network, each agent ℓ recovers W_ℓ = X Y_ℓ, and thus W is recovered. The communication cost is scalable in the number of agents, and W_ℓ and Y_ℓ are kept private to agent ℓ to a certain extent. The algorithm is accelerated by extrapolation and compares favorably with the centralized code in terms of recovery quality and robustness to rank overestimation.

KW - decentralized algorithm

KW - low-rank matrix completion

KW - matrix factorization

KW - privacy protection

UR - http://www.scopus.com/inward/record.url?scp=84867595624&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84867595624&partnerID=8YFLogxK

U2 - 10.1109/ICASSP.2012.6288528

DO - 10.1109/ICASSP.2012.6288528

M3 - Conference contribution

AN - SCOPUS:84867595624

SN - 9781467300469

T3 - ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings

SP - 2925

EP - 2928

BT - 2012 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2012 - Proceedings

T2 - 2012 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2012

Y2 - 25 March 2012 through 30 March 2012

ER -