TY - JOUR
T1 - Decentralized RLS with data-adaptive censoring for regressions over large-scale networks
AU - Wang, Zifeng
AU - Yu, Zheng
AU - Ling, Qing
AU - Berberidis, Dimitris
AU - Giannakis, Georgios B.
N1 - Publisher Copyright:
© 1991-2012 IEEE.
PY - 2018/3/15
Y1 - 2018/3/15
N2 - The deluge of networked data motivates the development of algorithms for computation- and communication-efficient information processing. In this context, three data-adaptive censoring strategies are introduced to considerably reduce the computation and communication overhead of decentralized recursive least-squares solvers. The first relies on alternating minimization and the stochastic Newton iteration to minimize a network-wide cost, which discards observations with small innovations. In the resultant algorithm, each node performs local data-adaptive censoring to reduce computations while exchanging its local estimate with neighbors so as to consent on a network-wide solution. The communication cost is further reduced by the second strategy, which prevents a node from transmitting its local estimate to neighbors when the innovation it induces to incoming data is minimal. In the third strategy, not only transmitting, but also receiving estimates from neighbors is prohibited when data-adaptive censoring is in effect. For all strategies, a simple criterion is provided for selecting the threshold of innovation to reach a prescribed average data reduction. The novel censoring-based (C)D-RLS algorithms are proved convergent to the optimal argument in the mean-square deviation sense. Numerical experiments validate the effectiveness of the proposed algorithms in reducing computation and communication overhead.
AB - The deluge of networked data motivates the development of algorithms for computation- and communication-efficient information processing. In this context, three data-adaptive censoring strategies are introduced to considerably reduce the computation and communication overhead of decentralized recursive least-squares solvers. The first relies on alternating minimization and the stochastic Newton iteration to minimize a network-wide cost, which discards observations with small innovations. In the resultant algorithm, each node performs local data-adaptive censoring to reduce computations while exchanging its local estimate with neighbors so as to consent on a network-wide solution. The communication cost is further reduced by the second strategy, which prevents a node from transmitting its local estimate to neighbors when the innovation it induces to incoming data is minimal. In the third strategy, not only transmitting, but also receiving estimates from neighbors is prohibited when data-adaptive censoring is in effect. For all strategies, a simple criterion is provided for selecting the threshold of innovation to reach a prescribed average data reduction. The novel censoring-based (C)D-RLS algorithms are proved convergent to the optimal argument in the mean-square deviation sense. Numerical experiments validate the effectiveness of the proposed algorithms in reducing computation and communication overhead.
KW - Decentralized estimation
KW - data-adaptive censoring
KW - networks
KW - recursive least-squares (RLS)
UR - http://www.scopus.com/inward/record.url?scp=85040913339&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85040913339&partnerID=8YFLogxK
U2 - 10.1109/TSP.2018.2795594
DO - 10.1109/TSP.2018.2795594
M3 - Article
AN - SCOPUS:85040913339
SN - 1053-587X
VL - 66
SP - 1634
EP - 1648
JO - IEEE Transactions on Signal Processing
JF - IEEE Transactions on Signal Processing
IS - 6
M1 - 8264782
ER -