In this paper, we consider the problem of solving large-scale systems of linear equations over a network of interconnected computing nodes. We assume the communication links among the nodes switch randomly over time and model the intermittent communication with a Bernoulli process. We propose an iterative algorithm to solve the problem, based on a distributed optimization framework and the idea of using the last received information at each node. We provide a convergence analysis of the algorithm and derive computational as well as analytical convergence conditions based on an intrinsic system decomposition and the application of singular perturbation and stochastic control theory. A numerical example verifies that the algorithm is robust to both stochastic link switches and additive uncertainties.
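To illustrate the setting, the following is a minimal sketch of one common projection-consensus scheme for distributed solution of Ax = b over an unreliable network. It is not necessarily the authors' algorithm: the system matrix, link-activation probability, and update rule are illustrative assumptions. Each node knows one row of (A, b), links fail independently per step according to a Bernoulli process, and a node reuses the last value it received from each neighbor, as the abstract describes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 3x3 system Ax = b; node i knows only row i of (A, b).
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 5.0]])
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true

n = A.shape[0]
p_link = 0.7                 # assumed Bernoulli link-activation probability
X = np.zeros((n, n))         # row i = node i's current estimate of x
last = np.zeros((n, n, n))   # last[i, j] = last value node i received from node j

for _ in range(2000):
    # Each directed link is active independently with probability p_link
    # (complete graph assumed for simplicity).
    active = rng.random((n, n)) < p_link
    for i in range(n):
        for j in range(n):
            if i != j and active[i, j]:
                last[i, j] = X[j]   # fresh value received over an active link
        # Average own estimate with the last received neighbor values
        # (stale values are reused when a link is down).
        avg = (X[i] + sum(last[i, j] for j in range(n) if j != i)) / n
        # Project the average onto node i's constraint set {x : a_i^T x = b_i}
        a_i = A[i]
        X[i] = avg - a_i * ((a_i @ avg - b[i]) / (a_i @ a_i))

# All local estimates drift toward the common solution x_true despite
# the random link failures.
```

The key mechanism matching the abstract is the `last` buffer: updates never stall on a dropped link, they simply proceed with the most recent information available at each node.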
Original language: English (US)
Title of host publication: 2016 American Control Conference, ACC 2016
Publisher: Institute of Electrical and Electronics Engineers Inc.
Number of pages: 6
State: Published - Jul 28 2016
Event: 2016 American Control Conference, ACC 2016 - Boston, United States
Duration: Jul 6 2016 → Jul 8 2016
Series: Proceedings of the American Control Conference
Bibliographical note - Funding Information:
This work was supported under NSF grant CNS-1239319.
© 2016 American Automatic Control Council (AACC).