Trading off complexity with communication costs in distributed adaptive learning via Krylov subspaces for dimensionality reduction

Symeon Chouvardas, Konstantinos Slavakis, Sergios Theodoridis

Research output: Contribution to journal › Article › peer-review

41 Scopus citations


In this paper, the problem of dimensionality reduction in adaptive distributed learning is studied. We consider a network obeying the ad hoc topology, in which the nodes sense data and cooperate with each other, by exchanging information, in order to estimate an unknown, common, parameter vector. The algorithm presented here follows the set-theoretic estimation rationale; i.e., at each time instant and at each node of the network, a closed convex set is constructed based on the received measurements, and this defines the region in which the solution is searched for. In this paper, these closed convex sets, known as property sets, take the form of hyperslabs. Moreover, in order to reduce the number of transmitted coefficients, which is dictated by the dimension of the unknown vector, we seek possible solutions in a subspace of lower dimension; the technique is developed around the Krylov subspace rationale. Our goal is to find a point that belongs to the intersection of this infinite number of hyperslabs and the respective Krylov subspaces. This is achieved via a sequence of projections onto the property sets and the Krylov subspaces. The case of highly correlated inputs, which degrades the performance of the algorithm, is also considered; this is overcome via a transformation which whitens the input. The proposed schemes are brought into a decentralized form by adopting the combine-adapt cooperation strategy among the nodes. A full convergence analysis is carried out, and numerical tests verify the validity of the proposed schemes in different scenarios in the context of the adaptive distributed system identification task.
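The two geometric building blocks named in the abstract, an orthonormal basis for a Krylov subspace and the metric projection onto a hyperslab, can be illustrated with a minimal NumPy sketch. This is an illustrative sketch only, not the paper's algorithm; the matrix `R`, regressor `u`, measurement `d`, and tolerance `eps` are generic placeholders, and the basis is built by plain Gram–Schmidt rather than any scheme the paper may use.

```python
import numpy as np

def krylov_basis(R, b, m):
    """Orthonormal basis of the order-m Krylov subspace
    span{b, R b, R^2 b, ..., R^(m-1) b}, via Gram-Schmidt."""
    n = b.shape[0]
    V = np.empty((n, m))
    v = b / np.linalg.norm(b)
    for j in range(m):
        V[:, j] = v
        v = R @ v                              # next Krylov direction
        v -= V[:, :j + 1] @ (V[:, :j + 1].T @ v)  # orthogonalize
        v /= np.linalg.norm(v)                 # normalize
    return V

def project_hyperslab(w, u, d, eps):
    """Metric (closest-point) projection of w onto the hyperslab
    {x : |u^T x - d| <= eps}."""
    r = u @ w - d
    if r > eps:
        return w - (r - eps) / (u @ u) * u
    if r < -eps:
        return w + (-r - eps) / (u @ u) * u
    return w  # w already lies inside the slab
```

Composing such hyperslab projections with the projection `V @ (V.T @ w)` onto the Krylov subspace gives the kind of alternating-projection iteration the abstract describes, in this reduced-dimension setting.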

Original language: English (US)
Article number: 6468063
Pages (from-to): 257-273
Number of pages: 17
Journal: IEEE Journal on Selected Topics in Signal Processing
Issue number: 2
State: Published - Apr 2013


  • Adaptive distributed learning
  • Diffusion
  • Krylov subspaces
  • Projections

