PROJECT TITLE:
Decentralized RLS With Data-Adaptive Censoring for Regressions Over Large-Scale Networks - 2018
The deluge of networked data motivates the development of algorithms for computation- and communication-efficient information processing. In this context, three data-adaptive censoring strategies are introduced to considerably reduce the computation and communication overhead of decentralized recursive least-squares (D-RLS) solvers. The first relies on alternating minimization and the stochastic Newton iteration to minimize a network-wide cost that discards observations with small innovations. In the resulting algorithm, each node performs local data-adaptive censoring to reduce computations, while exchanging its local estimate with neighbors so as to reach consensus on the network-wide solution. The communication cost is further reduced by the second strategy, which prevents a node from transmitting its local estimate to its neighbors when the innovation it induces on incoming data is small. In the third strategy, not only transmitting but also receiving estimates from neighbors is prohibited when data-adaptive censoring is in effect. For all strategies, a simple criterion is provided for selecting the innovation threshold so as to achieve a prescribed average data reduction. The novel censoring-based (C)D-RLS algorithms are proven convergent to the optimal argument in the mean-square deviation sense. Numerical experiments validate the effectiveness of the proposed algorithms in reducing computation and communication overhead.
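The core idea of the first strategy (censoring observations with small innovations) can be sketched for a single node. The following is a minimal illustrative implementation, not the paper's exact algorithm: the threshold `tau`, forgetting factor `lam`, and initialization of `P` are assumed for the example, and the decentralized exchange of estimates between neighbors is omitted.

```python
import numpy as np

def censored_rls(X, y, tau=0.05, lam=0.99):
    """Run RLS over observations (X, y), censoring (skipping) any update
    whose innovation magnitude falls below the threshold tau."""
    n, d = X.shape
    w = np.zeros(d)        # current local estimate
    P = np.eye(d) * 1e3    # inverse-covariance-like matrix (large initial value)
    censored = 0           # count of discarded (censored) observations
    for k in range(n):
        x = X[k]
        e = y[k] - x @ w   # innovation of the new observation
        if abs(e) <= tau:  # small innovation: censor, saving the update cost
            censored += 1
            continue
        Px = P @ x
        g = Px / (lam + x @ Px)          # RLS gain vector
        w = w + g * e                    # estimate update
        P = (P - np.outer(g, Px)) / lam  # covariance update with forgetting
    return w, censored
```

As the estimate converges, innovations shrink toward the noise level, so a growing fraction of observations is censored and their updates are skipped; this is the source of the computation savings the abstract describes.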