Enhanced incremental LMS with norm constraints for distributed in-network estimation

Research output: Journal Publications and Reviews (RGC: 21, 22, 62) · Publication in refereed journal · peer-reviewed

30 Scopus Citations

Detail(s)

Original language: English
Pages (from-to): 373-385
Journal / Publication: Signal Processing
Volume: 94
Issue number: 1
Online published: 11 Jul 2013
Publication status: Published - Jan 2014

Abstract

This paper addresses the problem of distributed in-network estimation of a vector of interest that is sparse in nature. To exploit the underlying sparsity of this vector, the ℓ1 and ℓ0 norms are incorporated into the quadratic cost function of the standard distributed incremental least-mean-square (DILMS) algorithm, and corresponding sparse DILMS (Sp-DILMS) algorithms are proposed. The performance of the proposed Sp-DILMS algorithms is analyzed in the mean and mean-square sense. Mathematical analysis shows that Sp-DILMS outperforms DILMS if a suitable intensity of the zero-point attractor is selected. Since this intensity may be difficult to determine in practice, a new adaptive strategy is designed for its selection; its effectiveness is verified by both theoretical analysis and numerical simulations. Although the criterion for intensity selection is derived for the case of white Gaussian observations, simulation results show that it remains an empirically good choice when the regression vectors are correlated. © 2013 Published by Elsevier B.V. All rights reserved.
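
As a rough illustration of the idea (a sketch, not the exact recursion from the paper), an ℓ1-constrained incremental LMS adds a zero-attracting term ρ·sign(ψ) to each node's standard LMS correction as the estimate circulates around the network. A minimal Python sketch follows, with the step size mu, attractor intensity rho, and the per-node data format all assumed for illustration:

    import numpy as np

    def sparse_dilms_cycle(w, node_data, mu=0.01, rho=1e-4):
        # One incremental cycle of a zero-attracting (l1-penalized)
        # incremental-LMS sketch: the estimate visits each node in turn,
        # and every node applies its local LMS correction plus a
        # zero-point attractor of intensity rho.
        psi = w.copy()
        for x_k, d_k in node_data:           # (regressor, observation) at each node
            e_k = d_k - x_k @ psi            # local prediction error
            psi = psi + mu * e_k * x_k - rho * np.sign(psi)
        return psi                           # estimate passed on to the next cycle

With rho = 0 this reduces to a plain incremental LMS update; an ℓ0-based variant would replace the sign(·) attractor with an approximation of the ℓ0 norm's (sub)gradient.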

Research Area(s)

  • Distributed estimation, Incremental least-mean-square, Network, Norm constraint, Sparse