Open Access System for Information Sharing


Proportionate Diffusion LMS Algorithms for Sparse Distributed Estimation

Title
Proportionate Diffusion LMS Algorithms for Sparse Distributed Estimation
Authors
임성혁
Date Issued
2019
Publisher
Pohang University of Science and Technology (POSTECH)
Abstract
In this dissertation, we study how to improve the performance of the diffusion least mean square (LMS) algorithm, which is widely adopted for distributed estimation because of its simplicity and efficiency in estimating a common vector. The goal of distributed estimation is to estimate an unknown common vector from measurements that can be shared between neighboring sensor nodes. It is well known that when the vector to be estimated is sparse, the convergence speed of the LMS algorithm can be improved. In this regard, we propose a new diffusion LMS algorithm that uses adaptive gains in the adaptation stage, together with an adaptive gain control method. We first define the proportionate diffusion LMS algorithm, which assigns a different gain to each tap of the weight vector, and show how to update the gains optimally within the data-sharing process among neighboring nodes. To do so, we derive the componentwise expectation of the squared weight error and compute the optimal gain that minimizes it. Since this optimized gain multiplies the step size of each tap, the method resembles a variable step size, but it differs in that it is optimal when estimating a sparse vector. In addition, we provide a stability condition for the proposed algorithm in the mean sense. Simulation results show that the proposed optimal-gain proportionate diffusion LMS converges faster than existing sparsity-exploiting diffusion LMS algorithms, regardless of the sparsity of the vector to be estimated.

We also propose a simplified version that reduces the computational complexity of the optimal-gain algorithm: although the optimal-gain algorithm can be regarded as theoretically optimal, it may impose too heavy a computational load. The simplified algorithm requires about the same amount of computation as the conventional diffusion LMS. We again provide a stability condition in the mean sense and carry out a mean-square performance analysis. Simulation results show that, compared with conventional sparsity-constrained algorithms, the simplified algorithm converges fastest regardless of the sparsity of the vector, although it is slower than the optimal-gain method.
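
As a rough illustration of the adapt-then-combine diffusion LMS structure summarized above, the following Python sketch pairs a proportionate adaptation step with a combination step over neighboring nodes. The PNLMS-style gain rule, the function name, and the parameters mu and delta are illustrative assumptions for this sketch; they are not the optimal gains or the simplified gain rule derived in the dissertation.

import numpy as np

def proportionate_diffusion_lms(d, U, A, mu=0.01, delta=1e-3):
    """Adapt-then-combine diffusion LMS with tap-wise proportionate gains.

    d : (N, T)    desired signals, one row per node
    U : (N, T, M) regressor vectors, one (T, M) block per node
    A : (N, N)    combination weights; A[l, k] is the weight node k gives
                  to neighbor l, with each column summing to one
    """
    N, T, M = U.shape
    W = np.zeros((N, M))                      # current estimate at each node

    for i in range(T):
        Psi = np.empty((N, M))
        for k in range(N):                    # adaptation step at every node
            u = U[k, i]
            e = d[k, i] - u @ W[k]            # a priori estimation error
            # Illustrative PNLMS-style gains: larger taps adapt faster
            # (assumed here; not the optimal gains of the dissertation).
            g = np.maximum(np.abs(W[k]), delta)
            g = g / g.mean()                  # normalize so gains average to one
            Psi[k] = W[k] + mu * g * (e * u)  # proportionate LMS update
        W = A.T @ Psi                         # combination: mix neighbors' estimates
    return W

A uniform combination rule, e.g. A[l, k] = 1/|N_k| for every neighbor l of node k and zero elsewhere, is the simplest valid choice for A; with all gains fixed to one, the sketch reduces to the conventional diffusion LMS.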
URI
http://postech.dcollection.net/common/orgView/200000180099
https://oasis.postech.ac.kr/handle/2014.oak/111799
Article Type
Thesis
Files in This Item:
There are no files associated with this item.
