## Generalized Gradient Adaptive Step Sizes For Stochastic Gradient Adaptive Filters (1995)

Venue: IEEE International Conf. Acoust., Speech, Signal Processing

Citations: 2 (2 self)

### BibTeX

```bibtex
@INPROCEEDINGS{Douglas95generalizedgradient,
  author    = {S. C. Douglas},
  title     = {Generalized Gradient Adaptive Step Sizes for Stochastic Gradient Adaptive Filters},
  booktitle = {IEEE International Conf. Acoust., Speech, Signal Processing},
  year      = {1995},
  pages     = {1396--1399}
}
```

### Abstract

In this paper, we derive new adaptive step size algorithms for two general classes of modified stochastic gradient adaptive filters that include the sign-error, sign-data, sign-sign, and normalized gradient adaptive filters as specific cases. These computationally simple parameter adjustment algorithms are based on stochastic gradient approximations of steepest descent procedures for the unknown parameters. Analyses of the algorithms show that the stationary points of the steepest descent procedures yield the optimum step size values at each time instant as obtained from statistical analyses of the adaptive filter updates. Simulations verify the theoretical results and indicate that near-optimal tracking performance can be obtained from each of the adaptive step size algorithms without any knowledge of the rate of change of the unknown system.

1. INTRODUCTION: Least-mean-square (LMS) adaptive finite-impulse-response (FIR) filters have proven to be extremely useful in a number of signal...
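As a rough illustration of the kind of algorithm the abstract describes, the sketch below pairs a sign-error LMS update with a step size adapted by correlating successive stochastic gradients. The specific update rule, variable names, and constants are illustrative assumptions for demonstration, not the paper's actual derivation.

```python
import numpy as np

# Illustrative sketch only: a sign-error LMS filter whose step size is
# adapted over time. The update rule and constants below are assumptions,
# not the algorithm derived in the paper.

rng = np.random.default_rng(0)

N = 8                            # filter length
w_opt = rng.standard_normal(N)   # unknown system to identify
w = np.zeros(N)                  # adaptive filter coefficients
mu = 0.01                        # step size (adapted over time)
rho = 1e-4                       # learning rate for the step size
g_prev = np.zeros(N)             # previous stochastic gradient

for k in range(20000):
    x = rng.standard_normal(N)                    # input regressor
    d = w_opt @ x + 0.01 * rng.standard_normal()  # noisy desired response
    e = d - w @ x                                 # a priori output error
    g = np.sign(e) * x                            # sign-error stochastic gradient
    # Gradient-adaptive step size: grow mu while successive gradients
    # point the same way (far from convergence), shrink it otherwise.
    mu = float(np.clip(mu + rho * (g @ g_prev), 1e-5, 0.05))
    w = w + mu * g                                # sign-error LMS update
    g_prev = g

mse = float(np.mean((w - w_opt) ** 2))
print(f"coefficient MSE after adaptation: {mse:.4f}")
```

The appeal of such schemes, as the abstract notes, is that the step size grows automatically during fast tracking and shrinks near convergence without prior knowledge of how quickly the unknown system changes.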

### Citations

30 |
A stochastic gradient adaptive filter with gradient adaptive step size
- Mathews, Xie
- 1993
Citation Context: ...to the behavior of recursive least-squares adaptive filters. Consequently, several computationally simple methods for improving the convergence properties of the LMS adaptive filter have been proposed [1, 2, 3, 4]. In general, these methods specify a procedure for adjusting the algorithm step size to obtain fast convergence when the error in the adaptive filter coefficients is large and to obtain a small mean-...

19 | A family of normalized LMS algorithms
- Douglas
- 1994
Citation Context: ...including the sign-error, sign-data, sign-sign, and other quantized state adaptive algorithms [6]; the normalized LMS (NLMS), sign-data NLMS, sign-error NLMS, and other normalized adaptive algorithms [9, 10, 11, 12]; and Newton-type algorithms. For the algorithm families in (1) and (2), we derive simple data-adaptive procedures for adjusting the step size parameters µ_k and β_k, respectively, to achieve...

17 |
A Variable Step (VS) Adaptive Filter Algorithm
- Harris, Chabries, et al.
- 1986
Citation Context: ...to the behavior of recursive least-squares adaptive filters. Consequently, several computationally simple methods for improving the convergence properties of the LMS adaptive filter have been proposed [1, 2, 3, 4]. In general, these methods specify a procedure for adjusting the algorithm step size to obtain fast convergence when the error in the adaptive filter coefficients is large and to obtain a small mean-...

16 |
Normalized data nonlinearities for LMS adaptation
- Douglas, Meng
- 1994

15 |
Adaptive filter performance with nonlinearities in the correlation multiplier
- Duttweiler
- 1982
Citation Context: ...forward neural networks, where it has been termed the "delta-delta rule" [5]. In some situations, it is desirable to modify the LMS algorithm update, either to simplify its implementation in hardware [6, 7] or to improve its robustness and performance in the presence of non-Gaussian or time-varying signal statistics [8, 9, 10, 11]. Adaptive step size procedures for these modified algorithms remain large...

12 |
Stochastic gradient adaptation under general error criteria
- Douglas, Meng
- 1994
Citation Context: ...odify the LMS algorithm update, either to simplify its implementation in hardware [6, 7] or to improve its robustness and performance in the presence of non-Gaussian or time-varying signal statistics [8, 9, 10, 11]. Adaptive step size procedures for these modified algorithms remain largely unexplored. In this paper, we present new adaptive step size procedures for the two families of modified stochastic gradien...

8 |
Tracking analysis of sign algorithm in nonstationary environments
- Cho, Mathews
- 1990
Citation Context: ...forward neural networks, where it has been termed the "delta-delta rule" [5]. In some situations, it is desirable to modify the LMS algorithm update, either to simplify its implementation in hardware [6, 7] or to improve its robustness and performance in the presence of non-Gaussian or time-varying signal statistics [8, 9, 10, 11]. Adaptive step size procedures for these modified algorithms remain large...

6 |
Goal seeking components for adaptive intelligence: An initial assessment
- Barto, Sutton
- 1981
Citation Context: ...y, the algorithm was first developed to improve the convergence properties of the backpropagation algorithm for multilayer feedforward neural networks, where it has been termed the "delta-delta rule" [5]. In some situations, it is desirable to modify the LMS algorithm update, either to simplify its implementation in hardware [6, 7] or to improve its robustness and performance in the presence of non-G...

4 |
Adaptive filtering for non-Gaussian stable process
- Arikan, AE, et al.
Citation Context: ...odify the LMS algorithm update, either to simplify its implementation in hardware [6, 7] or to improve its robustness and performance in the presence of non-Gaussian or time-varying signal statistics [8, 9, 10, 11]. Adaptive step size procedures for these modified algorithms remain largely unexplored. In this paper, we present new adaptive step size procedures for the two families of modified stochastic gradien...

3 |
A study on the fast convergence algorithm for the LMS adaptive filter design
- Shin, Lee
- 1985
Citation Context: ...to the behavior of recursive least-squares adaptive filters. Consequently, several computationally simple methods for improving the convergence properties of the LMS adaptive filter have been proposed [1, 2, 3, 4]. In general, these methods specify a procedure for adjusting the algorithm step size to obtain fast convergence when the error in the adaptive filter coefficients is large and to obtain a small mean-...

2 |
Stochastic gradient algorithms with a gradient-adaptive and limited step-size
- Sugiyama
- 1994

1 |
Adaptive cancellation of geomagnetic background noise using a sign-error normalized LMS algorithm
- Freire, Douglas
- 1993
Citation Context: ...odify the LMS algorithm update, either to simplify its implementation in hardware [6, 7] or to improve its robustness and performance in the presence of non-Gaussian or time-varying signal statistics [8, 9, 10, 11]. Adaptive step size procedures for these modified algorithms remain largely unexplored. In this paper, we present new adaptive step size procedures for the two families of modified stochastic gradien...

1 |
The optimum scalar data nonlinearity in LMS adaptation for arbitrary i.i.d. inputs
- Douglas, Meng
- 1992
Citation Context: ...ptimality properties, if any, do these algorithms possess? To answer this question, we first determine the optimum step size sequences for these algorithms using the statistical analyses presented in [8, 9, 13]. We also assume that the desired response signal is generated from a nonstationary model such that d_k = W_{opt,k}^T X_k + n_k (13) and W_{opt,k+1} = W_{opt,k} + M_k (14), where n_k is an i.i.d. noise sequence with...