 
IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 45, NO. 2, MARCH 1999
Noise Conditions for Prespecified Convergence
Rates of Stochastic Approximation Algorithms
Edwin K. P. Chong, Senior Member, IEEE, I-Jeng Wang, Member, IEEE, and Sanjeev R. Kulkarni
Abstract—We develop deterministic necessary and sufficient conditions on individual noise sequences of a stochastic approximation algorithm for the error of the iterates to converge at a given rate. Specifically, suppose {δ_n} is a given positive sequence converging monotonically to zero. Consider a stochastic approximation algorithm x_{n+1} = x_n − a_n(A_n x_n − b_n) + a_n e_n, where {x_n} is the iterate sequence, {a_n} is the step size sequence, {e_n} is the noise sequence, and x* is the desired zero of the function f(x) = Ax − b. Then, under appropriate assumptions, we show that x_n − x* = o(δ_n) if and only if the sequence {e_n} satisfies one of five equivalent conditions. These conditions are based on well-known formulas for noise sequences: Kushner and Clark's condition, Chen's condition, Kulkarni and Horn's condition, a decomposition condition, and a weighted averaging condition. Our necessary and sufficient condition on {e_n} to achieve a convergence rate of {δ_n} is basically that the sequence {e_n/δ_n} satisfies any one of these five conditions.
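As an illustrative sketch (not taken from the paper), the recursion above can be simulated in the scalar case with a constant matrix A_n = A and b_n = b; the parameter values, step sizes a_n = 1/n, and i.i.d. Gaussian noise below are all assumptions chosen for the example.

```python
import numpy as np

# Hypothetical scalar instance of x_{n+1} = x_n - a_n (A x_n - b) + a_n e_n.
# With A = 2 and b = 4, the desired zero of f(x) = A x - b is x* = b / A = 2.
A, b = 2.0, 4.0
x_star = b / A

rng = np.random.default_rng(0)
x = 0.0
N = 100_000
for n in range(1, N + 1):
    a_n = 1.0 / n                 # classical step sizes: sum a_n = inf, sum a_n^2 < inf
    e_n = rng.normal(0.0, 1.0)    # i.i.d. zero-mean noise (one admissible choice)
    x = x - a_n * (A * x - b) + a_n * e_n

print(abs(x - x_star))            # error of the final iterate
```

Under these assumptions the iterates settle near x* = 2; the paper's results concern the sharper question of when the error x_n − x* is o(δ_n) for a prespecified rate sequence {δ_n}.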
