
Acoustic echo cancellation with the NLMS algorithm

The first go-to algorithm for acoustic echo cancellation (AEC) is the gradient-descent least mean squares (LMS) algorithm. The LMS algorithm, however, suffers from slow convergence, making it poorly suited to real-time systems, which require fast convergence at low computational cost. An adaptation of the LMS algorithm, the normalized LMS (NLMS) algorithm, mitigates this challenge. Consider the LMS algorithm, which proceeds as:

Figure 1: Single line AEC architecture

\hat{h}[i]_{n+1} = \hat{h}[i]_{n}+ 2 \mu \left(r_k-\sum\limits_{m=1}^M \hat{h}[m]_{n}x[k-m+1] \right) x[k-i] , \forall i

where \hat{h}[i] are the filter taps being estimated, r_k= \sum\limits_{m=1}^M h[m]x[k-m+1] + v[k] is the reference (microphone) signal, v[k] is additive noise, and \mu is the descent parameter. The NLMS algorithm seeks to optimize the choice of the value of \mu used. The squared norm of the tap update, given as e[i]_{n} =\hat{h}[i]_{n+1} - \hat{h}[i]_{n}, is minimized subject to the constraint r_k= \sum\limits_{m=1}^M \hat{h}[m]_{n+1}x[k-m+1] being satisfied. A new cost function thus arises, given by
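As a concrete illustration of the LMS update above, the following minimal sketch identifies a simulated echo path by iterating the tap update over a white-noise excitation. The tap count M, step size \mu, noise level, and variable names are illustrative assumptions, not part of the original derivation.

```python
import numpy as np

rng = np.random.default_rng(0)

M = 8                        # number of filter taps (illustrative choice)
h = rng.standard_normal(M)   # simulated "true" echo path h[m]
mu = 0.01                    # fixed descent parameter mu (assumed value)

h_hat = np.zeros(M)                      # taps being estimated
x = rng.standard_normal(5000)            # far-end excitation x[k]
for k in range(M - 1, len(x)):
    # x_buf[m-1] corresponds to x[k-m+1] in the equations
    x_buf = x[k - M + 1:k + 1][::-1]
    # reference signal r_k = sum_m h[m] x[k-m+1] + v[k]
    r_k = h @ x_buf + 1e-3 * rng.standard_normal()
    e = r_k - h_hat @ x_buf              # a priori error term
    h_hat += 2.0 * mu * e * x_buf        # LMS tap update

print(np.max(np.abs(h - h_hat)))         # residual tap misalignment
```

With a fixed \mu this converges, but the convergence speed depends strongly on the input power, which is exactly the sensitivity the NLMS derivation below removes.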

J(\hat{{\bf h}}, \lambda) = \sum\limits_{m=1}^M (\hat{h}[m]_{n+1} - \hat{h}[m]_{n})^2 + \lambda \left(r_k -\sum\limits_{m=1}^M \hat{h}[m]_{n+1}x[k-m+1]\right)

Taking the partial derivative with respect to each tap and setting it to zero,

\frac{\partial{J(\hat{{\bf h}}, \lambda)}}{\partial{\hat{h}[i]_{n+1}}} = 2 (\hat{h}[i]_{n+1} - \hat{h}[i]_{n}) - \lambda x[k-i] = 0

which leads to

\hat{h}[i]_{n+1} = \hat{h}[i]_{n} +\frac{1}{2} \lambda x[k-i]

Further, the partial derivative with respect to the Lagrange multiplier is

\frac{\partial{J(\hat{{\bf h}}, \lambda)}}{\partial{\lambda}} = r_k -\sum\limits_{m=1}^M \left( \hat{h}[m]_{n} +\frac{1}{2} \lambda x[k-m+1] \right) x[k-m+1] = 0

Solving for \lambda gives

\lambda = 2\, \frac{r_k -\sum\limits_{m=1}^M \hat{h}[m]_{n} x[k-m+1]}{\sum\limits_{m=1}^M x[k-m+1]^2}

Thus the complete update proceeds as:

\hat{h}[i]_{n+1} = \hat{h}[i]_{n} + \frac{1}{\sum\limits_{m=1}^M x[k-m+1]^2 } \left(r_k -\sum\limits_{m=1}^M \hat{h}[m]_{n} x[k-m+1]\right) x[k-i]

The NLMS update equation resembles the original LMS update with 2\mu = \left(\sum\limits_{m=1}^M x[k-m+1]^2\right)^{-1}. In most cases, a small constant step size \hat{\mu} is also included for a more graceful descent of the algorithm and to satisfy the Wolfe conditions. A sample result for this algorithm is shown in Figure 2 below.
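The complete NLMS update, including the small constant \hat{\mu}, can be sketched as follows. The regularization term delta in the denominator is an added assumption, a common guard against division by zero when the input buffer momentarily has near-zero energy; the tap count and signal lengths are likewise illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

M = 8                        # number of filter taps (illustrative choice)
h = rng.standard_normal(M)   # simulated "true" echo path h[m]
mu_hat = 0.5                 # small constant step size mu-hat (assumed value)
delta = 1e-6                 # regularization guard, not in the derivation above

h_hat = np.zeros(M)                      # taps being estimated
x = rng.standard_normal(2000)            # far-end excitation x[k]
for k in range(M - 1, len(x)):
    # x_buf[m-1] corresponds to x[k-m+1] in the equations
    x_buf = x[k - M + 1:k + 1][::-1]
    # reference signal r_k = sum_m h[m] x[k-m+1] + v[k]
    r_k = h @ x_buf + 1e-3 * rng.standard_normal()
    e = r_k - h_hat @ x_buf              # a priori error term
    # normalized update: step divided by the input buffer energy
    h_hat += mu_hat * e * x_buf / (x_buf @ x_buf + delta)

print(np.max(np.abs(h - h_hat)))         # residual tap misalignment
```

Because each step is normalized by the instantaneous input energy, convergence here is largely independent of the excitation level, unlike the fixed-\mu LMS update.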


Figure 2: Performance of NLMS AEC

VOCAL Technologies offers custom designed solutions for beamforming with a robust voice activity detector, acoustic echo cancellation and noise suppression. Our custom implementations of such systems are meant to deliver optimum performance for your specific beamforming task. Contact us today to discuss your solution!


More Information

VOCAL Technologies, Ltd.
520 Lee Entrance, Suite 202
Amherst, New York 14228
Phone: +1-716-688-4675
Fax: +1-716-639-0713
Email: sales@vocal.com