
Particle Swarm Optimization (PSO) is an optimization algorithm based on the idea that a collective is more than the sum of its parts. It relies on communication between particles to search for an optimal value of a function. The particles remember the best locations they have visited, both individually and as a group, and are drawn back toward those locations. This swarm behavior allows more of the search space to be explored, which decreases the chances of becoming caught in a local optimum. This is the major advantage of PSO over gradient descent and other convex optimization techniques. It also tends to outperform other algorithms designed to optimize functions with several optima, such as genetic algorithms.
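The basic mechanics can be summarized in a short sketch. The Python code below is a minimal PSO minimizer written for this discussion; the inertia and acceleration parameters (w, c1, c2), the bounds, the velocity limit, and the Rastrigin test function are illustrative choices, not taken from the text.

import numpy as np

def pso(cost, dim, n_particles=30, n_iter=200, w=0.7, c1=1.5, c2=1.5,
        bounds=(-5.0, 5.0), v_max=1.0, seed=0):
    # Minimal particle swarm minimizer for a scalar cost function on R^dim.
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, size=(n_particles, dim))          # positions
    v = rng.uniform(-v_max, v_max, size=(n_particles, dim))   # velocities
    pbest = x.copy()                                          # personal best positions
    pbest_val = np.array([cost(p) for p in x])                # personal best values
    g = pbest[np.argmin(pbest_val)].copy()                    # global best position
    g_val = float(pbest_val.min())

    for _ in range(n_iter):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        # Inertia plus pulls toward each particle's own best and the swarm's best.
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        v = np.clip(v, -v_max, v_max)
        x = np.clip(x + v, lo, hi)

        vals = np.array([cost(p) for p in x])
        improved = vals < pbest_val
        pbest[improved] = x[improved]
        pbest_val[improved] = vals[improved]
        if pbest_val.min() < g_val:
            g_val = float(pbest_val.min())
            g = pbest[np.argmin(pbest_val)].copy()
    return g, g_val

# Example: the Rastrigin function, which has many local optima.
def rastrigin(z):
    return 10 * z.size + float(np.sum(z * z - 10 * np.cos(2 * np.pi * z)))

best_x, best_f = pso(rastrigin, dim=5)
print(best_x, best_f)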

The problem with PSO is that, for functions with large numbers of local optima, the swarm can still become trapped. Euclidean Particle Swarm Optimization (EPSO) is designed to help overcome this limitation. It adds to the original PSO by keeping track of how many iterations have passed since the global best value was last updated. Once this count reaches a certain threshold, it is assumed that a local optimum has been found, and the particles are all flung away from the best location discovered so far. This is accomplished by adding to each particle's velocity the term

\left( \frac{2}{1 + e^{-2d}} - 1 \right) V_{max}

where Vmax is the maximum velocity and d is the Euclidean distance from the particle to the global best position. There is no way to determine whether the global best is a local or the global optimum, so this uncertainty is coded into the scale of the boost given to each particle. The scale depends on the Euclidean distance to the global best, which is a measure of how close the algorithm is to converging. Thus, in the early stages, when the particles are dispersed, a stagnant global best causes the particles to be pushed far away to encourage more searching. In the later stages, when the particles have all clustered together, the boost amounts to only a minor push and will not slow convergence in a significant way.
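Under this reading, the scale factor equals tanh(d), so the boost approaches Vmax when particles are far from the global best and vanishes as they cluster around it. The sketch below shows one way such a boost could be grafted onto the velocity update of the PSO sketch above; the stagnation threshold (stall_limit), the choice to direct the boost along the vector from the global best to each particle, and all names are assumptions made here for illustration.

import numpy as np

def epso_boost(x, v, g, v_max, stall_count, stall_limit=20):
    # Fling particles away from a stagnant global best, in the spirit of EPSO.
    # x: (n, dim) positions, v: (n, dim) velocities, g: (dim,) global best,
    # stall_count: iterations since the global best last improved.
    if stall_count < stall_limit:
        return v                               # global best still improving: no boost
    d = np.linalg.norm(x - g, axis=1, keepdims=True)          # distance to global best
    scale = (2.0 / (1.0 + np.exp(-2.0 * d)) - 1.0) * v_max    # ~0 when clustered, ~v_max when dispersed
    away = (x - g) / (d + 1e-12)               # unit vectors pointing away from the global best
    return np.clip(v + scale * away, -v_max, v_max)

A caller would be expected to increment stall_count in each iteration where the global best fails to improve, and reset it to zero both on improvement and after the boost has been applied.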

The added push of EPSO helps prevent the algorithm from becoming caught in local optima. Thus, it can be useful in many applications where the functions to be optimized are highly nonlinear. For example, an adaptive infinite impulse response (IIR) filter for an echo canceler can have many coefficients, which means its error surface is likely to have many local optima, and EPSO can more reliably find the globally optimal solution.
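To make the example concrete, one rough way to pose the problem is as a cost function over the IIR coefficients, which a PSO/EPSO routine like the sketches above could then minimize. The filter orders, the scipy-based filtering, and the synthetic signals below are illustrative assumptions, not a description of any particular echo canceler.

import numpy as np
from scipy.signal import lfilter

def iir_echo_cost(coeffs, x, d, nb=4, na=4):
    # Mean squared residual echo for a candidate IIR echo-path model.
    # coeffs packs the feedforward taps b (length nb) followed by the feedback
    # taps a[1:] (length na), with a[0] fixed at 1; x is the far-end signal and
    # d is the observed echo.
    b = coeffs[:nb]
    a = np.concatenate(([1.0], coeffs[nb:nb + na]))
    y = lfilter(b, a, x)                # candidate echo estimate
    e = d - y                           # residual echo
    if not np.all(np.isfinite(e)):      # unstable pole placement: penalize heavily
        return 1e12
    return float(np.mean(e * e))

# Synthetic check: far-end noise passed through a made-up "true" echo path.
rng = np.random.default_rng(0)
x = rng.standard_normal(4000)
d = lfilter([0.5, -0.3, 0.2, 0.1], [1.0, -0.4, 0.1, 0.05, 0.0], x)
print(iir_echo_cost(np.zeros(8), x, d))     # cost of an all-zero model
# best, mse = pso(lambda c: iir_echo_cost(c, x, d), dim=8)  # using the earlier PSO sketch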