## Blind Source Separation: High Order Statistics Approach

High-order statistics are extensions of second-order measures such as the autocorrelation and the power spectrum. Second-order measures work well when the signals are Gaussian, but many real-life signals are non-Gaussian, and high-order measures can then be useful.

### Skewness: the third moment

For univariate data $Y_1$, $Y_2$, …, $Y_N$, the skewness is defined as $Skewness=\frac{\sum_{i=1}^{N}\left(Y_i-\bar{Y}\right)^3}{\left(N-1\right)\sigma^3},$

where $\bar{Y}$ and $\sigma$ are the sample mean and standard deviation. The skewness of any symmetric distribution is zero; in particular, the skewness of any Gaussian signal is zero.
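As a sketch of the definition above, the following computes the sample skewness with the $(N-1)\sigma^3$ normalization from the text; the test distributions (standard normal and exponential) are illustrative choices, not from the original:

```python
import numpy as np

def skewness(y):
    """Sample skewness as in the text: third central moment over (N-1) * sigma^3."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    mu = y.mean()
    sigma = y.std(ddof=1)  # sample standard deviation
    return np.sum((y - mu) ** 3) / ((n - 1) * sigma ** 3)

rng = np.random.default_rng(0)
gaussian = rng.standard_normal(100_000)      # symmetric -> skewness near 0
exponential = rng.exponential(size=100_000)  # right-skewed -> skewness near 2
print(skewness(gaussian))
print(skewness(exponential))
```

For the symmetric Gaussian sample the estimate hovers around zero, while the exponential sample gives a clearly positive value.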

### Kurtosis: the fourth moment

For univariate data $Y_1$, $Y_2$, …, $Y_N$, the kurtosis is defined as $Kurtosis=\frac{\sum_{i=1}^{N}\left(Y_i-\bar{Y}\right)^4}{\left(N-1\right)\sigma^4}.$

The kurtosis of the standard normal distribution is 3. A signal with kurtosis larger than 3 is called super-Gaussian (leptokurtic); a signal with kurtosis smaller than 3 is called sub-Gaussian (platykurtic).
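A minimal sketch of the kurtosis formula, checked against a normal sample (kurtosis near 3), a Laplace sample (super-Gaussian), and a uniform sample (sub-Gaussian); the test distributions are my own illustrative choices:

```python
import numpy as np

def kurtosis(y):
    """Sample kurtosis as in the text: fourth central moment over (N-1) * sigma^4."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    mu = y.mean()
    sigma = y.std(ddof=1)  # sample standard deviation
    return np.sum((y - mu) ** 4) / ((n - 1) * sigma ** 4)

rng = np.random.default_rng(0)
normal = rng.standard_normal(100_000)     # kurtosis near 3
laplace = rng.laplace(size=100_000)       # super-Gaussian: kurtosis > 3 (near 6)
uniform = rng.uniform(-1, 1, 100_000)     # sub-Gaussian: kurtosis < 3 (near 1.8)
print(kurtosis(normal), kurtosis(laplace), kurtosis(uniform))
```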

For a mixing system, $y(t)=Hx(t)+N(t),$

where $H$ is the linear mixing matrix and $N(t)$ is the per-channel noise term, we want to exploit the statistical properties of the sources through the use of order statistics. Order statistics can serve as non-parametric estimators of the quantile function, which measures the distance from the standard Gaussian distribution.
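To make the model concrete, here is a small simulation of the mixing system $y(t)=Hx(t)+N(t)$; the two sources, the $2\times2$ matrix $H$, and the noise level are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
T = 10_000
# Two hypothetical non-Gaussian sources x(t)
x = np.vstack([np.sign(rng.standard_normal(T)),  # binary source (sub-Gaussian)
               rng.laplace(size=T)])             # Laplace source (super-Gaussian)
H = np.array([[0.8, 0.3],
              [0.4, 0.7]])                       # assumed linear mixing matrix
noise = 0.01 * rng.standard_normal((2, T))       # small additive channel noise N(t)
y = H @ x + noise                                # observed mixtures y(t) = Hx(t) + N(t)
print(y.shape)
```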

### The Measure of Gaussianity

To measure the distance between a zero mean random variable $X$ and the standard Gaussian distribution, we can use $D\left(X\right)=\int_{0}^{1}{\left[Q_X\left(u\right)-Q_N\left(u\right)\right]^2du,}$

where $Q_X\left(u\right)$ is the quantile function of $X$ and $Q_N\left(u\right)$ is the quantile function of the standard Gaussian distribution. $X$ is Gaussian if and only if $D(X) = 0$. This measure has been used in many applications to determine Gaussianity.

For implementation, the following approximation over a grid of $T$ probabilities can be used instead: $D\left(X\right)\approx\frac{1}{T+1}\sum_{r=1}^{T}\left[Q_X\left(\frac{r}{T+1}\right)-Q_N\left(\frac{r}{T+1}\right)\right]^2,$ where $Q_X$ is estimated from the empirical quantiles of the data.
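A sketch of this approximation, assuming $Q_X$ is estimated by empirical quantiles and that the data are first standardized to zero mean and unit variance (the standardization step and grid size $T=200$ are my assumptions, not from the text):

```python
import numpy as np
from statistics import NormalDist

def gaussianity_distance(x, T=200):
    """Approximate D(X): squared distance between the empirical quantiles of the
    standardized data and standard-Gaussian quantiles on the grid u_r = r/(T+1)."""
    x = np.asarray(x, dtype=float)
    x = (x - x.mean()) / x.std()                         # zero mean, unit variance
    u = np.arange(1, T + 1) / (T + 1)                    # probabilities r/(T+1)
    qx = np.quantile(x, u)                               # empirical Q_X
    qn = np.array([NormalDist().inv_cdf(p) for p in u])  # standard Gaussian Q_N
    return np.sum((qx - qn) ** 2) / (T + 1)

rng = np.random.default_rng(0)
print(gaussianity_distance(rng.standard_normal(100_000)))  # near 0 for Gaussian data
print(gaussianity_distance(rng.laplace(size=100_000)))     # clearly positive
```

A Gaussian sample gives a value near zero, while a Laplace sample gives a distinctly larger distance.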

### Extracting one source among $N$ signals

To estimate a source $s$ from the $N$ mixtures, it is necessary to identify $N$ coefficients $m_i$, $s\left(t\right)=\sum_{i=1}^{N}m_iy_i\left(t\right)$,

to approximate one of the $N$ sources.

The goal is to make $s(t)$ as far from Gaussian as possible. This is achieved by minimizing $L\left(s\right)=-D(s),$

where $D(s)$ is given above.
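The whole procedure can be sketched end to end: mix two sources, then search over unit-norm weight vectors $m=(\cos t,\sin t)$ for the combination that maximizes the Gaussianity distance $D(s)$ (equivalently, minimizes $L(s)=-D(s)$). The sources, mixing matrix, grid search, and sample sizes are all illustrative assumptions:

```python
import numpy as np
from statistics import NormalDist

def gaussianity_distance(x, T=200):
    """Approximation of D from the measure above (empirical vs Gaussian quantiles)."""
    x = (np.asarray(x, dtype=float) - np.mean(x)) / np.std(x)
    u = np.arange(1, T + 1) / (T + 1)
    qn = np.array([NormalDist().inv_cdf(p) for p in u])
    return np.sum((np.quantile(x, u) - qn) ** 2) / (T + 1)

rng = np.random.default_rng(2)
n = 20_000
sources = np.vstack([rng.laplace(size=n),      # super-Gaussian source to recover
                     rng.standard_normal(n)])  # Gaussian background signal
H = np.array([[0.9, 0.4],
              [0.3, 0.8]])                     # assumed mixing matrix
y = H @ sources                                # observed mixtures

# Grid search over unit-norm weights m = (cos t, sin t): pick the combination
# s(t) = m1*y1 + m2*y2 that is farthest from Gaussian.
angles = np.linspace(0.0, np.pi, 181)
best = max(angles,
           key=lambda t: gaussianity_distance(np.cos(t) * y[0] + np.sin(t) * y[1]))
s = np.cos(best) * y[0] + np.sin(best) * y[1]
corr = abs(np.corrcoef(s, sources[0])[0, 1])   # agreement with the Laplace source
print(corr)
```

The recovered combination correlates strongly with the non-Gaussian (Laplace) source, since any admixture of the Gaussian background pulls $s$ toward Gaussianity and lowers $D(s)$.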

The above approach is credited to Donoho for his work on denoising.