TY - JOUR
T1 - Modified Herault-Jutten Algorithms for Blind Separation of Sources
AU - Cichocki, Andrzej
AU - Bogner, Robert E.
AU - Moszczyński, Leszek
AU - Pope, Kenneth
PY - 1997/4
Y1 - 1997/4
N2 - We present several modifications of blind separation adaptive algorithms which have significant advantages over the well-known Herault-Jutten learning algorithm in handling ill-conditioned signals. In particular, the proposed algorithms are more stable and converge to the correct solutions in cases where previous algorithms did not. The problem is the classical one in which several independent source signals s_j(t) (j = 1, 2, ..., n) are linearly combined via unknown mixing coefficients (parameters) a_ij to form observations x_i(t) = Σ_{j=1}^{n} a_ij s_j(t), i = 1, 2, ..., n. The synaptic weights w_ij of a linear system (often referred to as a single-layer feedforward neural network) must be adapted to combine the observations x_i(t) to form optimal estimates of the source signals ŝ_p(t) = y_p(t) = Σ_{i=1}^{n} w_pi x_i(t). The optimal weights correspond to the statistical independence of the output signals y_p(t), and they simultaneously ensure self-normalization of these signals. Starting from the modified Herault-Jutten recursive neural network model, we have derived a family of on-line adaptive learning algorithms for feedback (fully recurrent) and feedforward architectures. The validity and high performance of the proposed neural network are illustrated by simulation.
UR - http://www.scopus.com/inward/record.url?scp=0031123458&partnerID=8YFLogxK
U2 - 10.1006/dspr.1997.0281
DO - 10.1006/dspr.1997.0281
M3 - Article
AN - SCOPUS:0031123458
SN - 1051-2004
VL - 7
SP - 80
EP - 93
JO - Digital Signal Processing: A Review Journal
JF - Digital Signal Processing: A Review Journal
IS - 2
ER -