We present several modifications of blind-separation adaptive algorithms that offer significant advantages over the well-known Herault-Jutten learning algorithm in handling ill-conditioned signals. In particular, the proposed algorithms are more stable and converge to the correct solutions in cases where previous algorithms failed. The problem is the classical one in which n independent source signals s_j(t) (j = 1, 2, ..., n) are linearly combined via unknown mixing coefficients (parameters) a_ij to form the observations x_i(t) = Σ_{j=1}^{n} a_{ij} s_j(t), i = 1, 2, ..., n. The synaptic weights w_pi of a linear system (often referred to as a single-layer feedforward neural network) must be adapted so that the observations x_i(t) combine into optimal estimates of the source signals, ŝ_p(t) = y_p(t) = Σ_{i=1}^{n} w_{pi} x_i(t). The optimal weights correspond to statistical independence of the output signals y_p(t), and they simultaneously ensure self-normalization of these signals. Starting from the modified Herault-Jutten recurrent neural network model, we derive a family of on-line adaptive learning algorithms for feedback (fully recurrent) and feedforward architectures. The validity and high performance of the proposed neural networks are illustrated by simulation.
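To make the setting concrete, the following is a minimal sketch (not the paper's proposed modification) of the classical Herault-Jutten feedback network it builds on: two independent sources are mixed by an unknown matrix A, and a zero-diagonal feedback weight matrix W is adapted on-line by the rule Δw_ij = μ f(y_i) g(y_j), i ≠ j, here with the common choice f(y) = y³, g(y) = y. The sources, mixing matrix, and learning rate are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two independent sources (illustrative): a sine wave and uniform noise.
T = 5000
t = np.arange(T)
s = np.vstack([np.sin(0.05 * t), rng.uniform(-1, 1, T)])

# Unknown mixing: x_i(t) = sum_j a_ij s_j(t).
A = np.array([[1.0, 0.6],
              [0.7, 1.0]])
x = A @ s

# Herault-Jutten feedback (fully recurrent) network:
#   y(t) = x(t) - W y(t)  =>  y(t) = (I + W)^{-1} x(t),
# with W adapted on-line by  dw_ij = mu * f(y_i) g(y_j), i != j.
n = 2
W = np.zeros((n, n))
mu = 0.01
I = np.eye(n)

for k in range(T):
    y = np.linalg.solve(I + W, x[:, k])
    dW = mu * np.outer(y**3, y)       # f(y) = y^3, g(y) = y
    np.fill_diagonal(dW, 0.0)         # no self-connections
    W += dW

# Separated outputs for the whole record.
y_all = np.linalg.solve(I + W, x)
print(W)
```

At a stable fixed point E[f(y_i) g(y_j)] = 0 for i ≠ j, so the outputs y_p(t) become statistically independent estimates of the sources (up to scaling and permutation); the proposed modifications in the paper target cases where this basic rule is unstable or mis-converges.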