Research on Source-Adaptive Independent Component Analysis Algorithms and Their Applications
Abstract
Independent component analysis (ICA) is a branch of signal processing that emerged in the 1990s and is one of the most effective approaches to the blind source separation problem. ICA is widely applied not only in speech and image processing but also to electrocardiograms (ECG), electromyograms (EMG), electroencephalograms (EEG), and magnetoencephalograms (MEG). This dissertation studies ICA algorithms and their applications, proposes several source-adaptive ICA algorithms, and applies them to speech and image denoising. The main results are as follows.
     1. After analyzing the strengths and weaknesses of existing ICA algorithms, we propose solving the ICA problem by finding the roots of its gradient equation. To this end, a Newton-type iteration is proposed that needs no learning rate: the algorithm is structurally simple, and each iterate is obtained by solving a single matrix equation. To make the algorithm adaptive to the source signals, nonparametric methods are used to estimate their statistical characteristics, including the probability density functions and their first and second derivatives.
     2. To overcome the heavy computational cost of standard kernel density estimation on large samples, a modified kernel density method is proposed for adaptively estimating the source densities and their first and second derivatives. Using the source histograms as a bridge, it estimates the kernel parameters directly, which greatly speeds up the algorithm when the sample size is large.
     3. In general, the parameter space of the demixing matrix in ICA is not Euclidean but Riemannian, so the direction given by the ordinary gradient is not the steepest one. To obtain the steepest direction in the Riemannian space, we start from the natural-gradient (or relative-gradient) viewpoint and apply the invariance criterion of Lie groups to derive two forms of the natural (or relative) gradient. On this basis we obtain estimating equations; solving them, combined with different adaptive density estimators, yields adaptive ICA algorithms. Because the steepest direction is obtained via the natural gradient, the algorithms have the superefficiency property and can attain Fisher efficiency.
     4. An adaptive fixed-point ICA algorithm is proposed. A whitening preprocessing step constrains the demixing matrix to be orthogonal; under this constraint, the demixing matrix learned at each step is re-orthogonalized so that the constraint remains satisfied. The algorithm has low computational complexity and fast convergence.
     5. ICA is applied to speech and image denoising. An ICA algorithm yields the independent components of the noisy signal, a shrinkage algorithm removes noise in the independent-component domain, and the inverse transform produces the denoised sound or image.
Independent component analysis (ICA) is a direction in signal processing that emerged in the 1990s, and it is one of the most efficient methods for solving blind source separation (BSS) problems. ICA is not only widely applied to sound and image processing; it has also proved successful in fields such as electrocardiographic (ECG), electromyographic (EMG), electroencephalographic (EEG), and magnetoencephalographic (MEG) signal processing. This dissertation is devoted to the study of ICA algorithms and their applications. Several adaptive ICA algorithms are proposed, and ICA is applied to denoising sound signals and image data. The main contributions of the dissertation are as follows.
     1. After analyzing the advantages and disadvantages of traditional ICA algorithms, we propose an ICA algorithm based on solving the gradient equation. To solve it, an iterative method built on Newton's method is proposed in which no learning rate is needed. The method is simple because each iterate is obtained by solving a linear matrix equation, which makes the algorithm easy to use. A nonparametric density method estimates the probability density functions of the sources as well as their first and second derivatives, making the algorithm adaptive to the source distributions.
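The dissertation's actual gradient equation is not reproduced in this abstract, but the Newton-type iteration it describes (no learning rate; one linear matrix-equation solve per step) can be sketched generically as a root-finder. The two-variable system below is purely hypothetical, chosen only to have a known root:

```python
import numpy as np

def newton_root(g, jac, w0, tol=1e-10, max_iter=50):
    """Newton iteration for g(w) = 0: no learning rate is needed,
    and each step solves exactly one linear (matrix) equation."""
    w = np.asarray(w0, dtype=float)
    for _ in range(max_iter):
        step = np.linalg.solve(jac(w), -g(w))  # J(w) dw = -g(w)
        w = w + step
        if np.linalg.norm(step) < tol:
            break
    return w

# Hypothetical 2-D system with a root at (1, 2).
g = lambda w: np.array([w[0] ** 2 + w[1] - 3.0, w[0] + w[1] ** 2 - 5.0])
jac = lambda w: np.array([[2.0 * w[0], 1.0], [1.0, 2.0 * w[1]]])
root = newton_root(g, jac, [1.5, 1.5])  # quadratic convergence to (1, 2)
```

Because the step is determined entirely by the linear solve, there is no step-size parameter to tune, which is the property the abstract highlights.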
     2. A modified kernel density method is proposed to overcome the high computational cost of the standard kernel density method, especially when the data size is very large. It uses the histograms of the sources to estimate the parameters of the kernel function directly, so it runs considerably faster than the standard method on large data sets.
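A minimal sketch of the histogram-accelerated idea, assuming a Gaussian kernel and equal-width bins (the dissertation's exact parameter-estimation scheme is not given in the abstract): binning the samples first makes the cost scale with the number of bins rather than the number of samples.

```python
import numpy as np

def binned_kde(samples, grid, bandwidth, n_bins=256):
    """Histogram-accelerated Gaussian KDE: O(n_bins * len(grid))
    kernel evaluations instead of O(len(samples) * len(grid))."""
    counts, edges = np.histogram(samples, bins=n_bins)
    centers = 0.5 * (edges[:-1] + edges[1:])
    # One kernel per occupied bin, weighted by the bin count.
    diff = (grid[:, None] - centers[None, :]) / bandwidth
    kernels = np.exp(-0.5 * diff ** 2) / (np.sqrt(2.0 * np.pi) * bandwidth)
    return kernels @ counts / counts.sum()

rng = np.random.default_rng(0)
x = rng.standard_normal(50_000)
grid = np.linspace(-4.0, 4.0, 101)
h = 1.06 * x.std() * len(x) ** (-0.2)  # Silverman's rule-of-thumb bandwidth
pdf_hat = binned_kde(x, grid, h)       # close to the N(0, 1) density
```

The binning error is negligible as long as the bin width stays well below the bandwidth, which is the regime where the speed-up matters.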
     3. In general, the demixing matrix in ICA lies not in the standard Euclidean space but in a Riemannian space, so the traditional gradient direction is not the steepest one. To obtain the steepest direction in the Riemannian space, we start from the viewpoint of the natural gradient (or relative gradient) and derive two forms of the natural gradient by means of the invariance rule of Lie groups. Two forms of estimating equation are proposed based on these gradients. A source-adaptive ICA algorithm is obtained by solving the estimating equation iteratively while the probability density functions are estimated adaptively. Because the steepest direction comes from the natural gradient, the algorithm has the superefficiency property and can reach Fisher efficiency.
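The dissertation's estimating equations are not given in the abstract, but the natural-gradient update they build on is the standard Amari form dW = eta (I - E[phi(y) y^T]) W. A toy batch implementation with a fixed tanh score (a common choice for super-Gaussian sources; the dissertation instead estimates the score adaptively):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20_000
S = rng.laplace(size=(2, n))             # two super-Gaussian sources
A = np.array([[1.0, 0.6], [0.4, 1.0]])   # unknown mixing matrix
X = A @ S                                # observed mixtures

W = np.eye(2)                            # demixing-matrix estimate
eta = 0.1
for _ in range(200):
    Y = W @ X
    phi = np.tanh(Y)                     # fixed score; suits super-Gaussian sources
    # Natural-gradient update: dW = eta * (I - E[phi(y) y^T]) W.
    # Right-multiplying by W is what turns the ordinary gradient into
    # the steepest direction on the matrix (Lie) group.
    W += eta * (np.eye(2) - (phi @ Y.T) / n) @ W

Y = W @ X                                # recovered sources (up to scale/permutation)
```

Each recovered row should correlate almost perfectly with one source, in some order and up to sign, which is the usual ICA indeterminacy.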
     4. After a whitening pre-processing step, the demixing matrix must be orthogonal. Under this constraint, the learned demixing matrix is re-orthogonalized after each step, which leads to a fixed-point ICA algorithm. Its computational complexity is low and its convergence is fast.
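A FastICA-style sketch of this scheme, assuming a tanh nonlinearity and symmetric re-orthogonalization (the abstract does not specify which fixed-point update or re-orthogonalization the dissertation actually uses):

```python
import numpy as np

def sym_orth(W):
    """Symmetric re-orthogonalization: W <- (W W^T)^(-1/2) W."""
    u, _, vt = np.linalg.svd(W, full_matrices=False)
    return u @ vt

def fixed_point_ica(X, n_iter=100, seed=0):
    """Fixed-point ICA sketch (tanh nonlinearity) on whitened data."""
    X = X - X.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(np.cov(X))
    V = E @ np.diag(d ** -0.5) @ E.T      # whitening matrix
    Z = V @ X                             # whitened: demixing matrix is orthogonal
    n = Z.shape[1]

    rng = np.random.default_rng(seed)
    W = sym_orth(rng.standard_normal((Z.shape[0],) * 2))
    for _ in range(n_iter):
        Y = W @ Z
        g, gp = np.tanh(Y), 1.0 - np.tanh(Y) ** 2
        W = (g @ Z.T) / n - np.diag(gp.mean(axis=1)) @ W  # fixed-point update
        W = sym_orth(W)                   # restore the orthogonality constraint
    return W @ V                          # total demixing matrix

rng = np.random.default_rng(2)
S = rng.laplace(size=(2, 20_000))
A = np.array([[1.0, 0.5], [0.3, 1.0]])
W = fixed_point_ica(A @ S)
Y = W @ (A @ S)                           # recovered up to sign/permutation
```

The re-orthogonalization after every step is exactly the projection back onto the constraint set that the abstract describes; without it the fixed-point update would drift off the orthogonal group.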
     5. The ICA algorithm is applied to denoising sound signals and image data. The independent components of the noisy signal are obtained by the ICA algorithm, the noise is removed in the independent-component domain, and the denoised signal is obtained by the inverse transform.
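A toy illustration of the shrinkage step on a single synthetic sparse component, using soft-thresholding (the actual shrinkage operator and ICA transform used in the dissertation are not specified in the abstract, and the sparsity model and threshold here are illustrative assumptions):

```python
import numpy as np

def soft_shrink(y, t):
    """Soft-thresholding: pull small (noise-dominated) coefficients to zero,
    keep large (signal-dominated) ones, shrunk by t."""
    return np.sign(y) * np.maximum(np.abs(y) - t, 0.0)

rng = np.random.default_rng(3)
n = 100_000
active = rng.random(n) < 0.1                       # sparse: mostly zeros
s = np.where(active, rng.normal(0.0, np.sqrt(10.0), n), 0.0)
sigma = 1.0
y = s + rng.normal(0.0, sigma, n)                  # noisy independent component

s_hat = soft_shrink(y, 2.0 * sigma)                # illustrative ~2-sigma threshold
mse_noisy = np.mean((y - s) ** 2)
mse_denoised = np.mean((s_hat - s) ** 2)           # clearly below mse_noisy
```

Shrinkage helps precisely because ICA maps the data into a domain where the signal is sparse while the noise is spread across all coefficients.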
