Research on Clustering and Dimensionality Reduction Based on t-Mixture Models and Extended Locality Preserving Projections
Abstract
Pattern recognition, an interdisciplinary field, has developed rapidly over the past few decades. It has attracted not only enthusiastic research effort but also attention from governments and organizations at many levels: defense and public-security agencies and industry in many countries and regions have invested heavily in pattern recognition research. Its progress will have a far-reaching impact on science and technology, national defense, public safety, industrial manufacturing, and everyday life.
     Building on statistical theory and graph-spectral methods, this dissertation focuses on two topics in pattern recognition, clustering and dimensionality reduction: (i) in clustering, an in-depth study of parameter-estimation methods for finite mixture models, with application to image segmentation; (ii) in dimensionality reduction, a study of locality preserving projections (LPP) and its two-dimensional and linear-mixture extensions, with application to image recognition. The main contributions are as follows.
     First, we study parameter estimation for finite mixtures of multivariate t-densities and construct an SMEM (split-and-merge EM) algorithm for multivariate t-mixture models. The t-density has heavier tails and good robustness to noise, making it the standard alternative to the Gaussian density. The EM algorithm is the usual tool for estimating mixture-density parameters, but conventional EM often converges to a local rather than the global optimum. Following the idea of splitting and merging components so that the parameters can escape local optima, we construct an SMEM algorithm for multivariate t-mixture models and propose a split-and-merge criterion based on sample means and variances. Experiments confirm that the algorithm performs well.
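The EM updates that SMEM repeatedly invokes between split and merge moves can be sketched as follows. This is a minimal, illustrative version only, not the dissertation's full algorithm: univariate, two components, fixed degrees of freedom, using the standard latent-scale weights u = (ν+1)/(ν+δ) of Peel and McLachlan's t-mixture EM; the function name and simulated data are ours.

```python
# Minimal sketch: EM for a two-component univariate t-mixture with fixed
# degrees of freedom nu (illustrative, not the dissertation's full SMEM).
import numpy as np
from scipy.stats import t as t_dist

def em_t_mixture(x, nu=3.0, n_iter=100):
    K = 2
    pi = np.full(K, 1.0 / K)
    mu = np.quantile(x, [0.25, 0.75])      # crude deterministic init
    sigma = np.full(K, x.std())
    for _ in range(n_iter):
        # E-step: responsibilities tau and latent scale weights u
        dens = np.stack([pi[j] * t_dist.pdf(x, df=nu, loc=mu[j], scale=sigma[j])
                         for j in range(K)])              # shape (K, N)
        tau = dens / dens.sum(axis=0, keepdims=True)
        delta = ((x[None, :] - mu[:, None]) / sigma[:, None]) ** 2
        u = (nu + 1.0) / (nu + delta)      # downweights outlying points
        # M-step: u-weighted mean and variance updates
        pi = tau.mean(axis=1)
        w = tau * u
        mu = (w * x).sum(axis=1) / w.sum(axis=1)
        sigma = np.sqrt((w * (x[None, :] - mu[:, None]) ** 2).sum(axis=1)
                        / tau.sum(axis=1))
    return pi, mu, sigma

rng = np.random.default_rng(1)
x = np.concatenate([rng.standard_t(3, 300) - 5.0, rng.standard_t(3, 300) + 5.0])
pi, mu, sigma = em_t_mixture(x)
```

A split or merge step in SMEM would perturb the (pi, mu, sigma) of the selected components and then rerun exactly these updates.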
     Second, based on the local Kullback divergence, we build a greedy EM framework for multivariate t-mixture models. Starting from a single component, it splits the worst-fitting component one at a time and refines all component parameters with EM. Compared with SMEM, greedy EM has several advantages: it is easy to initialize, it is faster with comparable performance, and the sequence of mixture models it produces is convenient for model selection. Experiments verify that greedy EM is faster than SMEM with comparable performance.
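The greedy loop can be sketched as below. This is an illustrative stand-in, not the dissertation's algorithm: Gaussian components replace t-densities for brevity, and candidate locations for the new component are scored by full-data log-likelihood rather than the local Kullback divergence; all names are ours.

```python
# Greedy mixture learning sketch: grow from 1 to k_max components, inserting
# each new component at the best of several candidate sample locations.
import numpy as np

def norm_pdf(x, mu, sig):
    return np.exp(-0.5 * ((x - mu) / sig) ** 2) / (sig * np.sqrt(2.0 * np.pi))

def em_refine(x, pi, mu, sig, n_iter=50):
    # standard EM updates for a univariate Gaussian mixture
    for _ in range(n_iter):
        dens = pi[:, None] * norm_pdf(x[None, :], mu[:, None], sig[:, None])
        tau = dens / (dens.sum(axis=0, keepdims=True) + 1e-300)
        pi = tau.mean(axis=1)
        mu = (tau * x).sum(axis=1) / tau.sum(axis=1)
        sig = np.sqrt((tau * (x[None, :] - mu[:, None]) ** 2).sum(axis=1)
                      / tau.sum(axis=1))
        sig = np.maximum(sig, 1e-3)        # guard against collapse
    return pi, mu, sig

def loglik(x, pi, mu, sig):
    dens = (pi[:, None] * norm_pdf(x[None, :], mu[:, None], sig[:, None])).sum(0)
    return np.log(dens + 1e-300).sum()

def greedy_em(x, k_max=3, n_cand=12, seed=0):
    rng = np.random.default_rng(seed)
    # start from the single-component maximum-likelihood fit
    pi, mu, sig = np.array([1.0]), np.array([x.mean()]), np.array([x.std()])
    for k in range(2, k_max + 1):
        best = None
        # try several sample locations for the new component, keep the best
        for c in rng.choice(x, size=n_cand, replace=False):
            trial = em_refine(x,
                              np.append(pi * (1.0 - 1.0 / k), 1.0 / k),
                              np.append(mu, c),
                              np.append(sig, x.std() / k))
            if best is None or loglik(x, *trial) > loglik(x, *best):
                best = trial
        pi, mu, sig = best
    return pi, mu, sig

rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(-6, 1, 200), rng.normal(0, 1, 200),
                    rng.normal(6, 1, 200)])
pi, mu, sig = greedy_em(x)
```

The intermediate fits for k = 1, 2, ... are exactly the model sequence the abstract mentions as being convenient for model selection.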
     Third, we study locality preserving projections (LPP) and its solution in the small-sample-size case, and propose an LPP/QR algorithm. LPP preserves local information in the data and can uncover the intrinsic structure of the data manifold, but in the small-sample case the matrices involved are singular and LPP cannot be applied directly. We propose locality preserving projections based on QR decomposition (LPP/QR).
    [199] H. L. Xiong, M.N.S. Swany, M.O. Ahmad. Two-dimensional FLD for face recognition [J]. Pattern Recognition. 2005, 38: 1121-1124
    [200] X. Y. Jing, H. S. Wong, D. Zhang. Face recognition based on 2D Fisherface approach[J]. Pattern Recognition. 2006, 39(4): 707-710
    [201] P. Nagabhushan, D. S. Guru, B. H. Shekar. (2D)~2FLD: An efficient approach for appearance based object recognition[J]. Neurocomputing. 2006, 69(7-9): 934-940
    [202] D. D. Lee, H. S. Seung. Algorithms for non-negative matrix factorization[C]. Neural Information Processing Systems. 2000, vol. 13, 556-562
    [203] S. Z. Li, X. W. Hou, H.J. Zhang, Q.S. Cheng. Learning Spatially Localized, Parts-Based Representation[C]. IEEE Conf. on Computer Vision and Pattern Recognition. 2001, vol. Ⅰ, 207-212
    [204] W. Xu, X. Liu, Y. H. Gong. Document Clustering Based On Nonnegative Matrix Factorization[C]. ACM SIGIR Special Interest Group on Information Retrieval. 2003, 267-273
    [205] D. Guillamet, J. Vitria, B. Schiele. Introducing a weighted non-negative matrix factorization for image classification[J]. Pattern Recognition Letters. 2003, 24: 2447-2454
    [206] Y. Wang, Y. D. Jia, C. B. Hu, M. Turk. Fisher non-negative matrix factorization for learning local fearures[C]. Asian Conference on Computer Vision. 2004, 806-811
    [207] D. Q. Zhang, S. C. Chen, Z. H. Zhou. Two-dimensional non-negative matrix factorization for face representation and recognition[C]. Proceedings of the 2nd International Workshop on Analysis and Modeling of Faces and Gestures (AMFG'05), in Conjunction with ICCV'05. 2005, vol. LNCS 3723, 350-363
    [208] W. S. Torgerson. Multidimensional Scaling Ⅰ: Theory and Method[J]. Psychometrika. 1952, 17: 401-419
    [209] Shepard. Analysis of proximities: Multidimensional scaling with an unknown distance function Ⅰ & Ⅱ[J]. Psychometrika. 1962, 27:125-140, 219-246
    [210] Y. Bengio, J. F. Paiement, P. Vincent. Out-of-sample extensions for LLE, Isomap, MDS, Eigenmaps and spectral clustering[C]. Neural Information Processing Systems. 2004, vol. 16
    [211] M. Balasubramanian, E. L. Schwartz, J. B. Tenenbaum, V. Silva, J. C. Langford. The Isomap Algorithm and Topological Stability[J]. Science. 2002, 295(5552):7a
    [212] R. Pless. Image Spaces and Video Trajectories: Using Isomap to Explore Video Sequences[C]. Int'l. Conf. on Computer Vision. 2003, 1433-1440
    [213] V. Silva, J. B. Tenenbaum. Global versus lacal methods in nonlinear dimensionality reduction[C]. Proc. Neural Information Processing Systems. 2002, vol. 15, 705-712
    [214] G. Ostrouchov, N. F. Samatova. On FastMap and the Convex Hull of Multivariate Data: Toward Fast and Robust Dimension Reduction[J]. IEEE Trans Pattern Analysis and Machine Intelligence. 2005, 27(8): 1340-1343
    [215] L. K. Saul, S. T. Roweis. Think globally, fit locally: unsupervised learning of low dimensional manifolds[J]. Journal of Machine Learning Research. 2003, 4: 119-155
    [216] D. L. Donoho, C. Grimes. Hessian eigenmaps: Locally linear embedding techniques for high-dimensional data[J]. Proceedings of the National Academy of Sciences of USA (PNAS). 2003, 100(10): 5591-5596
    [217] O. Kouropteva, O. Okun, M. Pietikainen. Incremental locally linear embedding[J]. Pattern Recognition. 2005, 38:1764-1767
    [218] D. Liang, J. Yang, Z.L. Zheng, Y.C. Chang. A facial expression recognition system based on supervised locally linear embedding[J]. Pattern Recognition Letters. 2005, 26: 2374-2389
    [219] H. Chang, D. Y. Yeung. Robust locally linear embedding[J]. Pattern Recognition. 2006, 39(6): 1053-1065
    [220] S. C. Chen, T. K. Sun. Class-information-incorporated principal component analysis[J]. Neurocomputing. 2005, 69: 216-223
    [221] K. R. Tan, S. C. Chen. Adaptively weighted sub-pattern PCA for face recognition[J]. Neurocomputing. 2005, 64: 505-511
    [222] H. C. Kim, K. Kim, S. Y. Bang, S.Y. Lee. Face Recognition Using The Second-Order Mixture-Of-Eigenfaces Method[J]. Pattern Recognition. 2004, 37:337-349
    
    [223] H. C. Kim, K. Kim, S. Y. Bang. Face recognition using the mixture-ofeigenfaces method[J]. Pattern Recognition Letters. 2002, 23:1549-1558
    
    [224] X. F. He, D. Cai, H. F. Liu, W. Y. Ma. Locality Preserving Indexing for Document Representation[C]. Proceedings of the 27th annual international ACM SIGIR. conference on Research and development in information re- trieval. 2004, 96-103
    
    [225] D. Cai, X. F. He. Orthogonal Locality Preserving Indexing[C]. ACM Con- ference on Information Retrieval (SIGIR). 2005, 3-10
    
    [226] W. L. Min, K. Lu, X. F. He. Locality pursuit embedding [J]. Pattern Recog- nition. 2004, 37:781-788
    
    [227] X. Zheng, D. Cai, X. F. He, W. Y. Ma, X. Y. Lin. Locality Preserving Clustering for Image Database[C]. Proceedings of the 12th annual ACM international conference on Multimedia. 2004, 885-891
    
    [228] B. Luo, R. C. Wilson, E. R. Hancock. Graph pattern spaces from Laplacian spectral polynomials[C]. International Conference on Image Analysis and Recognition. 2004, 327-334
    
    [229] X. F. He. Incremental semi-supervised subspace learning for image re- trieval [C]. Proceedings of the 12th annual ACM international conference on Multimedia. 2004, 2-8
    
    [230] K. Lu, X. F. He. Image retrieval based on incremental subspace learning[J]. Pattern Recognition. 2005, 38:2047-2054
    
    [231] J. Cheng, Q. S. Liu, H. Q. Lu, Y. W. Chen. Supervised kernel locality pre- serving projections for face recognition [J]. Neurocomputing. 2005, 67:443- 449
    
    [232] M. S. Bartlett, J. R. Movellan, T. J. Sejnowski. Face recognition by in- dependent component analysis [J]. IEEE Trans Neural Networks. 2002, 13(6):1450-1464
    
    [233] P. C. Yuen, J. H. Lai. Face representation using independent component analysis [J]. Pattern Recognition. 2002, 35:1247-1257
    
    [234] K. Q. Weinberger, L. K. Saul. Unsupervised Learning of Image Manifolds by Semidefinite Programming[C]. IEEE Conf. on Computer Vision and Pattern Recognition. 2004, vol. 2, 988-995
    [235] L. Vandenberghe, S. P. Boyd. Semidefinite programming[J]. SIAM Review. 1996, 38(1): 49-95
    [236] X. F. He, D. Cai, S. C. Yan, H. J. Zhang. Neighborhood Preserving Embedding[C]. IEEE International Conference on Computer Vision (ICCV). 2005, 1208-1213
    [237] D. Cohn. Informed Projections[C]. Advances in Neural Information Procesing Systems. 2002, vol. 15, 849-856
    [238] S. Dasgupta. Experiments with Random Projection[C]. Sixteenth Conference on Uncertainty in Artificial Intelligence. 2000, 143-151
    [239] A. Levin, A. Shashua. Principal Component Analysis over Continuous Subspaces and Intersection of Half-spaces[C]. Proc. of the European Conference on Computer Vision (ECCV). 2002, 635-650
    [240] B. Kegl, A. Krzyzak. Piecewise Linear Skeletonization Using Principal Curves[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence. 2002, 24(1): 59-74
    [241] D. C. Stanford, A. E. Raftery. Finding Curvilinear Features in Spatial Point Patterns: Principal Curve Clustering with Noise[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence. 2000, 22(6): 601-609
    [242] N. Kambhatla, T. K. Leen. Dimension Reduction By Local Principal Component Analysis[J]. Neural Computation. 1997, 9: 1493-1516
    [243] B. L. Zhang, M. Y. Fu, H. Yan. A nonlinear neural network model of mixture of local principal component analysis: application to handwritten digits recognition[J]. Pattern Recognition. 2001, 34(2): 203-214
    [244] G. E. Hinton, M. Revow, P. Dayan. Recognizing Handwritten Digits Using Mixtures of Linear Models[C]. Neural Information Processing Systems. 1994, 1015-1022
    [245] B. J. Frey, A. Colmenarez, T. S. Huang. Mixtures of local linear subspaces for face recognition[C]. Proceedings of IEEE International Conference on Computer Vision and Patter Recognition. 1998, 32-37
    [246] H. C. Kim, D. Kim. S. Y. Bang. Face recognition using LDA mixture model[J]. Pattern Recognition Letters. 2003, 24: 2815-2821
    [247] H. C. Kim. D. Kim, S. Y. Bang. Extensions of LDA by PCA mixture model and class-wise features[J]. Pattern Recognition. 2003, 36: 1095-1105
    [248] D. Ridder, J. Kittler, R. P. W. Duin. Probabilistic PCA and ICA subspace mixture models for image segmentation. The British Machine Vision Conference, 2000. 112-121
    [249] D. S. Turaga, T. Chen. Face Recognition Using Mixtures Of Principal Components[C]. Int'l. Conf. on Image Processing. 2002, vol. 2, 101-104
    [250] H. C. Kim, D. Kim, S. Y. Bang. An efficient model order selection for PCA mixture model[J]. Pattern Recognition Letters. 2003, 24:1385-1393
    [251] A. A. Al-Shaher, E. R. Hancock. Learning mixtures of point distribution models with the EM algorithm [J]. Pattern Recognition. 2003, 36:2805- 2818
