Research on Particle Swarm Based Optimization Methods
Abstract
This thesis applies the Particle Swarm Optimization (PSO) algorithm to the parameter optimization of Hidden Markov Models (HMM), fuzzy clustering, K-harmonic means (KHM) clustering and the Flexible Job-shop Scheduling Problem (FJSP), and proposes corresponding optimization algorithms. The main research work includes:
     (1) HMM parameter optimization based on PSO. The Baum-Welch (BW) algorithm is the classical method for HMM parameter optimization; as a gradient-based local optimization procedure, it is easily trapped in local optima. This thesis therefore proposes a continuous-HMM parameter optimization algorithm (PSOBW) that combines the PSO algorithm with the BW algorithm. Experimental results show that PSOBW not only improves the convergence speed of the PSO algorithm but also helps the BW algorithm escape from local optima, and that it clearly outperforms the corresponding continuous-HMM optimization algorithm (GABW) based on the genetic algorithm (GA) and the BW algorithm.
     (2) Fuzzy clustering based on PSO. To overcome the tendency of the FCM algorithm to fall into local optima, a hybrid fuzzy clustering algorithm (HPSOFCM) combining the PSO algorithm with the FCM algorithm is proposed. To compare the PSO-based hybrid with a fuzzy clustering algorithm based on Differential Evolution (DE), the PSO component of HPSOFCM is replaced by the DE algorithm, yielding the hybrid algorithm HDEFCM. Experimental results show that both HPSOFCM and HDEFCM improve the performance of the FCM algorithm to some extent, and that HPSOFCM is faster than HDEFCM.
     (3) KHM clustering based on PSO. To overcome the tendency of the KHM algorithm to fall into local optima, a hybrid clustering algorithm (PSOKHM) combining the PSO algorithm with the KHM algorithm is proposed. Experimental comparisons of PSOKHM, KHM and PSO show that PSOKHM both improves the convergence speed of the PSO algorithm and effectively helps the KHM algorithm escape from local optima.
     (4) Solving the FJSP based on PSO. A discrete particle swarm algorithm for the FJSP is proposed in which each solution is represented by two vectors. According to the characteristics of the problem and the structure of this representation, particle movement is realized by borrowing the crossover and mutation ideas of the genetic algorithm, and crossover and mutation operators are designed for this specific problem. Experimental results demonstrate the effectiveness of the proposed algorithm.
Optimization is an important subdiscipline of mathematics. An optimization problem is the problem of finding the best solution among all feasible solutions. In mathematics, the term optimization refers to the study of problems in which one seeks to minimize or maximize a real function by systematically choosing the values of real or integer variables from within an allowed set, where the allowed set is often specified by constraints in the form of equalities or inequalities. Classical optimization methods include analytical methods and iterative methods. Analytical methods derive systems of equations and inequalities from the necessary optimality conditions, using mathematical techniques such as differential calculus, the calculus of variations and Lagrange's method of multipliers, and then solve these systems to obtain the solution. Iterative methods exploit characteristics of the objective function and approach the solution by successive procedures starting from an initial guess.
     The classical methods require the optimization problem to satisfy strict mathematical properties. The optimization problems encountered in many application fields, however, are usually very complicated: they may have more than one locally optimal solution, and their objective functions may be nonconvex, nondifferentiable or even impossible to express analytically. The classical optimization methods are not adequate for solving such problems.
     Evolutionary computation, as a new intelligent optimization technique, is widely applied in many fields of science and engineering. By simulating natural phenomena, evolutionary computation surpasses classical optimization methods in global search ability, in its capacity to handle complicated problems and in its broad applicability. PSO is a new swarm intelligence technique that simulates the behavior of bird flocking and forms a new branch of evolutionary computation. Since PSO was proposed, it has attracted increasing attention from researchers in the optimization field owing to its quick convergence, simple principle, ease of implementation and small number of parameters to be adjusted. This thesis therefore investigates Hidden Markov Model (HMM) parameter optimization problems, fuzzy clustering problems, K-Harmonic Means (KHM) clustering problems and the Flexible Job-shop Scheduling Problem (FJSP) based on the PSO algorithm, and proposes corresponding algorithms to solve these optimization problems. The main contributions and research achievements of this thesis include:
     1. Surveying research on the PSO algorithm. Firstly, the author describes the PSO procedure and analyzes the effects of its parameters on performance. Secondly, the author introduces population topologies, several PSO variants and the main theoretical results. Finally, the author systematically analyzes open questions in PSO research.
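     For reference, the canonical PSO update analyzed in this survey chapter can be written, in standard notation (the exact variant studied in the thesis may differ, e.g. by using a constriction factor), as
$$
v_{id}^{t+1} = w\,v_{id}^{t} + c_1 r_1 \left(p_{id} - x_{id}^{t}\right) + c_2 r_2 \left(g_{d} - x_{id}^{t}\right),
\qquad
x_{id}^{t+1} = x_{id}^{t} + v_{id}^{t+1},
$$
where $x_{id}$ and $v_{id}$ are the position and velocity of particle $i$ in dimension $d$, $p_{id}$ is its personal best position, $g_{d}$ is the global (or neighborhood) best position, $w$ is the inertia weight, $c_1, c_2$ are acceleration coefficients and $r_1, r_2$ are uniform random numbers in $[0,1]$.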
     2. Studying HMM parameter optimization based on the PSO algorithm. As a statistical model, the HMM has been widely used in speech signal processing and pattern recognition, and optimizing the model parameters is one of the important problems in this area. The Baum-Welch (BW) algorithm is a popular estimation method owing to its reliability and efficiency; however, because it relies on a gradient-descent-style local search, it is easily trapped in local optima. To overcome this shortcoming of the BW algorithm, we propose a new optimization method (PSOBW) based on the PSO algorithm and the BW algorithm. The sentence correct rate, the word correct rate and the word accuracy rate, the criteria used to evaluate speech recognition results in the HTK Toolkit 3.3, are used to measure the performance of the algorithms. Experiments on data from the Census (AN4) database show that the PSOBW algorithm not only overcomes the slow convergence of the PSO algorithm but also helps the BW algorithm escape from local optima. In addition, we compare the PSOBW algorithm with the optimization method (GABW) based on the genetic algorithm and the BW algorithm; the experimental results indicate that PSOBW is superior to GABW in recognition performance.
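     As a rough structural sketch of how a particle swarm can drive HMM parameter estimation, the Python fragment below encodes the parameters of a small discrete-observation HMM in each particle and scores particles with the scaled forward algorithm. It is only an illustration under simplifying assumptions: the thesis works with continuous (Gaussian-emission) HMMs on speech data and interleaves Baum-Welch re-estimation with the swarm search, which is omitted here, and all identifiers are ours rather than the author's.

```python
# Sketch only: PSO over the parameters of a small discrete-observation HMM.
# The thesis's PSOBW uses continuous HMMs and interleaves Baum-Welch steps.
import numpy as np

rng = np.random.default_rng(0)
N, M = 3, 4                         # hidden states, observation symbols
T = 50                              # length of the toy observation sequence
obs = rng.integers(0, M, size=T)    # stand-in for real training data

def decode(position):
    """Map a flat particle position to row-stochastic HMM parameters (pi, A, B)."""
    x = np.abs(position) + 1e-12
    pi = x[:N] / x[:N].sum()
    A = x[N:N + N * N].reshape(N, N)
    A /= A.sum(axis=1, keepdims=True)
    B = x[N + N * N:].reshape(N, M)
    B /= B.sum(axis=1, keepdims=True)
    return pi, A, B

def log_likelihood(position):
    """Scaled forward algorithm: log P(obs | pi, A, B), used as the fitness."""
    pi, A, B = decode(position)
    alpha = pi * B[:, obs[0]]
    ll = np.log(alpha.sum())
    alpha /= alpha.sum()
    for t in range(1, T):
        alpha = (alpha @ A) * B[:, obs[t]]
        ll += np.log(alpha.sum())
        alpha /= alpha.sum()
    return ll

# Standard global-best PSO with inertia weight over the flattened parameter vector.
dim = N + N * N + N * M
n_particles, iters = 20, 100
w, c1, c2 = 0.72, 1.49, 1.49
X = rng.random((n_particles, dim))
V = np.zeros_like(X)
pbest = X.copy()
pbest_val = np.array([log_likelihood(x) for x in X])
gbest = pbest[pbest_val.argmax()].copy()

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dim))
    V = w * V + c1 * r1 * (pbest - X) + c2 * r2 * (gbest - X)
    X = X + V
    vals = np.array([log_likelihood(x) for x in X])
    improved = vals > pbest_val
    pbest[improved], pbest_val[improved] = X[improved], vals[improved]
    gbest = pbest[pbest_val.argmax()].copy()
    # PSOBW would additionally refine gbest (or each particle) with a few
    # Baum-Welch re-estimation iterations at this point before continuing.

print("best log-likelihood found:", pbest_val.max())
```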
     3. Studying the fuzzy clustering technique based on the PSO algorithm. One of the most widely used fuzzy clustering models is fuzzy c-means (FCM). FCM is in essence a local search technique that seeks the optimum by hill climbing and is therefore prone to converge to a locally optimal solution. A hybrid data clustering algorithm based on the PSO algorithm and the FCM algorithm, called HPSOFCM, is proposed in this research; it makes full use of the merits of both algorithms. Moreover, we compare the HPSOFCM algorithm with a hybrid fuzzy clustering method (HDEFCM) based on the differential evolution algorithm and the FCM algorithm. The performances of the HPSOFCM and HDEFCM algorithms are compared with that of the FCM algorithm on two artificial data sets and four real data sets, using the FCM objective function values, the Xie-Beni indices and the runtimes. Although the FCM algorithm is the fastest, it obtains neither the best objective function values nor the best Xie-Beni indices. The HDEFCM algorithm always obtains the best Xie-Beni indices and good objective function values, while the HPSOFCM algorithm always obtains the best objective function values and is faster than HDEFCM. Both hybrid algorithms help the FCM algorithm escape from local optima.
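     For concreteness, the two quality measures used in this comparison are the standard FCM objective and the Xie-Beni validity index, which in common notation (the symbols here are not necessarily those of the thesis) read
$$
J_m(U, V) = \sum_{i=1}^{n}\sum_{j=1}^{c} u_{ij}^{m}\,\lVert x_i - v_j \rVert^{2},
\qquad
XB(U, V) = \frac{\sum_{i=1}^{n}\sum_{j=1}^{c} u_{ij}^{2}\,\lVert x_i - v_j \rVert^{2}}{n\,\min_{j \neq k} \lVert v_j - v_k \rVert^{2}},
$$
where $u_{ij}$ is the membership degree of sample $x_i$ in cluster $j$, $v_j$ is the $j$-th cluster center and $m > 1$ is the fuzzifier; for both quantities, smaller values indicate a better partition.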
     4. Studying the K-harmonic means clustering technique based on the PSO algorithm. K-harmonic means (KHM) clustering is a center-based iterative clustering algorithm that alleviates the initialization problem through a built-in boosting function, but it also easily runs into local optima. To address this shortcoming of the KHM algorithm, a hybrid data clustering algorithm (PSOKHM) based on the PSO algorithm and the KHM algorithm is proposed in this research, which makes full use of the advantages of both algorithms. The performances of the PSOKHM and PSO algorithms are compared with that of the KHM algorithm on two artificial data sets and five real data sets, using the objective function value, the F-measure and the runtime. Experimental results indicate the superiority of the PSOKHM algorithm: it not only helps KHM clustering escape from local optima but also overcomes the slow convergence of the PSO algorithm.
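     The objective minimized by KHM, and hence by the KHM component of PSOKHM, is (in the usual notation of Zhang et al.) the sum over all data points of the harmonic average of the $p$-th powers of their distances to the $k$ centers:
$$
KHM(X, C) = \sum_{i=1}^{n} \frac{k}{\displaystyle\sum_{j=1}^{k} \frac{1}{\lVert x_i - c_j \rVert^{p}}}, \qquad p \ge 2.
$$
Unlike the K-means objective, which charges each point only to its nearest center, this harmonic average lets every center exert some influence on every point, which is the source of KHM's reduced sensitivity to initialization.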
     5. Studying a method for solving the Flexible Job-shop Scheduling Problem (FJSP) based on the PSO algorithm. The FJSP extends the classical Job-shop Scheduling Problem (JSP): whereas in the JSP every operation of a job is assigned to a unique predetermined machine, in the FJSP each operation may be processed on any machine from a given set, which makes the FJSP harder than the JSP. A novel discrete PSO algorithm for solving the FJSP is proposed in this research, which uses two vectors to represent a particle. We borrow the crossover and mutation operators of the genetic algorithm to redefine the motion equations of a particle, and we design a new crossover operator and a new mutation operator for this specific problem. Experiments on two benchmark problems show that the proposed algorithm is comparable with the AL+CGA method and superior to the temporal decomposition method, the classic genetic algorithm and the AL method with respect to makespan.
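     To make the two-vector representation concrete, the sketch below encodes a solution of a tiny invented FJSP instance as a machine-assignment vector plus a job-based operation-sequence vector, decodes it into a schedule to obtain the makespan, and applies a generic one-point crossover and swap mutation. The instance, the operators and all identifiers are illustrative assumptions; the thesis designs its own crossover and mutation operators, which are not reproduced here.

```python
# Sketch of a two-vector FJSP solution: one vector assigns each operation to a machine,
# the other orders operations by job id (each job id repeated once per operation).
# Toy instance and generic operators only; not the operators proposed in the thesis.
import random

# proc[j][o] maps machine id -> processing time for operation o of job j (flexible routing).
proc = [
    [{0: 3, 1: 5}, {1: 2, 2: 4}],          # job 0: two operations
    [{0: 4, 2: 3}, {0: 2, 1: 6}, {2: 3}],  # job 1: three operations
]

def random_solution():
    assign = [random.choice(list(ops.keys())) for job in proc for ops in job]
    seq = [j for j, job in enumerate(proc) for _ in job]
    random.shuffle(seq)                     # job-based permutation with repetition
    return assign, seq

def makespan(assign, seq):
    """Greedy (semi-active) decoding: place operations in sequence order."""
    op_index, k = {}, 0
    for j, job in enumerate(proc):
        for o in range(len(job)):
            op_index[(j, o)] = k
            k += 1
    next_op = [0] * len(proc)               # next unscheduled operation of each job
    job_ready = [0] * len(proc)
    mach_ready = {}
    cmax = 0
    for j in seq:
        o = next_op[j]
        m = assign[op_index[(j, o)]]
        start = max(job_ready[j], mach_ready.get(m, 0))
        end = start + proc[j][o][m]
        job_ready[j], mach_ready[m] = end, end
        next_op[j] += 1
        cmax = max(cmax, end)
    return cmax

def crossover(parent, guide):
    """Placeholder one-point crossover on the machine-assignment vector."""
    a1, s1 = parent
    a2, _ = guide
    cut = random.randrange(1, len(a1))
    return a1[:cut] + a2[cut:], s1[:]

def mutate(solution):
    """Placeholder mutation: swap two positions of the sequence vector."""
    assign, seq = solution
    i, j = random.sample(range(len(seq)), 2)
    seq = seq[:]
    seq[i], seq[j] = seq[j], seq[i]
    return assign[:], seq

best = random_solution()
for _ in range(200):                        # crude search loop, only to exercise the encoding
    cand = mutate(crossover(random_solution(), best))
    if makespan(*cand) < makespan(*best):
        best = cand
print("best makespan found:", makespan(*best))
```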
     In the final part of the thesis, the author summarizes the whole research work and discusses directions for future research.