Research and Application of Intelligent Computing
Abstract
This thesis studies the theory and applications of intelligent computing. To address the loss of sparseness in the solutions of least squares support vector machines (LS-SVMs), an adaptive iterative LS-SVM regression algorithm is proposed by combining the incremental and decremental LS-SVM learning algorithms. The algorithm adaptively determines the number of support vectors, preserves the sparseness that QP-based SVM training provides, and, at comparable accuracy, greatly improves the learning speed of the LS-SVM. The application of intelligent computing to content-based music retrieval is then studied. Vocal segments in music are extracted using two time-domain features of the audio signal, the short-time zero-crossing rate and the short-time energy; building on the traditional pitch-extraction scheme, melodic features are described with a quantized-interval descriptor, and the similarity of two melodies is computed with a string-matching algorithm. A query-by-humming prototype system was implemented on this basis and then improved with intelligent computing techniques into an intelligent query-by-humming system. A vocal-segment detection method combining clustering with SVM classification is proposed: the results of clustering the training samples are used to select training samples for the SVM, reducing its training time. The proposed adaptive iterative LS-SVM regression algorithm is used to realize regression-based classification. With fuzzy clustering, the cluster centroids of an audio signal serve as its feature description, and the similarity of two audio signals is computed by comparing their cluster centroids. To remedy the long running time of the basic ant colony algorithm and its tendency to fall into local optima, the algorithm is improved in both execution efficiency and solution quality. The parameters of the improved algorithm were tuned experimentally, and the improved ant colony algorithm was applied to the one-dimensional cutting stock problem.
Research and Application of Intelligent Computing
     Intelligent computing arises from the inspiration people draw from the laws of nature (the biosphere): problem-solving algorithms are modeled on natural principles, and through learning and self-organization, information is synthesized, generalized, and reasoned about in order to build mathematical models in the symbolist, connectionist, or behaviorist tradition. Using intelligent computing to achieve intelligent information processing differs markedly from traditional information processing; it is a novel "soft computing" approach. Intelligent computing is a cross-disciplinary field involving physics, mathematics, physiology, psychology, neuroscience, computer science, and intelligent technology. As the second generation of artificial intelligence methods, intelligent computing encompasses neural networks, evolutionary computation, genetic algorithms, immune systems, ecology, artificial life, and related theories. These methods find wide application across many disciplines: they not only provide new scientific logic and research methods for artificial intelligence and cognitive science, but also offer effective processing techniques for information science. In-depth research on intelligent computing therefore has important theoretical significance and broad application prospects.
     This thesis studies the theory and applications of intelligent computing.
     Based on statistical learning theory and the structural risk minimization principle, Vapnik and his collaborators established support vector machine (SVM) theory, which was introduced into the field of machine learning at the Conference on Computational Learning Theory in 1992. Compared with traditional machine learning methods, the SVM generalizes well when learning from small samples and has no local-minimum problem. The SVM was originally proposed for pattern recognition; with Vapnik's introduction of the ε-insensitive loss function, it was extended to nonlinear regression and curve fitting, yielding the support vector regression method, which also demonstrates very good learning performance.
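The ε-insensitive loss mentioned above can be sketched in a few lines; this is an illustrative definition of Vapnik's loss function, not code from the thesis:

```python
def epsilon_insensitive_loss(y_true, y_pred, eps=0.1):
    """Vapnik's epsilon-insensitive loss: residuals inside the
    eps-tube around the target incur zero cost, so only points
    outside the tube become support vectors in SVM regression."""
    residual = abs(y_true - y_pred)
    return max(0.0, residual - eps)
```

Because points inside the tube contribute nothing to the loss, the regression solution depends only on the samples outside it, which is the source of the sparseness discussed below.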
     However, training the standard SVM ultimately reduces to solving a quadratic programming (QP) problem with linear inequality constraints. For large data sets, the QP consumes substantial time and memory, which limits the application of SVMs to large-scale problems. The regression problem is particularly affected: the corresponding QP has 2n+1 unknowns (n being the number of training samples), roughly double that of the classification problem, which further increases the difficulty of solving large-scale instances.
     Suykens and others proposed the least squares support vector machine (LS-SVM), which ingeniously replaces the inequality constraints of the standard SVM with equality constraints. The QP problem is thereby converted into a system of linear equations, which greatly reduces the difficulty of SVM training and improves its speed. In the LS-SVM, however, almost every training sample becomes a support vector, so the sparseness advantage of the SVM solution is lost. To address this shortcoming, this thesis combines the incremental and decremental learning algorithms, proposes an adaptive iterative LS-SVM regression algorithm, and gives a proof of the reduced-order matrix inversion it relies on. Given a target learning precision and the size of the training set, the algorithm adaptively determines the number of support vectors, restoring sparseness to LS-SVM regression while greatly enhancing its learning speed.
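The core simplification of the LS-SVM, that training becomes one linear solve instead of a QP, can be sketched as follows. This is a minimal plain LS-SVM regression (not the thesis's adaptive iterative variant), assuming an RBF kernel and the standard (n+1)-dimensional linear system:

```python
import numpy as np

def rbf(x1, x2, sigma=1.0):
    """Gaussian RBF kernel matrix between two 1-D sample arrays."""
    return np.exp(-np.subtract.outer(x1, x2) ** 2 / (2 * sigma ** 2))

def lssvm_fit(x, y, gamma=100.0, sigma=1.0):
    """LS-SVM regression: equality constraints turn training into a
    single (n+1)x(n+1) linear system  [0 1^T; 1 K+I/gamma][b; a] = [0; y].
    Note every alpha is generally nonzero -- the lost sparseness."""
    n = len(x)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf(x, x, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[1:], sol[0]  # alpha, bias b

def lssvm_predict(xq, x, alpha, b, sigma=1.0):
    """f(x) = sum_i alpha_i k(x, x_i) + b."""
    return rbf(xq, x, sigma) @ alpha + b
```

The adaptive iterative algorithm of the thesis would grow and prune this system incrementally rather than solving it once over all samples.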
     Swarm intelligence is another important research area of intelligent computing, originating in the study of artificial life. The concept of swarm intelligence was first proposed by Beni, Hackwood, and Wang in the context of molecular automata systems, but it developed rapidly only after E. Bonabeau and M. Dorigo popularized it. Without centralized control and without a global model, swarm intelligence provides a basis for finding solutions to complex distributed problems.
     The ant colony algorithm is a swarm intelligence algorithm inspired by the foraging behavior of real ant colonies. Its biological basis is that ants deposit pheromones along the paths they travel, and these pheromones gradually evaporate over time. While moving, ants can sense the presence and concentration of this substance and tend to move toward areas of high pheromone concentration; that is, the probability of selecting a path is proportional to its pheromone concentration. The higher the pheromone concentration on a path, the more ants take it, and the more pheromone is in turn deposited on it, forming a positive feedback mechanism. Through this mechanism the colony eventually finds the best path, which most of the ants then follow.
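The probabilistic path-selection rule described above can be sketched as roulette-wheel selection over pheromone-weighted options. This is a generic illustration of the basic ant-colony transition rule (the factors alpha and beta reappear in the improvements below), not the thesis's exact implementation:

```python
import random

def choose_path(pheromone, heuristic, alpha=1.0, beta=2.0, rng=random):
    """Pick the index of the next node with probability proportional
    to tau_i^alpha * eta_i^beta, i.e. roulette-wheel selection over
    pheromone strength tau and heuristic desirability eta."""
    weights = [(t ** alpha) * (h ** beta) for t, h in zip(pheromone, heuristic)]
    total = sum(weights)
    r = rng.uniform(0.0, total)
    acc = 0.0
    for i, w in enumerate(weights):
        acc += w
        if r <= acc:
            return i
    return len(weights) - 1  # numerical safety fallback
```

A path with zero pheromone is never chosen, while a path with dominant pheromone is chosen almost always, which is exactly the positive feedback the paragraph describes.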
     The basic ant colony algorithm, however, has a significant deficiency: its running time is very long. This is due partly to the computational complexity of the algorithm, and partly to the fact that in the initial stage the pheromone levels on different paths differ little, so the search is highly random; only after several iterations does the pheromone advantage of better paths emerge, and the early iterations therefore consume a great deal of running time. In addition, the positive feedback mechanism can drive the pheromone concentration on one path so high that all ants concentrate on it, making the algorithm converge prematurely to a local optimum. This thesis improves the basic ant colony algorithm in several ways. First, the colony size is adjusted dynamically: a larger colony in the early stage helps ensure the diversity of initial feasible solutions and avoids premature convergence, while in the late stage, as the search closes in on the optimum, too large a colony wastes time, so dynamically shrinking the colony improves the algorithm's efficiency. Second, a candidate set is established for each node, containing a number of its nearest neighboring nodes from which the next node is chosen; this reduces computation and improves efficiency. Third, a rank-based pheromone update strategy is adopted: only a subset of the ants with the best solutions update the pheromone on their paths, which reduces interference from poor solutions and speeds up the search. Meanwhile, to prevent the pheromone from growing excessively or evaporating toward zero, maximum and minimum thresholds are imposed on the pheromone of each path. Fourth, a random transition probability is introduced to balance the exploitation of known good search paths against the exploration of new ones. Fifth, the individual differences within real ant colonies are simulated, so that individual ants have different path-selection abilities. Sixth, the pheromone heuristic factor α and the expectation heuristic factor β are adjusted adaptively: as the search progresses, their values are gradually increased to strengthen the positive feedback. Finally, the crossover and mutation operators of the genetic algorithm are incorporated to improve the algorithm's global search capability.
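The third improvement, rank-based updating with pheromone bounds, can be sketched as follows. This is a generic illustration combining a rank-based deposit with max-min clamping, under assumed parameter names (rho, w, tau_min, tau_max); the thesis's exact update rule may differ:

```python
def rank_based_update(tau, ranked_solutions, rho=0.1, w=3,
                      tau_min=0.01, tau_max=10.0):
    """Evaporate all pheromone by factor rho, then let only the w
    best-ranked ants deposit, weighted by rank and inversely by tour
    length; finally clamp every trail into [tau_min, tau_max] so no
    path dominates and no path dies out.

    tau: dict mapping edge -> pheromone level.
    ranked_solutions: list of (edges, tour_length), best first."""
    tau = {e: (1.0 - rho) * t for e, t in tau.items()}
    for rank, (edges, length) in enumerate(ranked_solutions[:w]):
        deposit = (w - rank) / length  # better rank => larger deposit
        for e in edges:
            tau[e] = tau.get(e, 0.0) + deposit
    return {e: min(tau_max, max(tau_min, t)) for e, t in tau.items()}
```

Clamping to a minimum keeps every path selectable, directly countering the premature-convergence failure mode of the basic algorithm.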
     The improved ant colony algorithm is applied to a combinatorial optimization problem, the one-dimensional cutting stock problem. The algorithm's parameters are tuned through experiments, and the simulation results verify that the improved algorithm performs well in solving the one-dimensional cutting stock problem.
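To make the problem concrete: one-dimensional cutting stock asks how to cut ordered piece lengths from stock bars of fixed length using as few bars as possible. A simple first-fit-decreasing baseline (not the thesis's ant colony method) can be sketched as:

```python
def first_fit_decreasing(demands, stock_length):
    """Baseline heuristic for one-dimensional cutting stock: sort the
    required pieces longest-first, then cut each from the first opened
    stock bar with enough remaining length, opening a new bar if none
    fits. Returns the cutting pattern of each bar used."""
    remaining = []  # leftover length of each opened bar
    patterns = []   # pieces cut from each bar
    for piece in sorted(demands, reverse=True):
        for i, rem in enumerate(remaining):
            if piece <= rem:
                remaining[i] -= piece
                patterns[i].append(piece)
                break
        else:
            remaining.append(stock_length - piece)
            patterns.append([piece])
    return patterns
```

An ant colony approach searches over orderings and assignments like these, using pheromone to favor piece combinations that pack bars tightly, and typically beats such greedy baselines on hard instances.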
     Content-based music retrieval has become a research hotspot in recent years. Raw audio data is a non-semantic, unstructured binary stream: apart from limited registration information such as sampling frequency, quantization precision, and encoding, it lacks any semantic description of its content or structural organization, which greatly complicates deep processing and analysis of audio data. This thesis applies intelligent computing to the problem and achieves good results. It proposes a short-time-frame vocal detection method based on unsupervised clustering and SVM classification, using the clustering results to select training samples for the SVM, thereby reducing the training set size and improving learning speed. The proposed adaptive iterative LS-SVM regression algorithm is also used to realize regression-based classification. This combined vocal detection algorithm is applied to the extraction of vocal segments with good results. Fuzzy clustering is used to extract the cluster centroids of audio clips as music features, realizing query by humming. The introduction of these intelligent methods improves the performance of the prototype system.
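The two time-domain features used throughout for vocal-segment detection, short-time energy and short-time zero-crossing rate, can be sketched per frame as follows. Frame length and hop size are illustrative assumptions, not values from the thesis:

```python
import numpy as np

def short_time_features(signal, frame_len=256, hop=128):
    """Compute per-frame short-time energy and zero-crossing rate,
    the two time-domain features used to locate vocal segments:
    voiced singing tends to show high energy and moderate ZCR,
    while unvoiced/noisy regions show low energy and high ZCR."""
    energies, zcrs = [], []
    for start in range(0, len(signal) - frame_len + 1, hop):
        frame = signal[start:start + frame_len]
        energies.append(float(np.sum(frame ** 2)))
        # fraction of adjacent-sample pairs whose sign changes
        zcrs.append(float(np.mean(np.abs(np.diff(np.sign(frame))) > 0)))
    return np.array(energies), np.array(zcrs)
```

A detector would then threshold or classify these per-frame feature pairs (in the thesis, via clustering plus SVM classification) to mark which frames belong to vocal segments.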
