Research on Least Squares Support Vector Machine Algorithms and Their Applications
Abstract

This dissertation takes the least squares support vector machine (LS-SVM) as its research object and studies its learning algorithms and applications. The specific contributions are as follows. (1) Since hyper-parameter selection strongly affects the performance of support vector machines, a method for selecting LS-SVM hyper-parameters based on particle swarm optimization is proposed, and its effectiveness is verified experimentally. (2) In spectral analysis, different concentrations yield different absorbances at a given wavelength. To select the wave bands that best discriminate among concentrations, a spectral energy index is defined and an algorithm for determining the optimal band is proposed. Combined with this band selection algorithm, a model for determining the concentration of a substance in a mixture is built; experimental results show that the proposed method measures concentration with high accuracy. (3) For linear Volterra integral equations of the second kind, which arise widely in scientific and engineering problems, a numerical solution method based on LS-SVMs and the trapezoid quadrature rule is proposed. (4) For electric power load forecasting, after analyzing the regularities of power load, a weather-condition similarity degree is defined so that a dynamic sample set can be formed; the dynamic sample set is then processed with principal component analysis, and an LS-SVM load forecasting model is built. Prediction results on real cases show that the proposed model significantly outperforms neural network based load forecasting models.
The Support Vector Machine (SVM) is a powerful machine learning method built on the framework of Statistical Learning Theory (SLT) and designed specifically for learning from a limited number of samples. As a general learning machine, training an SVM is essentially equivalent to solving a convex quadratic programming problem. Because the SVM is based on the structural risk minimization (SRM) principle, it controls the empirical risk and the complexity of the learning machine simultaneously, so it effectively avoids over-fitting and obtains better generalization performance than traditional learning methods based on empirical risk minimization (ERM). Least squares support vector machines (LS-SVMs), introduced by Suykens et al. as a reformulation of the standard SVM, greatly simplify training by replacing the inequality constraints with equality constraints. This reduces the computational complexity and accelerates training, which has considerably widened the range of practical SVM applications. Statistical learning theory and support vector machines are now regarded as the best available learning theory for limited samples; they receive growing attention and form an active research topic in machine learning and artificial intelligence. This dissertation focuses on the study of LS-SVM algorithms and their applications. The main contributions are as follows:
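For concreteness, the training problem behind this reformulation can be stated as follows; this is the standard LS-SVM classifier formulation of Suykens et al., and the dissertation's own notation may differ in detail:

```latex
\min_{w,\,b,\,e}\; J(w,e) \;=\; \tfrac{1}{2}\, w^{\top} w \;+\; \tfrac{\gamma}{2} \sum_{i=1}^{N} e_i^{2}
\qquad \text{s.t.} \qquad y_i \left( w^{\top} \varphi(x_i) + b \right) \;=\; 1 - e_i, \quad i = 1, \dots, N.
```

Eliminating w and e from the Karush-Kuhn-Tucker conditions leaves a single linear system in the bias b and the Lagrange multipliers alpha,

```latex
\begin{bmatrix} 0 & y^{\top} \\ y & \Omega + \gamma^{-1} I \end{bmatrix}
\begin{bmatrix} b \\ \alpha \end{bmatrix}
=
\begin{bmatrix} 0 \\ \mathbf{1}_{N} \end{bmatrix},
\qquad \Omega_{ij} = y_i\, y_j\, K(x_i, x_j),
```

so training an LS-SVM means solving one linear system instead of a quadratic program, which is exactly the computational simplification described above.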
1. PSO-based hyper-parameter selection for LS-SVMs. The SVM is a kernel-based learning algorithm, so the problem of model selection cannot be bypassed in applications. The first question is how to select the kernel function: in theory any function that meets Mercer's condition can serve as a kernel, but the performance of the SVM differs from kernel to kernel. Even after a type of kernel has been chosen, its parameters (such as the order of a polynomial kernel or the width of a radial basis kernel) still need to be selected and optimized. The parameter γ of the regularization term and the kernel parameters are often called the hyper-parameters of the SVM, and they play an important role in the algorithm's performance. Existing techniques for adjusting them can be summarized into two kinds: analytical techniques and heuristic searches. Analytical techniques determine the hyper-parameters from the gradients of some generalized error measure; such iterative gradient-based algorithms rely on smoothed approximations of a function, so the search direction is not guaranteed to point toward an optimum of the generalization performance measure, which is often discontinuous. Heuristic techniques determine the hyper-parameters with modern heuristic algorithms, including genetic algorithms (GAs), simulated annealing, and other evolutionary strategies (EAs). Grid search is the conventional approach to such discontinuous problems, but it requires an exhaustive and therefore time-consuming search over the hyper-parameter space: an interval of feasible solutions and a suitable sampling step must be located, and with more than two hyper-parameters manual model selection may become intractable.

Particle swarm optimization (PSO), developed by Eberhart and Kennedy in 1995, is a stochastic global optimization technique inspired by the social behavior of bird flocking. Like GAs and EAs, PSO is a population-based optimization tool that searches for optima by updating generations; unlike them, it needs no evolutionary operators such as crossover and mutation. Compared with GAs and EAs, PSO can escape from local optima, is easy to implement, and has fewer parameters to adjust. It has been applied successfully to function optimization, artificial neural network training, fuzzy system control, and other problems, and has been found robust and fast on non-linear, non-differentiable, multi-modal problems. This dissertation presents a novel PSO-based hyper-parameter selection method for LS-SVM classifiers. The proposed method does not require the generalization performance measure to have any analytic properties and can determine multiple hyper-parameters at the same time. Its feasibility is evaluated on benchmark data sets by optimizing the hyper-parameters of LS-SVMs with linear, polynomial, radial basis (RBF), and scaling radial basis (SRBF) kernel functions. To reflect the differing importance of the input components for classification, the SRBF kernel assigns a separate scaling factor to each input component.
Experimental results show that the SRBF kernel yields the best test performance, with the polynomial and RBF kernels close behind. Compared with the results of other methods, the proposed PSO-based hyper-parameter selection for LS-SVMs yields a higher accuracy rate on all data sets tested in this dissertation, so the proposed method is efficient.
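The following is a minimal sketch of this procedure, assuming an RBF-kernel LS-SVM classifier, two hyper-parameters (γ, σ) searched on a log scale, and k-fold cross-validation accuracy as the PSO fitness. It is written from the standard LS-SVM and PSO formulations rather than taken from the dissertation; all function names, the search range, and the swarm coefficients are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X1, X2, sigma):
    """K[i, j] = exp(-||x_i - x_j||^2 / (2 sigma^2))."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_train(X, y, gamma, sigma):
    """Solve the LS-SVM KKT linear system for (b, alpha); labels y in {-1, +1}."""
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = np.outer(y, y) * rbf_kernel(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], np.ones(n))))
    return sol[0], sol[1:]                        # bias b, multipliers alpha

def lssvm_predict(X_tr, y_tr, b, alpha, X_te, sigma):
    return np.sign(rbf_kernel(X_te, X_tr, sigma) @ (alpha * y_tr) + b)

def cv_accuracy(theta, X, y, k=5):
    """PSO fitness: k-fold CV accuracy; theta = (log10 gamma, log10 sigma)."""
    gamma, sigma = 10.0 ** theta
    folds = np.array_split(np.random.default_rng(0).permutation(len(y)), k)
    acc = []
    for te in folds:
        tr = np.setdiff1d(np.arange(len(y)), te)
        b, alpha = lssvm_train(X[tr], y[tr], gamma, sigma)
        acc.append((lssvm_predict(X[tr], y[tr], b, alpha, X[te], sigma)
                    == y[te]).mean())
    return float(np.mean(acc))

def pso_select(X, y, n_particles=20, iters=50, lo=-3.0, hi=3.0):
    """Global-best PSO over the 2-D log hyper-parameter box [lo, hi]^2."""
    rng = np.random.default_rng(0)
    pos = rng.uniform(lo, hi, (n_particles, 2))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_f = np.array([cv_accuracy(p, X, y) for p in pos])
    gbest = pbest[pbest_f.argmax()].copy()
    w, c1, c2 = 0.72, 1.49, 1.49                  # common inertia/acceleration values
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, 2))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        f = np.array([cv_accuracy(p, X, y) for p in pos])
        better = f > pbest_f
        pbest[better], pbest_f[better] = pos[better], f[better]
        gbest = pbest[pbest_f.argmax()].copy()
    return 10.0 ** gbest                          # (gamma, sigma)
```

Because the fitness is a plain function evaluation, no smoothness of the cross-validation accuracy is required, which is the point made above about discontinuous performance measures.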
2. Concentration analysis based on near infrared spectroscopy (NIRS) and LS-SVMs. Near infrared spectroscopy is a rapid, non-destructive analysis technique that needs no sample preparation, so it is widely used in many application fields. Quantitative use of spectroscopy requires selecting the optimal calibration wavelengths together with a strong and stable regression method; once the regression model is constructed, the concentration of a substance in a mixture can be predicted from its NIRS data. In this dissertation we analyze the NIRS data of water-ethanol binary mixtures at different concentrations and find that the spectra are dispersed in the regions 1400-1570 nm and 1750-1850 nm, which shows that these regions contain the most information about the alcohol concentration. Based on this analysis, a spectral energy index is defined to determine the wavelength region (wave band) used to build the regression model, and an algorithm to select the optimal wave band is proposed. Finally, an LS-SVM regression model built on the chosen optimal band determines the alcohol concentration in the mixture. Experiments on concentration prediction for water-ethanol mixtures show that the sample construction approach is effective and that the LS-SVM outperforms conventional artificial neural networks (ANN) and partial least squares regression (PLSR). The proposed method can also be used to predict concentrations in more complex mixtures.
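Since the abstract does not reproduce the definition of the spectral energy index, the sketch below substitutes a plausible stand-in (the between-sample variance of absorbance at each wavelength, which is large exactly where spectra of different concentrations spread apart) purely to illustrate the sliding-window band-selection workflow. Every name and the window width are hypothetical.

```python
import numpy as np

def spectral_energy(absorbance):
    """Hypothetical stand-in for the dissertation's spectral energy index:
    the variance of absorbance across samples at each wavelength."""
    return absorbance.var(axis=0)               # shape: (n_wavelengths,)

def best_band(wavelengths, absorbance, width=50):
    """Slide a fixed-width window over the spectrum and return the band
    whose mean energy index is largest."""
    energy = spectral_energy(absorbance)
    n = len(wavelengths)
    scores = [energy[i:i + width].mean() for i in range(n - width + 1)]
    i0 = int(np.argmax(scores))
    return wavelengths[i0], wavelengths[i0 + width - 1], slice(i0, i0 + width)

# Usage: absorbance is an (n_samples, n_wavelengths) array measured at the
# points in `wavelengths` (nm); the returned slice selects the columns that
# feed the LS-SVM regression model.
```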
3. LS-SVM based numerical solution of second kind integral equations. Integral equations are mathematical formulations describing the physical laws of an object under study, and the mathematical models of many scientific and engineering problems reduce to integral equations. Ordinary differential equations usually require additional initial or boundary conditions, whereas an integral equation itself contains the initial value or boundary information. In particular, numerical approaches to integral equations are often easier and more direct than those for differential equations, and all solution methods build on the basic theory of integral equations. Motivated by the powerful regression ability of SVMs, we propose a hybrid approach based on LS-SVMs and the trapezoid quadrature rule for solving linear Volterra integral equations of the second kind. The unknown function f(x) is approximated with an LS-SVM, and this approximation of f(x) is used step by step in the subsequent numerical solution. Comparison with analytic solutions shows that the proposed algorithm reaches very high accuracy: the maximum absolute errors stay within the order of 10^-6. The method is also compared with the repeated modified trapezoid quadrature method, which itself shows very good accuracy on linear integral equations, and slightly outperforms it. We therefore conclude that the proposed method is feasible for the numerical solution of linear Volterra integral equations.
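The quadrature backbone of such a hybrid can be illustrated as follows. This sketch implements only the classical step-by-step trapezoid scheme for f(x) = g(x) + ∫_a^x K(x, t) f(t) dt; the dissertation's contribution additionally fits an LS-SVM to the computed nodal values to obtain a smooth approximation of f, a step omitted here. Names and the test equation are illustrative.

```python
import numpy as np

def volterra2_trapezoid(g, K, a, b, n):
    """Trapezoid-rule solution of the second kind linear Volterra equation
        f(x) = g(x) + integral_a^x K(x, t) f(t) dt
    on a uniform grid; returns grid points x_i and nodal values f_i."""
    x = np.linspace(a, b, n + 1)
    h = (b - a) / n
    f = np.empty(n + 1)
    f[0] = g(x[0])                    # the integral term vanishes at x = a
    for i in range(1, n + 1):
        s = 0.5 * K(x[i], x[0]) * f[0] + sum(
            K(x[i], x[j]) * f[j] for j in range(1, i))
        # the trapezoid rule puts weight h/2 on the endpoint t = x_i, where
        # f is still unknown; solve the resulting scalar equation for f[i]
        f[i] = (g(x[i]) + h * s) / (1.0 - 0.5 * h * K(x[i], x[i]))
    return x, f

# Test equation with known solution f(x) = exp(x):
#   f(x) = 1 + integral_0^x f(t) dt   =>   g(x) = 1, K(x, t) = 1
x, f = volterra2_trapezoid(lambda x: 1.0, lambda x, t: 1.0, 0.0, 1.0, 200)
print(np.max(np.abs(f - np.exp(x))))  # small O(h^2) discretization error
```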
4. LS-SVM based short-term electric power load forecasting. Electric power load forecasting is the basis of decision making in power network planning and a precondition of the electric power market. Forecasting precision directly determines whether safe, high quality power can be provided to customers and whether the power system can run economically, so improving it is important for the economic benefit of power departments. Analysis of the load patterns shows that the meteorological factors of a day (maximum, minimum and average temperature, humidity, weather type, etc.) have an important influence on power load. Because days with similar weather conditions also have relatively similar load values, a weather-status similarity degree is defined, and a dynamic training sample set is composed of those historical records whose weather conditions are similar to the forecasting day, selected from the whole historical load data. This saves model training time and avoids disturbance from irrelevant samples. Load forecasting is a multi-feature, large-scale problem, and day-load curves under similar weather conditions are similar, so principal component analysis is applied to the sample set to reduce the input features. Finally, an LS-SVM model is built to forecast the load. Prediction results on real examples show that the forecasting precision of the LS-SVM model exceeds that of an ANN.
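A minimal sketch of the dynamic-sample-set construction and the PCA step follows, assuming a hypothetical inverse-distance form for the weather-status similarity degree (the abstract does not give the dissertation's definition); all names and the neighbour count are illustrative.

```python
import numpy as np

def weather_similarity(d1, d2, w=None):
    """Hypothetical similarity between two days, each described by a feature
    vector (max temp, min temp, mean temp, humidity, weather type code, ...):
    a weighted inverse Euclidean distance, so identical weather scores 1."""
    d1, d2 = np.asarray(d1, float), np.asarray(d2, float)
    w = np.ones_like(d1) if w is None else np.asarray(w, float)
    return 1.0 / (1.0 + np.sqrt(np.sum(w * (d1 - d2) ** 2)))

def dynamic_sample_set(history_weather, history_load, target_weather, k=30):
    """Keep only the k historical days most similar to the forecasting day."""
    sims = np.array([weather_similarity(d, target_weather)
                     for d in history_weather])
    top = np.argsort(sims)[-k:]
    return history_weather[top], history_load[top]

def pca_reduce(X, n_components):
    """Plain PCA via the SVD of the centered sample matrix."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T              # projected features

# Usage: history_weather is an (n_days, n_features) array and history_load an
# (n_days, 24) array of hourly loads; the PCA-reduced features of the dynamic
# sample set then train an LS-SVM regression model whose output is the
# forecast load.
```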
The SVM has a solid theoretical foundation and good generalization ability, so it will be widely used. This dissertation studies improvements to the SVM and its applications; this work promotes the theoretical study of the algorithm and expands its application in the field of pattern recognition.
