Research on Online Modeling and Control Methods Based on Least Squares Support Vector Machines
Abstract
This dissertation studies several algorithms for least squares support vector machines (LSSVM) and their application to online system modeling and control.
     The main contents are outlined as follows:
     The existing LSSVM incremental and decremental learning algorithms based on the partitioned-matrix inversion formula are analyzed, and several acceleration strategies are designed for them, yielding a fast online LSSVM learning algorithm. A numerical simulation of online modeling of a multiple-input multiple-output (MIMO) system with the fast online LSSVM shows that these strategies markedly improve the execution speed of the existing online learning algorithms, while requiring only a few additional buffer arrays of storage.
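The incremental and decremental updates described above can be illustrated with a short NumPy sketch of the bordered-inverse bookkeeping. This is a minimal example under assumed choices (a Gaussian kernel, illustrative hyperparameters, class and method names), not the dissertation's accelerated implementation with buffer arrays.

```python
import numpy as np

def rbf(x1, x2, sigma=1.0):
    """Gaussian RBF kernel between two sample vectors."""
    return np.exp(-np.sum((x1 - x2) ** 2) / (2.0 * sigma ** 2))

class IncrementalLSSVM:
    """Online LSSVM regression kept up to date with the partitioned-matrix
    inversion formula: adding (or removing) one sample updates the stored
    inverse in O(n^2) instead of refactorising the system in O(n^3)."""

    def __init__(self, gamma=100.0, sigma=1.0):
        self.gamma, self.sigma = gamma, sigma
        self.X, self.y = [], []
        self.Hinv = None          # inverse of the (bias-bordered) LSSVM system matrix

    def _solve(self):
        # coefficients [b, alpha] = Hinv @ [0, y]
        rhs = np.concatenate(([0.0], self.y))
        sol = self.Hinv @ rhs
        return sol[0], sol[1:]

    def add_sample(self, x, t):
        x = np.asarray(x, dtype=float)
        if self.Hinv is None:                       # first sample: build the 2x2 system directly
            h = rbf(x, x, self.sigma) + 1.0 / self.gamma
            H = np.array([[0.0, 1.0], [1.0, h]])
            self.Hinv = np.linalg.inv(H)
        else:                                       # border the system by one row/column
            v = np.array([1.0] + [rbf(x, xi, self.sigma) for xi in self.X])
            c = rbf(x, x, self.sigma) + 1.0 / self.gamma
            Hinv_v = self.Hinv @ v
            s = c - v @ Hinv_v                      # Schur complement (scalar)
            top_left = self.Hinv + np.outer(Hinv_v, Hinv_v) / s
            self.Hinv = np.block([[top_left, -Hinv_v[:, None] / s],
                                  [-Hinv_v[None, :] / s, np.array([[1.0 / s]])]])
        self.X.append(x)
        self.y.append(float(t))

    def remove_last_sample(self):
        """Decremental step: drop the newest sample by deflating the inverse
        (assumes at least one sample remains afterwards)."""
        B, d, e = self.Hinv[:-1, :-1], self.Hinv[:-1, -1], self.Hinv[-1, -1]
        self.Hinv = B - np.outer(d, d) / e
        self.X.pop()
        self.y.pop()

    def predict(self, x):
        b, alpha = self._solve()
        return b + sum(a * rbf(np.asarray(x, float), xi, self.sigma)
                       for a, xi in zip(alpha, self.X))
```

Adding a sample costs one matrix-vector product and a rank-one update, i.e. O(n^2); removing the newest sample is a single rank-one deflation.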
     To reduce the computational cost of the basic pruning algorithm (BPA) for LSSVM, a fast pruning algorithm (FPA) is proposed. The relationship between the coefficient matrices of the linear systems corresponding to the LSSVM before and after pruning is analyzed. Using the fact that an elementary matrix obtained by swapping two rows (or columns) of the identity is its own inverse, together with the partitioned-matrix inversion formula, a recursive relation between the inverses of sub-matrices of the two coefficient matrices is derived, so repeated inversion of high-order matrices during pruning is avoided and the computation is reduced. When recursive round-off error is neglected, the FPA yields, in theory, the same sparse LSSVM as the BPA. Simulation results show that the FPA is faster than the BPA, and the speedup grows with the number of initial training samples.
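The permute-and-deflate step at the heart of this pruning idea can be sketched as follows. It assumes, as in the previous sketch, that the inverse of the full (n+1)-by-(n+1) LSSVM system matrix (bias row first) is stored; the function name and interface are illustrative.

```python
import numpy as np

def prune_sample(Hinv, alpha_index):
    """Remove one training sample from an LSSVM whose system-matrix inverse
    Hinv is already known, without re-inverting the full matrix.

    The sample's row/column is first moved to the last position by a symmetric
    transposition (such a permutation matrix is its own inverse, so the
    reordered Hinv is the inverse of the reordered system matrix), then the
    last row/column is deflated with the partitioned-inverse identity
    H_reduced^{-1} = B - d d^T / e."""
    n = Hinv.shape[0]
    i = alpha_index + 1                       # +1 skips the bias row/column
    perm = np.arange(n)
    perm[[i, n - 1]] = perm[[n - 1, i]]       # swap index i with the last index
    P = Hinv[np.ix_(perm, perm)]              # O(n^2) re-indexing, no inversion
    B, d, e = P[:-1, :-1], P[:-1, -1], P[-1, -1]
    return B - np.outer(d, d) / e             # inverse of the pruned system matrix
```

Repeatedly selecting the support value with the smallest magnitude and deflating the stored inverse this way keeps each pruning step at O(n^2) instead of a fresh O(n^3) inversion.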
     To reduce the computation time and storage of the online LSSVM with a time window, an online sparse LSSVM with a time window is proposed. It takes only the samples at some of the moments within the sliding time window as the training sample set (TSS). The newest sample is always learned; when a sample must be eliminated, the sample at the oldest moment of the sliding window is removed if it is in the TSS; otherwise the TSS sample with the smallest leave-one-out prediction error is selected and deleted. Compared with the existing online LSSVM, the proposed online sparse LSSVM learns more of the system's characteristics from fewer samples and improves time and space efficiency; compared with the existing online sparse LSSVM, it discards obsolete samples and adapts better to time-varying system properties. Simulation results for system modeling show that it saves time and space while providing accurate predictions.
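The deletion policy can be sketched as a small selection function. It assumes access to the stored system-matrix inverse and support values of the current model, and it uses the standard LSSVM leave-one-out identity e_i = alpha_i / [H^{-1}]_{ii} as the "prediction error if sample i were left out"; the names and the by-value membership test are illustrative.

```python
import numpy as np

def choose_sample_to_delete(window_oldest_x, train_X, Hinv, alpha):
    """Deletion policy of the time-window sparse LSSVM described above:
    if the oldest sample of the sliding window is still in the training set,
    drop it; otherwise drop the training sample whose leave-one-out error
    |alpha_i| / |[H^{-1}]_{ii}| is smallest, i.e. the sample the current
    model can already reconstruct best. Returns the index to delete."""
    for i, x in enumerate(train_X):
        if np.array_equal(x, window_oldest_x):
            return i                            # oldest window sample is still a training sample
    diag = np.diag(Hinv)[1:]                    # skip the bias entry of the system matrix
    loo_errors = np.abs(alpha) / np.abs(diag)
    return int(np.argmin(loo_errors))
```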
     For the non-bias LSSVM (NB-LSSVM), a computing procedure for deleting the least important sample (or an arbitrary one) is given, and a sparse online non-bias LSSVM (SONB-LSSVM) is designed. This sample-deletion technique improves the diversity and representativeness of the training sample set. Compared with the online non-bias LSSVM (ONB-LSSVM), the SONB-LSSVM can learn the system's behavior over a longer time horizon, and its generalization ability is less affected by the input signal frequency when it is used for online modeling of dynamic systems.
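For reference, the non-bias formulation itself is compact: without the bias term the dual collapses to a single positive-definite linear system, which is what makes arbitrary-sample deletion straightforward. The sketch below is a plain batch NB-LSSVM with an assumed Gaussian kernel and illustrative defaults, not the SONB-LSSVM code itself.

```python
import numpy as np

def rbf_kernel(X, Z, sigma=1.0):
    """Pairwise Gaussian kernel matrix between row-sample arrays X and Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def train_nb_lssvm(X, y, gamma=100.0, sigma=1.0):
    """Non-bias LSSVM: with no bias term the dual reduces to the single
    linear system (K + I/gamma) alpha = y, so every online update acts on
    one positive-definite kernel matrix instead of a bordered saddle system."""
    K = rbf_kernel(X, X, sigma) + np.eye(len(X)) / gamma
    return np.linalg.solve(K, y)

def predict_nb_lssvm(Xtrain, alpha, Xnew, sigma=1.0):
    """Kernel expansion without a bias term: y(x) = sum_i alpha_i K(x, x_i)."""
    return rbf_kernel(Xnew, Xtrain, sigma) @ alpha
```

Deleting the least important (or any) sample then reuses the symmetric-permutation-plus-deflation update shown earlier, applied directly to (K + I/gamma)^{-1}, since there is no bias row to skip.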
     To address the problem that the prediction accuracy of an LSSVM applied directly to online modeling of a dynamic process is easily degraded by gross errors and noise in the measured process output, a robust online process modeling method based on the NB-LSSVM is presented after analyzing the structure of the sample sequence and the way noise enters it. In each prediction period, abnormal measurements are identified and discarded, and noisy measurements are detected and corrected, according to the relation between the prediction error and preset thresholds; less noise therefore enters the samples, and the resulting online LSSVM tracks the process dynamics better. The modeling method is robust: it reduces the effect of gross errors and Gaussian white noise on the LSSVM's prediction accuracy and thereby improves that accuracy. Numerical simulations show the validity and advantages of the method.
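The per-period screening rule can be sketched as follows. The two thresholds and the simple shrink-toward-the-prediction correction are illustrative assumptions; the dissertation's rules for identifying and correcting measurements may differ in detail.

```python
def screen_measurement(y_meas, y_pred, outlier_threshold, noise_threshold):
    """Threshold-based measurement screening in the spirit of the robust
    online modeling scheme above.

    Returns the value that should enter the training sample set, or None if
    the measurement is treated as a gross error and discarded."""
    err = abs(y_meas - y_pred)
    if err > outlier_threshold:          # gross error: reject the measurement
        return None
    if err > noise_threshold:            # noisy but plausible: pull it toward the prediction
        return 0.5 * (y_meas + y_pred)
    return y_meas                        # clean enough: learn it as-is
```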
     To tackle the difficulty of setting the kernel parameter, and of adapting it to a changing process, when an LSSVM is used to identify a time-varying nonlinear process online, an online process identification approach based on an LSSVM whose kernel parameter is adjusted over successive periods is proposed. Three LSSVMs are used, and the whole modeling and prediction horizon is divided into a start-up stage and a number of working periods. At the end of the start-up stage and of each working period, the LSSVM with the smallest accumulated prediction error is selected as the working LSSVM for the next period, while the kernel parameters of the other two LSSVMs are reset according to heuristic rules and they serve as comparison LSSVMs during the following period. The method makes the kernel parameters relatively easy to set and gives them a degree of self-adjustment. Numerical simulations show that, statistically, the method adapts better than the traditional approach.
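The end-of-period bookkeeping of the three-LSSVM scheme might look like the sketch below. The multiplicative re-seeding of the two comparison kernel widths stands in for the heuristic rules mentioned above and is an assumption, as are the object attributes.

```python
def reschedule_kernels(models, accumulated_errors, shrink=0.5, grow=2.0):
    """End-of-period switch for a three-LSSVM kernel-scheduling scheme.
    `models` is a list of three model objects exposing a mutable `sigma`
    (kernel width); `accumulated_errors` holds each model's summed prediction
    error over the period just finished.

    The model with the smallest accumulated error keeps its kernel width and
    becomes the working model; the other two are re-seeded around it so the
    next period again compares a narrower and a wider kernel."""
    best = min(range(3), key=lambda i: accumulated_errors[i])
    sigma_star = models[best].sigma
    others = [i for i in range(3) if i != best]
    models[others[0]].sigma = shrink * sigma_star
    models[others[1]].sigma = grow * sigma_star
    return best            # index of the working LSSVM for the next period
```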
     For the predictive control of nonlinear systems, a constrained single-step-ahead predictive control algorithm based on the SONB-LSSVM is proposed. The controller uses the SONB-LSSVM for online modeling and prediction of the plant output, and the rolling-optimization step searches for the control input with particle swarm optimization (PSO) or Brent optimization. During the start-up stage, the control values are computed by a proportional-integral-derivative (PID) controller while plant input-output data are collected to form samples and to train the initial NB-LSSVM incrementally, which makes the controller convenient to deploy. Because the SONB-LSSVM learns new process dynamics promptly, the predictive control strategy adapts well. Simulations of liquid-level control, continuous stirred tank reactor (CSTR) concentration control, and water-bath temperature control show the validity of the algorithm.
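A minimal sketch of the rolling-optimization step is given below, using SciPy's bounded scalar minimizer in place of the PSO/Brent search described above. The regression-vector layout, the cost weight, and the `model.predict` interface are assumptions made for the example.

```python
from scipy.optimize import minimize_scalar

def one_step_predictive_control(model, y_ref, y_hist, u_hist, u_bounds, lam=0.1):
    """Constrained one-step-ahead predictive control: choose the next input u
    that minimises the squared tracking error of the model's one-step
    prediction plus a control-move penalty, subject to the input bounds.

    Assumes model.predict maps a regression vector
    [y(k), y(k-1), u(k), u(k-1)] to y(k+1), and that y_hist/u_hist hold at
    least the last two outputs and the last input."""
    def cost(u):
        x = [y_hist[-1], y_hist[-2], u, u_hist[-1]]
        y_next = model.predict(x)
        return (y_ref - y_next) ** 2 + lam * (u - u_hist[-1]) ** 2

    # bounded scalar search (in the spirit of the Brent optimization above)
    res = minimize_scalar(cost, bounds=u_bounds, method="bounded")
    return res.x
```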
     The applications of the SONB-LSSVM and the ONB-LSSVM to nonlinear inverse control are investigated. An adaptive direct inverse control method based on the SONB-LSSVM is proposed: the SONB-LSSVM builds the inverse model of the controlled plant online and computes the control value in every control period. This approach suits invertible plants that are time-invariant or whose parameters (or characteristics) vary only slightly. For invertible plants whose parameters (or characteristics) vary over a wide range during operation, a composite adaptive control combining an ONB-LSSVM inverse model with PID control is designed. In each control period the control signal is compounded from the inverse control value and the PID control value, and their mixing ratio is adjusted automatically: the inverse control dominates in the high-frequency range and the PID control in the low-frequency range. The composite control therefore has a wide frequency range of adaptation and good performance on time-varying systems. In both methods the initial NB-LSSVM is trained online, which makes them convenient to use. Numerical simulations indicate the feasibility and validity of both control methods.
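A sketch of the compounding step is shown below. The specific adaptation rule for the mixing weight (reducing the inverse-control share when the online inverse model's recent prediction error is large) is an illustrative assumption, not the dissertation's rule, which blends the two contributions by frequency range.

```python
def composite_control(u_inverse, u_pid, model_error, err_scale=1.0):
    """Composite inverse-model/PID control law:
        u = w * u_inverse + (1 - w) * u_pid,  with  w in (0, 1].

    Here the weight w is adapted from the inverse model's recent prediction
    error (an assumed stand-in for the automatic ratio adjustment above):
    an accurate inverse model gets most of the authority, a poor one hands
    control back to the PID loop."""
    w = 1.0 / (1.0 + (model_error / err_scale) ** 2)
    return w * u_inverse + (1.0 - w) * u_pid
```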
