Research on a Motivated Metamodelling Method Based on Support Vector Machines
Abstract
With the advance of science and technology and the maturing of simulation theory, the objects studied in fields such as high-level reasoning, decision support and exploratory analysis have been raised from the system level to the system-of-systems level, and the problems investigated have grown ever more complex. Because they involve numerous entities, complicated interactions and significant uncertainty, complex high-level decision-making problems typically exhibit strong nonlinearity, multiple levels and types, and only limited data obtainable within limited time. These characteristics make both the complexity of computation and the complexity of analysis increasingly prominent and objectively call for simple, low-resolution models.
In recent years, metamodelling for high-level decision-making problems has become an important research topic in the M&S community, and motivated metamodelling, which aims to improve the comprehensibility and interpretability of models, became a research hotspot as soon as it was proposed. An analysis of the state of the art shows, however, that the regression methods that incorporate prior knowledge, including the RAND motivated metamodelling approach and support vector regression, each have their own strengths and weaknesses with respect to the three key characteristics of complex high-level decision-making problems, namely structural complexity, diversity of knowledge and limited samples, and none of them can cope effectively with problems that exhibit all three at once. A new motivated metamodelling method is therefore an urgent need in the M&S field.
This dissertation centers on resolving the "complexity of computation" and the "complexity of analysis". It proposes a new realization of motivated metamodelling, namely motivated metamodelling based on support vector machines, which takes the classical motivated metamodelling idea as its core and support vector regression as its key technique. At the macro level the method retains the top-down ability of classical motivated metamodelling to guide the design of the model structure; at the micro level it retains the bottom-up machine-learning ability of support vector regression to approximate nonlinear functions of arbitrary order and to incorporate many kinds of prior knowledge, while overcoming the respective shortcomings of both approaches. To this end, the dissertation studies the relevant theoretical problems systematically and carries them through to engineering practice. The main content includes:
(1) The necessity and feasibility of motivated metamodelling based on support vector machines are analyzed systematically and a formal definition is given. The kinds of prior knowledge required for modelling are classified and summarized, the design of the model structure is divided into kernel-function design and constraint design, and on this basis a modelling framework is proposed whose core stages are structural design, simulation experiment, model generation, evaluation and validation, and model application.
(2) The design of kernel functions for motivated metamodelling based on support vector machines is studied. Specifically: 1) to remedy the drawback that existing kernel functions treat all input dimensions identically, and to exploit structural knowledge that may be known in advance, a kernel construction method based on structural domain knowledge is proposed; the construction of reproducing kernels that embed structural knowledge via direct sums and tensor products is studied, and a systematic modelling procedure based on this method is given; 2) to address the difficulty existing methods have in handling derivative knowledge, and exploiting the correspondence between reproducing kernels and differential operators, a kernel construction method based on non-structural domain knowledge is proposed; methods for computing reproducing kernels from the Green's functions of differential operators are studied, covering an arbitrary m-th-order differential operator, a differential operator with m distinct eigenvalues, and simpler differential operators, and an adaptive procedure for computing the reproducing kernel is given; 3) to address the lack of a guiding strategy for selecting the optimal kernel (the existing literature generally adopts the Gaussian kernel), a kernel selection method based on static scenario knowledge is proposed; by quantifying the characteristics of the simulation problem and comparing several kernels including the Gaussian kernel, it is concluded that the Gaussian kernel performs worse than the reproducing kernel on problems whose input dimension is relatively high (greater than 5) with a high degree of nonlinearity, or whose input dimension exceeds 10, and a general strategy for selecting the optimal kernel is summarized.
(3) Constraint design and model training for motivated metamodelling based on support vector machines are studied. To overcome the difficulties of the standard support vector regression algorithm, namely difficult parameter selection, limited generalization on changeable data sets, and the difficulty of incorporating and evaluating prior knowledge, an adaptive support vector regression algorithm based on prior knowledge is proposed; its basic framework and workflow are analyzed, and the function and implementation of each module are discussed in detail. Taking into account the roles and characteristics of the different kinds of prior knowledge during training, design and solution methods are proposed for two classes of programming problems, based on deterministic constraints and on stochastic constraints respectively; an improved genetic algorithm is combined with support vector regression to solve the parameter selection and knowledge evaluation problems at the macro level; and, building on an analysis of fast training algorithms and exact incremental algorithms, an approximate incremental training algorithm is proposed.
(4) To meet the application needs of high-level decision analysis, the blue-force radar detection system in a missile offense-defense confrontation simulation is taken as an application example. By building a motivated metamodel of the radar detection probability, the modelling process of motivated metamodelling based on support vector machines is demonstrated at the macro level and the effectiveness of the programming and solution method based on deterministic constraints is verified at the micro level; by building a motivated metamodel of the time until the radar first tracks a target stably, the effectiveness of the programming and solution method based on stochastic constraints is verified.
The main innovations of the dissertation are: a motivated metamodelling method based on support vector machines is proposed, which organically combines the RAND motivated metamodelling approach with support vector regression so that each compensates for the other's weaknesses, and its formal definition and modelling framework are given in full; three concrete kernel-function design methods are proposed for different application needs and types of prior knowledge; and, to address the problems of parameter selection, generalization ability and knowledge evaluation, an adaptive support vector regression algorithm based on prior knowledge is proposed, which uses an improved genetic algorithm to solve the parameter selection and knowledge evaluation problems at the macro level and an approximate incremental training algorithm to solve, at the micro level, the generalization problem posed by changeable training sets.
The research in this dissertation is fundamental research in the M&S and machine learning fields. It not only enriches and develops the methodology of system modelling and simulation, but is also of great significance for advancing the study of multi-resolution modelling theory, machine learning, data mining and experimental design in complex high-level decision analysis.
The development of simulation theory and technology has raised the research objects in some important fields, such as high-level reasoning, decision-making support and exploratory analysis, from the system level to the system-of-systems level, and the problems have become more and more complicated. Problems in high-level decision-making usually possess characteristics such as complex nonlinearity, multiple levels and types, and limited data in limited time, since they involve numerous entities, complicated interactions and uncertain effect processes. These characteristics make the complexity of computation and analysis more pronounced and require the development of simpler low-resolution models.
Recently, the study of metamodelling for high-level decision-making has become an important research area. Motivated metamodelling, which primarily aims at improving comprehensibility and interpretability, has become a research hotspot. However, an analysis of the relevant research shows that none of the regression approaches that incorporate prior knowledge, including RAND's motivated metamodelling and support vector regression, can fully handle decision-making problems that exhibit all three of the main characteristics. A new motivated metamodelling method is therefore urgently needed in M&S.
This dissertation focuses on resolving the "complexity in computation" and the "complexity in analysis". A new motivated metamodelling method, which combines classical motivated metamodelling with support vector regression, is proposed on the basis of the characteristics of decision-making problems and the analysis of existing regression methods. The dissertation develops its content in several related directions, including the theory of the new method, kernel-function design methods based on prior knowledge, and an adaptive regression algorithm. The main work includes:
(1) Several key theoretical problems of motivated metamodelling based on support vector machines are studied systematically. After analyzing the feasibility and necessity of the new method, the dissertation presents a formal theoretical framework and classifies and summarizes the prior knowledge systematically. On this basis, the structural design of the metamodel is divided into two parts, i.e. the design of the kernel function and of the constraints, and a new modelling framework is proposed, consisting of five stages: structural design, simulation experiment, model generation, evaluation and validation, and application.
(2) The second part focuses on the design of the kernel function: 1) To overcome the drawback of conventional kernel functions, which treat all input dimensions equally, and to exploit structural knowledge that may be available in advance, a kernel construction method based on structural prior knowledge is presented; it studies the construction of reproducing kernels from prior knowledge via direct sums and tensor products (a minimal illustrative sketch follows this paragraph), together with a systematic modelling procedure. 2) To address the difficulty of using derivative knowledge, and building on theoretical results relating reproducing kernels and differential operators, a second kernel construction method based on non-structural prior knowledge is proposed; it studies the computation of reproducing kernels from the Green's functions of three kinds of differential operators, i.e. an arbitrary m-th-order operator, an operator with m distinct eigenvalues, and simpler cases, and presents an automatic, flexible and rigorous computation algorithm. 3) To address the lack of theoretical justification for treating the Gaussian kernel as the "best" kernel, a third method, kernel selection based on static scenario prior knowledge, is introduced. It quantitatively analyzes and compares several kernels, including the Gaussian kernel and two new reproducing kernels, in experiments using multiple criteria and synthetic problems. The results show that the reproducing kernel is equivalent to or better than the RBF kernel for problems with more input variables (more than 5, and especially more than 10) and higher nonlinearity.
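The construction in item 1) relies on the fact that positive-definite kernels are closed under sums and products taken over disjoint groups of input variables. The sketch below illustrates that closure idea only, assuming NumPy is available; the block split, the kernel choices and the function names are illustrative and are not taken from the dissertation.

```python
import numpy as np

def rbf(x, z, gamma=1.0):
    """Gaussian (RBF) kernel evaluated on one block of the input vector."""
    return np.exp(-gamma * np.sum((x - z) ** 2))

def poly(x, z, degree=2, c=1.0):
    """Polynomial kernel evaluated on the other block of the input vector."""
    return (np.dot(x, z) + c) ** degree

def composite_kernel(x, z, split=2):
    """Direct-sum and tensor-product compositions of two block kernels.

    If k1 and k2 are positive-definite kernels on the two blocks, then
    k1 + k2 and k1 * k2 are again positive-definite kernels, which is the
    closure property a structure-aware kernel construction relies on.
    """
    xa, xb = x[:split], x[split:]
    za, zb = z[:split], z[split:]
    k1, k2 = rbf(xa, za), poly(xb, zb)
    return k1 + k2, k1 * k2   # (direct sum, tensor product)

# A Gram matrix built from the composite kernel can be passed to any kernel
# machine that accepts a precomputed kernel (e.g. an SVR with kernel="precomputed").
X = np.random.rand(5, 4)
G_sum = np.array([[composite_kernel(xi, xj)[0] for xj in X] for xi in X])
```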
(3) The third part studies the design of the constraints and the training of the model. To overcome the disadvantages of the standard support vector regression training algorithm, i.e. difficult parameter selection, limited generalization capacity on changeable data sets, and difficulty in incorporating and evaluating prior knowledge, a new regression algorithm, Adaptive Motivated Support Vector Regression (AMSVR), is presented. After introducing the function and realization of all the components of AMSVR, two important components are discussed in detail. First, two approaches to designing and solving the programming problems are presented. Second, a new training algorithm, an approximate incremental algorithm that hybridizes a fast training algorithm with an exact incremental algorithm, is proposed.
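The macro-level idea of letting an evolutionary search drive the hyperparameters of an epsilon-SVR can be sketched with off-the-shelf tools. The following is a toy, mutation-only evolutionary loop around scikit-learn's SVR; it is not the AMSVR algorithm or the improved genetic algorithm of the dissertation, and the data, population size and fitness function are illustrative assumptions.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Toy data standing in for simulation samples (purely illustrative).
X = rng.uniform(-1, 1, size=(80, 3))
y = np.sin(3 * X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.1 * rng.normal(size=80)

def fitness(ind):
    """Negative cross-validated MSE of an epsilon-SVR for one individual
    encoding (log C, log gamma, epsilon)."""
    C, gamma, eps = np.exp(ind[0]), np.exp(ind[1]), abs(ind[2])
    model = SVR(C=C, gamma=gamma, epsilon=eps)
    return cross_val_score(model, X, y, cv=3,
                           scoring="neg_mean_squared_error").mean()

# Tiny (mu + lambda) evolutionary loop: keep the best half, mutate it.
pop = rng.normal(0.0, 1.0, size=(20, 3))
for _ in range(15):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-10:]]
    children = parents + rng.normal(0.0, 0.3, parents.shape)
    pop = np.vstack([parents, children])

best = max(pop, key=fitness)
print("selected C, gamma, epsilon:", np.exp(best[0]), np.exp(best[1]), abs(best[2]))
```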
(4) In response to the application requirements of high-level decision-making analysis, the theory of this dissertation is demonstrated with an example of motivated metamodelling of a radar system, set against the background of a simulation of strategic missiles penetrating an NMD defense system.
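As a rough picture of what such a metamodel looks like in code, the sketch below fits an SVR surrogate to input-output samples drawn from a stand-in analytic "simulation". The detection-probability function, its inputs and all parameter values are invented for illustration and have no connection to the radar model used in the dissertation.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(1)

def detection_probability(rcs, range_km, jamming):
    """Hypothetical stand-in for a radar simulation run (not the dissertation's model)."""
    snr = rcs / (range_km ** 2 + 1e-3) / (1.0 + jamming)
    return 1.0 / (1.0 + np.exp(-10.0 * (snr - 0.05)))

# "Simulation runs": inputs are (target RCS, range, jamming power).
X = rng.uniform([0.1, 5.0, 0.0], [5.0, 50.0, 2.0], size=(200, 3))
y = detection_probability(X[:, 0], X[:, 1], X[:, 2])

# Fit the SVR metamodel on part of the runs and check it on held-out runs.
train, test = slice(0, 150), slice(150, 200)
metamodel = SVR(C=10.0, epsilon=0.01, gamma="scale").fit(X[train], y[train])
print("MAE on held-out runs:", mean_absolute_error(y[test], metamodel.predict(X[test])))
```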
The main contributions of this dissertation include: a new motivated metamodelling method, i.e. motivated metamodelling based on support vector machines, which combines the classical motivated metamodelling of RAND with support vector regression, is presented, and its formal conceptual framework and modelling framework are summarized; three different methods for constructing the kernel function, based on different application requirements and kinds of prior knowledge, are proposed; and, addressing the problems of parameter selection, generalization ability and knowledge evaluation, a new training algorithm, Adaptive Motivated Support Vector Regression (AMSVR), is presented, which employs an improved genetic algorithm to solve parameter selection and knowledge evaluation and uses an approximate incremental algorithm to improve generalization on a changeable experimental data set.
The research topics of this dissertation belong to fundamental M&S theory and machine learning. The above contributions advance system modelling and simulation methodology and promote the study of multi-resolution modelling, machine learning, data mining and experimental design in complex high-level decision-making analysis.
