Dynamic Model Selection Algorithm for GPFR Mixtures
  • English Title: Dynamic Model Selection Algorithm for GPFR Mixtures
  • Authors: Zhao Longbo (赵龙波); Ma Jinwen (马尽文)
  • Affiliation: Department of Information Science, School of Mathematical Sciences and LMAM, Peking University
  • Keywords: mixture of Gaussian processes (MGP) model; mixture of Gaussian process functional regressions (MGPFR); dynamic model selection algorithm; synchronous balancing criterion
  • Journal: 信号处理 (Journal of Signal Processing); journal code XXCN
  • Publication Date: 2019-05-25
  • Year: 2019
  • Volume/Issue: Vol. 35, Issue 05 (No. 237)
  • Funding: National Natural Science Foundation of China (Grant No. 61171138)
  • Language: Chinese
  • Pages: 62-70 (9 pages)
  • CN: 11-2406/TN
  • Record ID: XXCN201905008
Abstract
As an effective tool for data modeling and analysis, the mixture of Gaussian processes (MGP) model is widely used for time series analysis and prediction and has become a machine learning model in its own right. In the conventional MGP model, the mean function of each Gaussian process (GP) component is assumed to be zero, which severely limits its applicability. To remove this limitation, the Gaussian process functional regression (GPFR) model, in which the mean function is learnable, and its mixture, the MGPFR model, were proposed for more refined data modeling. Like the MGP model, the MGPFR model faces a model selection problem. To solve it, this paper generalizes the synchronous balancing criterion (SBC) to MGPFR models and proposes the corresponding model selection and dynamic model selection algorithms; the experiments further identify a reasonable interval for the penalty coefficient. Experimental results show that these algorithms perform well on both model selection and prediction and can be effectively applied to curve clustering.
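To make the abstract concrete, the model it refers to can be sketched in generic notation; the exact formulation and the synchronous balancing criterion itself are given in the full text. A single GPFR component models an observed curve as a Gaussian process whose mean function is learnable through a basis expansion:
\[
y(x) \sim \mathcal{GP}\big(\mu(x),\, k(x,x';\theta)\big), \qquad
\mu(x) = \sum_{d=1}^{D} b_d \,\phi_d(x),
\]
with fixed basis functions \phi_d (e.g., B-splines) and learnable coefficients b and kernel hyperparameters \theta. An MGPFR with K components and mixing proportions \pi_k models a curve y observed at inputs x by
\[
p(\mathbf{y}\mid\mathbf{x}) = \sum_{k=1}^{K} \pi_k\,
\mathcal{N}\big(\mathbf{y}\mid \boldsymbol{\mu}_k(\mathbf{x}),\, \mathbf{K}_k(\mathbf{x},\mathbf{x};\theta_k)\big),
\]
and model selection amounts to choosing K by trading off the fitted log-likelihood against a complexity penalty, i.e. a score of the general form
\[
J(K) = \log L(\hat{\Theta}_K) - \lambda\, P(K),
\]
where \lambda is the penalty coefficient whose reasonable range the experiments investigate.

The following Python sketch illustrates, under simplifying assumptions, how a dynamic model selection loop of this kind can operate: a hard-cut EM alternates mean-function fitting and curve reassignment, components left with no curves are pruned so the effective K can shrink, and candidate models are compared with a penalized score. The kernel hyperparameters are fixed, the mean functions are low-order polynomials, the penalty is BIC-style rather than the paper's synchronous balancing criterion, and the names (rbf_kernel, fit_mgpfr_hard_em, ...) are illustrative rather than the authors' code.

import numpy as np

def rbf_kernel(x, length_scale=0.3, signal_var=1.0, noise_var=0.05):
    # Squared-exponential kernel matrix on a 1-D grid plus observation noise.
    d = x[:, None] - x[None, :]
    return signal_var * np.exp(-0.5 * (d / length_scale) ** 2) + noise_var * np.eye(len(x))

def gp_loglik(residual, K_chol):
    # Log-density of a zero-mean multivariate normal given a Cholesky factor of its covariance.
    alpha = np.linalg.solve(K_chol.T, np.linalg.solve(K_chol, residual))
    return (-0.5 * residual @ alpha
            - np.sum(np.log(np.diag(K_chol)))
            - 0.5 * len(residual) * np.log(2 * np.pi))

def fit_mgpfr_hard_em(x, curves, n_components, degree=3, n_iter=30, lam=1.0, seed=0):
    # Hard-cut EM for the MGPFR sketch; returns labels, a penalized score,
    # and the number of components that survive the dynamic pruning.
    rng = np.random.default_rng(seed)
    n_curves = len(curves)
    Phi = np.vander(x, degree + 1)              # polynomial basis for the mean function
    K_chol = np.linalg.cholesky(rbf_kernel(x))  # fixed GP covariance (illustration only)
    labels = rng.integers(0, n_components, size=n_curves)

    for _ in range(n_iter):
        # M-step: least-squares fit of each surviving component's mean coefficients.
        coefs = {k: np.linalg.lstsq(Phi, curves[labels == k].mean(axis=0), rcond=None)[0]
                 for k in np.unique(labels)}
        # Hard E-step: reassign each curve to the component with the largest log-likelihood.
        ll = np.full((n_curves, n_components), -np.inf)
        for k, b in coefs.items():
            mean_k = Phi @ b
            for i, y in enumerate(curves):
                ll[i, k] = gp_loglik(y - mean_k, K_chol)
        labels = ll.argmax(axis=1)

    # Dynamic pruning: components with no assigned curves are discarded, so the
    # effective number of components can shrink during learning.
    alive = np.unique(labels)
    total_ll = ll[np.arange(n_curves), labels].sum()
    n_params = len(alive) * (degree + 1)        # mean coefficients only, in this sketch
    score = total_ll - lam * n_params * np.log(n_curves)  # BIC-style penalty, not the paper's SBC
    return labels, score, len(alive)

if __name__ == "__main__":
    # Synthetic data: two true components (sinusoidal and linear mean functions).
    x = np.linspace(0.0, 1.0, 40)
    rng = np.random.default_rng(1)
    curves = np.vstack([np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal((15, x.size)),
                        2 * x - 1 + 0.1 * rng.standard_normal((15, x.size))])
    for K in (1, 2, 3, 4):
        _, score, alive = fit_mgpfr_hard_em(x, curves, n_components=K)
        print(f"initial K = {K}: surviving components = {alive}, penalized score = {score:.1f}")

The point of the sketch is structural rather than numerical: components that lose all of their curves are never revived, so the learning process itself can reduce the model order, which is the essence of dynamic model selection described in the abstract.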
