A constrained parameter evolutionary learning algorithm for Bayesian network under incomplete and small data
Abstract
A lack of relevant data is a major challenge for Bayesian network (BN) parameter learning. To address this issue, this paper proposes a constrained parameter evolutionary learning algorithm (CPEL) based on qualitative knowledge and an evolutionary strategy. First, qualitative knowledge is incorporated into the BN parameter learning process to reduce the parameter search space; two types of qualitative knowledge, each weighted by expert confidence, are presented. Second, an evolutionary strategy is introduced to avoid the tendency of classical learning techniques to fall into local optima; a dedicated encoding of the BN parameters is presented and several evolutionary strategies are discussed. Combining these two ideas is therefore significant for BN parameter learning under incomplete and small data. Comparative experiments show that CPEL outperforms the classical EM algorithm in both accuracy and timeliness, which verifies the feasibility and superiority of the proposed algorithm. In addition, CPEL has been applied to UAV threat assessment in a complex, dynamic environment.
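The abstract only outlines the method, so the following is a minimal, illustrative sketch of the general idea it describes: evolutionary search over CPT entries with qualitative (ordering) constraints handled as a penalty. The toy counts, the specific ordering constraint, the penalty weight, the (mu + lambda) loop, and all function names (log_likelihood, violation, fitness, normalize) are assumptions made for illustration, not the paper's actual CPEL encoding or experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setting: learn one node's CPT (rows = parent configurations, cols = node states)
# from a small sample. Real CPEL works with incomplete data; complete counts are used
# here only to keep the sketch short.
counts = np.array([[8., 2.],    # parent config 0
                   [5., 5.],    # parent config 1
                   [1., 9.]])   # parent config 2

def normalize(theta):
    """Project each CPT row back onto the probability simplex (clip + renormalize)."""
    theta = np.clip(theta, 1e-6, None)
    return theta / theta.sum(axis=1, keepdims=True)

def log_likelihood(theta):
    """Multinomial log-likelihood of the observed counts under a candidate CPT."""
    return float(np.sum(counts * np.log(theta)))

def violation(theta):
    """Hypothetical qualitative constraint P(X=0|pa=0) >= P(X=0|pa=1) >= P(X=0|pa=2):
    returns the total amount by which the ordering is violated."""
    col = theta[:, 0]
    return max(0.0, col[1] - col[0]) + max(0.0, col[2] - col[1])

def fitness(theta, penalty=50.0):
    """Constrained fitness: likelihood minus a penalty for violating expert knowledge."""
    return log_likelihood(theta) - penalty * violation(theta)

# (mu + lambda) evolutionary strategy over CPTs encoded as row-stochastic matrices.
mu, lam, sigma = 5, 20, 0.05
population = [normalize(rng.random(counts.shape)) for _ in range(mu)]
for _ in range(200):
    offspring = []
    for _ in range(lam):
        parent = population[rng.integers(mu)]
        child = normalize(parent + sigma * rng.normal(size=parent.shape))  # Gaussian mutation
        offspring.append(child)
    population = sorted(population + offspring, key=fitness, reverse=True)[:mu]

best = population[0]
print(np.round(best, 3), "fitness:", round(fitness(best), 3))
```

The penalty term is one simple way to let expert confidence shrink the effective search space; a hard projection onto the constrained region would be an equally plausible reading of the abstract.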
