Research on Learning Algorithms for Discriminative Bayesian Networks and Their Applications
Abstract
Bayesian network classifiers are widely used in many fields. To better solve classification problems, two different extensions of the Bayesian network classifier have emerged. One extends the network structure, represented by the naive Bayes classifier and the TAN classifier; the other extends the learning method, namely the Bayesian network classifier based on discriminative learning. In terms of the goal of learning, Bayesian network learning methods comprise generative learning and discriminative learning. Since Bayesian networks appeared, research has focused mainly on generative learning, while work on discriminative learning remains comparatively scarce. From the discriminative-learning perspective, this thesis studies several algorithms for discriminative Bayesian networks, centred on three data problems that arise in practice: classification-cost-imbalanced data, data with missing attribute values, and data with missing class labels. The main contents are as follows:
     (1) The thesis surveys existing generative and discriminative learning algorithms for Bayesian networks and compares generative and discriminative Bayesian networks experimentally from several different angles.
     (2) For learning discriminative Bayesian networks from classification-cost-imbalanced data, cost-sensitive parameter and structure learning algorithms are proposed on the basis of discriminative Bayesian network learning. The parameter learning algorithm takes a proposed cost-sensitive loss function as its objective and solves it with the conjugate gradient method; the structure learning algorithm uses a proposed cost-sensitive criterion, a dual scoring criterion over both classification cost and classification accuracy.
     (3) For learning discriminative Bayesian networks from data with missing attribute values, the CEM algorithm is studied. A Q function that makes the CEM algorithm converge is proposed, the shortcomings of the convergent CEM algorithm in discriminative Bayesian network learning are analysed, and on that basis the E step and the M step are each approximated, reducing the computational complexity and making the CEM algorithm effective and feasible for learning discriminative Bayesian networks.
     (4) For learning discriminative Bayesian networks from data with missing class labels, semi-supervised learning and active learning algorithms for Bayesian networks are studied. A hybrid generative-discriminative semi-supervised learning algorithm is first proposed, in which the log joint likelihood measures how well unlabeled samples fit the model and the log conditional likelihood measures how well labeled samples fit the model; then, to mine label-missing data in a cost-sensitive way, an active learning algorithm based on a cost-sensitive sample selection strategy is proposed.
     (5) The methods developed in this thesis are applied to the sensory-quality evaluation of tobacco leaves, predicting and evaluating sensory quality under missing chemical-composition values, missing sensory labels, and classification costs, thus providing an intelligent evaluation method for actual cigarette production.
Bayesian network classifiers have been widely applied in many domains. Two kinds of extended Bayesian network classifiers have emerged to handle more complex classification problems. One kind extends the network structure; representative classifiers include naïve Bayes and TAN. The other kind extends the learning method, yielding discriminative Bayesian network classifiers. Bayesian networks admit two learning paradigms: generative learning and discriminative learning. Generative learning of Bayesian networks has been studied extensively, whereas discriminative learning has received comparatively little attention. This thesis studies discriminative Bayesian network learning from cost-imbalanced data, data with missing attribute values, and data with missing class labels. The main content and contributions are outlined as follows:
     In this dissertation, generative and discriminative learning methods for Bayesian networks are reviewed and then compared experimentally from various points of view.
     To learn discriminative Bayesian networks from cost-imbalanced data, cost-sensitive learning methods for discriminative Bayesian networks are presented; cost-sensitive Bayesian networks take classification cost into account. For parameter learning, a cost-sensitive loss function is proposed as the objective; for structure learning, a cost-sensitive criterion is used in model selection.
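The cost-sensitive idea above can be illustrated with the standard expected-cost decision rule (in the spirit of Elkan's formulation), applied to posteriors from any Bayesian network classifier. This is a minimal sketch, not the thesis's algorithm: the cost matrix and posterior values are made up for illustration.

```python
import numpy as np

# Hypothetical cost matrix: cost[i, j] = cost of predicting class j
# when the true class is i (values chosen for illustration only).
cost = np.array([[0.0, 1.0],    # misclassifying class 0 costs 1
                 [5.0, 0.0]])   # misclassifying class 1 costs 5 (rare, important class)

def cost_sensitive_predict(posterior, cost):
    """Pick the class minimising expected cost, given estimates of P(y|x)
    (posterior: n_samples x n_classes) from any Bayesian network classifier."""
    expected_cost = posterior @ cost   # E[cost | predict j] for each sample
    return expected_cost.argmin(axis=1)

posterior = np.array([[0.85, 0.15],
                      [0.60, 0.40]])
# A plain argmax would label both samples class 0; the asymmetric cost
# matrix flips the second decision toward the expensive class.
print(cost_sensitive_predict(posterior, cost).tolist())  # → [0, 1]
```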
     To learn discriminative Bayesian networks from data with missing attribute values, a CEM learning method is given. A Q function under which the log conditional likelihood increases monotonically and converges is proposed. However, this convergent CEM has drawbacks when used for discriminative Bayesian network classifier learning, so a simpler Q function is proposed to replace it, and in the M step the full optimisation procedure is replaced by a gradient-based search. The approximated E step and M step make CEM simpler and more efficient than the standard version.
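The approximate M step (a single gradient step instead of a full inner optimisation) can be sketched as follows. This is an illustrative stand-in, assuming a logistic model for the conditional distribution (Roos et al. relate discriminatively trained naive Bayes/TAN to logistic regression); the data is synthetic and the learning rate is an assumption.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def conditional_log_likelihood(w, X, y):
    p = sigmoid(X @ w)
    return np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

def approximate_m_step(w, X, y, lr=0.01):
    """A single gradient step on the conditional log-likelihood,
    replacing the full inner optimisation of an exact M step."""
    grad = X.T @ (y - sigmoid(X @ w))
    return w + lr * grad

# Synthetic complete data standing in for the E-step expectations.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
y = (sigmoid(X @ np.array([1.0, -2.0, 0.5])) > 0.5).astype(float)

w = np.zeros(3)
before = conditional_log_likelihood(w, X, y)
for _ in range(20):
    w = approximate_m_step(w, X, y)
after = conditional_log_likelihood(w, X, y)
print(after > before)  # the cheap steps still improve the objective
```

Because the conditional log-likelihood is concave in `w` and the step size is small, each cheap step improves the objective without solving the M-step problem to optimality, which is the point of the approximation.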
     To learn discriminative Bayesian networks from data with missing class labels, semi-supervised learning and active learning methods are presented. First, a hybrid generative-discriminative method is studied, whose objective function is a weighted combination of the log joint likelihood of the unlabeled data and the log conditional likelihood of the labeled data. Then an active learning method based on cost-sensitive sampling is proposed, including two cost-reduction sampling strategies.
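The weighted hybrid objective can be written down concretely for a toy discrete naive Bayes model. This is a minimal sketch under stated assumptions: all parameter values, the data, and the trade-off weight `lam` are made up for illustration.

```python
import numpy as np

def logsumexp(a, axis):
    m = a.max(axis=axis, keepdims=True)
    return (m + np.log(np.exp(a - m).sum(axis=axis, keepdims=True))).squeeze(axis)

def log_joint(X, y, log_prior, log_cpt):
    """log P(x, y) for binary-feature naive Bayes; log_cpt[c, j] = log P(x_j=1 | y=c)."""
    ll = log_prior[y].astype(float)
    for j in range(X.shape[1]):
        p1 = np.exp(log_cpt[y, j])
        ll = ll + np.where(X[:, j] == 1, log_cpt[y, j], np.log(1.0 - p1))
    return ll

def hybrid_objective(Xl, yl, Xu, log_prior, log_cpt, lam=0.5):
    """Labeled part: log P(y|x); unlabeled part: log sum_c P(x, c); weighted sum."""
    n_classes = len(log_prior)
    joint = lambda X: np.stack(
        [log_joint(X, np.full(len(X), c), log_prior, log_cpt)
         for c in range(n_classes)], axis=1)
    jl = joint(Xl)
    cond = jl[np.arange(len(yl)), yl] - logsumexp(jl, axis=1)   # log P(y|x)
    marg = logsumexp(joint(Xu), axis=1)                         # log P(x)
    return cond.sum() + lam * marg.sum()

log_prior = np.log([0.5, 0.5])
log_cpt = np.log([[0.8, 0.2],     # class 0: P(x_j = 1 | y = 0)
                  [0.3, 0.9]])    # class 1: P(x_j = 1 | y = 1)
Xl = np.array([[1, 0], [0, 1]]); yl = np.array([0, 1])
Xu = np.array([[1, 1], [0, 0]])
val = hybrid_objective(Xl, yl, Xu, log_prior, log_cpt)
print(val < 0)  # every log-likelihood term is non-positive here
```

With `lam = 0` the objective reduces to purely discriminative (conditional-likelihood) training; with large `lam` it approaches generative training on the unlabeled pool, which is the trade-off the hybrid method tunes.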
     Finally, tobacco sensory quality is evaluated with discriminative Bayesian networks, which can serve as an auxiliary tool in cigarette blend design.
