Research on Constructive Methods for Knowledge Discovery
(Anhui University doctoral dissertation)
Abstract
With the continuing development of science and network technology, the volume of data being generated has grown rapidly, and knowledge discovery from massive data has become an important research topic in artificial intelligence. Decision trees, neural networks, and Bayesian networks are among the principal tools of knowledge discovery today, but they suffer from slow training and the difficulty of determining network structure, and thus struggle to meet the timeliness requirements of knowledge discovery. Building on an analysis of BP and related algorithms, Professor Zhang Ling and colleagues proposed a constructive machine-learning method based on covering: it constructs a neural network directly from the characteristics of the samples themselves, is intuitive and efficient, and handles massive data well. Starting from an analysis of the classification methods commonly used in knowledge discovery, this thesis studies that method in depth in combination with rough set theory and SVMs, with the following main results:
    (1) The constructive learning method based on covering builds a covering network directly from the sample data, overcoming the difficulty of determining network structure, the slow running speed, and the local minima that plague traditional neural-network training, and is well suited to multi-class, massive data. This thesis analyzes the method in depth and proposes improvements to domain construction, the activation function, and the distance function; experiments show that these improvements further raise the covering algorithm's performance.
    (2) The choice of training samples and the order in which they are learned directly affect a neural network's structure and performance, and a covering network likewise depends on the learning order. This thesis gives three sequential covering methods; experiments show that although these orders are not optimal, their accuracy is close to or above the average accuracy of random-order learning. On this basis, the thesis gives incremental-learning and domain-reduction methods for the covering algorithm, which effectively reduce the number of covering domains and raise the recognition accuracy of the covering network.
    (3) Because the attributes relevant to describing an object are unknown in advance, existing databases describe objects with large numbers of attributes, and the resulting redundancy keeps classification systems from running efficiently. Selecting attribute features sensibly reduces the data volume, and hence speeds up classification, while preserving classification ability. Rough set theory provides an important tool for this selection. This thesis uses rough set methods to select attributes and builds a rough-set-based covering algorithm that speeds up classification while essentially preserving classification ability; it also proposes the idea of weighted covering.
    (4) Built on statistical learning theory, the SVM method maps samples into a high-dimensional space and maximizes the classification margin to construct an optimal separating hyperplane, which gives it strong generalization ability. This thesis analyzes what SVMs and the covering algorithm have in common, together with the properties of radial basis functions, and proposes a covering algorithm based on radial basis functions. Experiments show that this algorithm greatly reduces the number of covering domains and of rejected samples, and that with suitably chosen parameters the feature space does become linearly separable. Guided by quotient space theory, the thesis introduces the concept of fusing covering domains and gives concrete algorithms for maximum-value fusion and combinatorial-optimization fusion. Domain fusion smooths the classification boundary of the covering domains, reduces the complexity of solving the SVM problem, improves the covering algorithm's performance, and, by connecting the covering algorithm with statistical learning theory, provides it with a theoretical foundation.
    (5) Many classification methods exist, but finding the minimum number of separating hyperplanes, that is, hidden units, is generally hard. Exploiting the duality between sample sets and hyperplanes, this thesis proposes a dual algorithm for classification: it projects the sample set and the hyperplanes into their respective extended spaces, uses genetic-algorithm ideas to give a line-search method for computing the partition matrix, applies rough set reduction to obtain the solution region of the classification problem, and finally obtains the optimal (or suboptimal) solution by seeking the maximum-margin solution. The method still needs refinement, but it offers a wholly new approach and perspective for solving classification problems, with broad application prospects and rich research content.
With the development of science and network technology, the capability to generate and collect data has grown rapidly, so knowledge discovery from data sets with huge numbers of samples has become an important task in artificial intelligence. Decision trees, neural networks, and Bayesian networks are the main tools of KDD, but traditional neural networks cannot satisfy KDD's requirement for prompt results because of their slow processing and the difficulty of defining their structure and estimating their parameters. Professor Zhang Ling and colleagues proposed a constructive machine-learning method that designs networks from spherical domains covering the training samples; this kind of classifier is efficient on data sets with huge amounts of data. The author studies this constructive method in combination with rough set theory and SVMs; the main work and results are the following:
    (1) The constructive machine-learning method based on covering domains designs networks directly from the sample data, and its efficiency makes it suitable for multi-class data sets with huge numbers of samples. The author analyzes the algorithm and proposes improvements to the construction of covering domains, the activation function, and the distance function. Experiments show that these improvements raise the performance of covering networks.
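The covering construction described above can be sketched as follows. This is a minimal illustration of the general sphere-covering idea, not the thesis's exact algorithm: the radius rule (just inside the nearest sample of another class), the 0.999 shrink factor, and the names `build_covers`/`predict` are all simplifying assumptions.

```python
import numpy as np

def build_covers(X, y):
    """Greedy sphere-covering sketch: repeatedly pick an uncovered sample,
    set the cover radius just inside the nearest sample of another class,
    and absorb every same-class sample inside that radius."""
    X, y = np.asarray(X, float), np.asarray(y)
    covered = np.zeros(len(X), bool)
    covers = []  # list of (center, radius, label)
    while not covered.all():
        i = int(np.flatnonzero(~covered)[0])
        center, label = X[i], y[i]
        d = np.linalg.norm(X - center, axis=1)
        enemy = d[y != label]                      # distances to other classes
        r = enemy.min() * 0.999 if enemy.size else d.max() + 1.0
        covers.append((center, r, label))
        covered |= (d <= r) & (y == label)
    return covers

def predict(covers, x):
    """Label of the first cover containing x; None means 'rejected'."""
    x = np.asarray(x, float)
    for center, r, label in covers:
        if np.linalg.norm(x - center) <= r:
            return label
    return None
```

Because each cover is a sphere with an explicit center and radius, the resulting classifier is a one-hidden-layer network whose structure falls directly out of the data, with no iterative weight training.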
    (2) Since the selection of samples and the learning order strongly affect the performance of covering networks, three sequential covering methods are given in the thesis. Although these learning orders are not optimal, in our experiments the accuracy of networks trained with sequential learning was above the average accuracy of random-order learning. Algorithms for incremental covering learning and for pruning covering domains based on ordered learning are also proposed; they effectively reduce the number of covering domains and improve classification accuracy.
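To illustrate why the learning order matters, the sketch below counts the covers produced under different presentation orders. `centroid_order` is a hypothetical ordering heuristic (learn each class from its centroid outward), not necessarily one of the three sequential methods in the thesis; the cover rule is the same greedy one as in the earlier sketch.

```python
import numpy as np

def covers_for_order(X, y, order):
    """Number of sphere covers produced when samples are visited in `order`
    (radius set just inside the nearest sample of another class)."""
    X, y = np.asarray(X, float), np.asarray(y)
    covered = np.zeros(len(X), bool)
    n_covers = 0
    for i in order:
        if covered[i]:
            continue
        d = np.linalg.norm(X - X[i], axis=1)
        enemy = d[y != y[i]]
        r = enemy.min() * 0.999 if enemy.size else d.max() + 1.0
        covered |= (d <= r) & (y == y[i])
        n_covers += 1
    return n_covers

def centroid_order(X, y):
    """Hypothetical heuristic: within each class, learn samples from the
    class centroid outward, so early covers sit deep inside the class."""
    X, y = np.asarray(X, float), np.asarray(y)
    key = np.empty(len(X))
    for label in np.unique(y):
        m = y == label
        key[m] = np.linalg.norm(X[m] - X[m].mean(axis=0), axis=1)
    return np.argsort(key)
```

On the tiny set {0, 4} (class 0) and {5} (class 1) on a line, starting at the boundary point 4 forces three covers, while a centroid-first order needs only two, which is the effect the sequential methods exploit.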
    (3) Existing databases describe objects with large numbers of attributes because the relevant attributes are unknown in advance, so attribute selection is necessary for a classifier to perform well. Rough set theory provides an important tool for feature selection. A network (RCSN) combining rough set theory with the covering design algorithm is introduced: it reduces the condition attributes using rough sets and designs the network structure with the covering algorithm. An example shows that the algorithm preserves classification accuracy while cutting memory use and the cost of data collection. A framework for feature weighting in the covering algorithm is also proposed.
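The rough-set side of this combination can be illustrated with a small positive-region computation. Greedy forward selection until the chosen attributes separate the decision classes as well as the full attribute set is a common reduct heuristic; it is only assumed here to resemble the thesis's procedure, and the function names are hypothetical.

```python
from collections import defaultdict

def positive_region_size(rows, labels, attrs):
    """Count objects whose equivalence class (rows equal on `attrs`)
    is pure, i.e. contains a single decision label."""
    blocks = defaultdict(list)
    for row, lab in zip(rows, labels):
        blocks[tuple(row[a] for a in attrs)].append(lab)
    return sum(len(v) for v in blocks.values() if len(set(v)) == 1)

def greedy_reduct(rows, labels):
    """Forward-select condition attributes until they classify as well as
    the full set (a heuristic reduct, not guaranteed minimal)."""
    n_attr = len(rows[0])
    target = positive_region_size(rows, labels, range(n_attr))
    chosen = []
    while positive_region_size(rows, labels, chosen) < target:
        best = max((a for a in range(n_attr) if a not in chosen),
                   key=lambda a: positive_region_size(rows, labels, chosen + [a]))
        chosen.append(best)
    return chosen
```

Dropping the attributes outside the reduct before building covers is exactly what lets the combined classifier keep its accuracy while shrinking the data it must store and measure.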
    (4) The support vector machine (SVM), grounded in statistical learning theory, maps samples into a higher-dimensional space and constructs an optimal hyperplane separating the two classes, giving it high generalization ability. The resemblance between SVMs and the covering algorithm is analyzed, and a constructive learning algorithm based on sphere covering in the feature space is put forward. Experiments show that this kind of network combines the virtues of the covering design algorithm and the SVM, and the existence of the separating hyperplane is established during training. Guided by quotient space theory, the author introduces the notion and algorithms of covering-domain fusion, which tie the SVM and the covering algorithm together. The fusion algorithm not only simplifies the SVM solution and improves the covering algorithm's performance but also provides a theoretical foundation for the covering algorithm.
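Why sphere covering is natural in the kernel-induced feature space follows from a standard identity: with a Gaussian kernel, k(x, x) = 1, so every mapped sample lies on the unit sphere, and feature-space distances can be computed from kernel values alone, with no explicit mapping. The sketch below assumes nothing from the thesis beyond the use of radial basis functions.

```python
import numpy as np

def rbf(x, y, gamma=1.0):
    """Gaussian (RBF) kernel k(x, y) = exp(-gamma * ||x - y||^2)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return np.exp(-gamma * np.sum((x - y) ** 2))

def feature_space_dist(x, y, gamma=1.0):
    """Distance ||phi(x) - phi(y)|| in the implicit feature space:
    ||phi(x) - phi(y)||^2 = k(x,x) - 2 k(x,y) + k(y,y) = 2 - 2 k(x,y),
    using k(x, x) = 1 for the Gaussian kernel."""
    return np.sqrt(2.0 - 2.0 * rbf(x, y, gamma))
```

Since 0 < k(x, y) <= 1, every feature-space distance lies strictly below sqrt(2), and a sphere cover in that space is simply a kernel threshold k(x, center) >= theta; this is what allows covering domains to be built in the feature space using only kernel evaluations.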
    (5) A dual representation of classification learning is introduced by mapping the samples and the hyperplanes into their respective version spaces. Under this representation, the partition matrix is sought with a genetic-algorithm-style line search, the solution region of the classification problem is obtained by rough set reduction, and the optimal (or suboptimal) solution is then found as the maximum-margin solution. The method still needs refinement, but it offers a new approach to solving classification problems.
