Research on Power Transformer Fault Diagnosis Methods Based on Active Diverse Learning Neural Network Ensembles
Abstract
The power transformer is one of the most important pieces of equipment in a power system, and its operating condition bears directly on the safe operation of the grid. By monitoring the onset of latent transformer faults and assessing how far they have developed, major accidents can be prevented. When a transformer fails, the composition and concentration of the gases dissolved in its oil are closely related to the fault type and severity; monitoring these dissolved gases and their trends is therefore the most widely used transformer condition-monitoring method. In practice, the relationship between gas symptoms and fault causes is complex and fuzzy, so an accurate mathematical model is hard to build; this is a very difficult learning problem that a single classifier or measurement technique cannot solve. As a multi-classifier combination technique, the neural network ensemble can apply a divide-and-conquer strategy to such complex problems, giving it a natural advantage and great application potential in transformer fault diagnosis. The individually diverse ensemble learning studied in this thesis uses active network design and cooperative parallel learning to overcome the randomness and blindness of individual network generation and to greatly reduce the redundancy and size of the ensemble; it is a class of efficient neural network ensemble methods well suited to engineering applications.
In traditional neural network ensembles, individual networks are generated with randomness and blindness and are usually trained independently, without cooperation, so the diversity among them is hard to guarantee. Based on a theoretical analysis of the ensemble error formula, this thesis proposes an ensemble learning algorithm that actively guides individual networks toward diverse learning. By decomposing the ensemble error, the method embeds an error-correlation term in each individual network's training criterion, and cooperative training then steers the networks toward diverse learning. On this basis, a diverse neural network ensemble model is proposed for diagnosing transformer faults from dissolved gas analysis of the oil. Experiments show that, under identical conditions, the active diverse ensemble method does not demand high classification accuracy from the individual networks, yet it markedly improves the stability and generalization of the ensemble learning system. Its diagnostic accuracy exceeds that of the traditional three-ratio method and a single BP neural network, and its performance is more stable and reliable than the classical Bagging and Boosting ensemble methods.
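The correlation-penalized training criterion described above can be written in a standard negative-correlation form (the symbols below are illustrative notation, not copied from the thesis): each of the M networks minimizes its own error plus a penalty coupling it to the ensemble mean.

```latex
% Per-network criterion with an error-correlation penalty (negative-correlation form):
e_i = \frac{1}{2}\bigl(F_i - d\bigr)^2 + \lambda\, p_i,
\qquad
p_i = \bigl(F_i - \bar F\bigr)\sum_{j \neq i}\bigl(F_j - \bar F\bigr)
    = -\bigl(F_i - \bar F\bigr)^2,
\qquad
\bar F = \frac{1}{M}\sum_{i=1}^{M} F_i .
```

Here d is the target and \bar F the ensemble output; \lambda = 0 recovers independent training, while larger \lambda makes individual errors negatively correlated so that they partially cancel in the ensemble average.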
A new diverse neural network ensemble method based on output-data correction is then proposed; it guides the diverse learning of individual networks by modifying the output target data, thereby improving the generalization of the ensemble. In the active diverse ensemble method, all member networks must use the back-propagation algorithm, which limits its applicability across classifier types, whereas the new method can also assemble heterogeneous networks such as multilayer perceptrons and radial basis function networks. Finally, the communication cost of ensemble training is discussed; the analysis shows that the new method markedly reduces network communication cost, making it a more efficient neural network ensemble method for engineering applications.
Starting from the activeness and cooperativeness of individual network generation, and based on a negative-correlation diversity measure, this thesis studies two methods that actively guide individual networks toward diverse ensemble learning, applies them to power transformer fault diagnosis, and obtains good diagnostic results.
The power transformer is one of the most important pieces of equipment in a power system, and its operational status bears directly on system safety. Monitoring the occurrence and progression of latent faults in power transformers helps prevent major accidents. When a fault occurs, the content and composition of the gases dissolved in the oil are closely related to the fault type and severity; analyzing these gases and their changes is therefore the most common transformer diagnosis technique. In practice, the mapping between gas content and fault category is complex and uncertain, so building an accurate mathematical model for transformer fault diagnosis is a very difficult learning problem, one that a single classifier can hardly solve. As a multi-classifier method, the neural network ensemble has a natural advantage and great application prospects in transformer fault diagnosis, since it decomposes the learning task into subtasks handled by different individual networks. Diverse ensemble learning resolves the randomness and blindness of individual network generation and greatly reduces network redundancy by strengthening activeness in network design and cooperation in parallel learning, making it an effective neural network ensemble method for engineering applications.
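The ratio-based DGA diagnosis that the ensemble methods are compared against can be illustrated with a minimal sketch. The interval codes below follow the commonly published IEC/GB three-ratio scheme, and the fault table is deliberately partial; the exact tables used in the thesis are not reproduced here, so treat the thresholds and labels as assumptions for illustration.

```python
def ratio_code(r, bounds, codes):
    """Map a gas ratio onto its interval code: codes[k] for r < bounds[k]."""
    for b, c in zip(bounds, codes):
        if r < b:
            return c
    return codes[-1]

def three_ratio_diagnose(h2, ch4, c2h6, c2h4, c2h2):
    """Classic three-ratio DGA coding (gas concentrations, e.g. in ppm)."""
    r1 = ratio_code(c2h2 / c2h4, [0.1, 3.0], [0, 1, 2])   # C2H2 / C2H4
    r2 = ratio_code(ch4 / h2,    [0.1, 1.0], [1, 0, 2])   # CH4  / H2
    r3 = ratio_code(c2h4 / c2h6, [1.0, 3.0], [0, 1, 2])   # C2H4 / C2H6
    table = {  # partial code-to-fault mapping, for illustration only
        (0, 0, 0): "no fault",
        (0, 1, 0): "partial discharge",
        (1, 0, 2): "high-energy discharge (arcing)",
        (0, 2, 2): "thermal fault > 700 degC",
    }
    return table.get((r1, r2, r3), "code not in table / combined fault")
```

Codes that fall outside the table are exactly the ambiguous cases the abstract alludes to, which is one motivation for learning the gas-to-fault mapping with classifiers instead.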
In the traditional neural network ensemble method, individual networks are trained independently and without cooperation; owing to the randomness and blindness of their generation, it is difficult to ensure diversity among the networks. An ensemble learning algorithm is proposed here, derived from an analysis of the error function of neural network ensembles, in which individual networks are actively guided to learn diversely. By decomposing the ensemble error function, error-correlation terms are included in the learning criterion of each individual network, and cooperative training leads all the networks in the ensemble toward diverse learning. Based on this idea, the active diverse learning (ADL) method is applied to power transformer fault diagnosis based on dissolved gas analysis. The experimental results show that, under the same conditions, the method does not require highly accurate individual networks, yet it significantly improves the system's stability and generalization. The algorithm achieves higher accuracy than the IEC ratio method and a single BP network, and its performance is more stable than conventional ensemble methods such as Bagging and Boosting.
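A minimal sketch of such correlation-penalized cooperative training is given below. Linear members on synthetic data stand in for the thesis's BP networks, and the penalty weight `lam`, learning rate, and data are illustrative assumptions; the point is only the shape of the update, where each member's gradient contains a term pulling it away from the current ensemble mean.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (a stand-in for DGA feature vectors and fault targets).
X = rng.normal(size=(200, 5))
d = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + 0.1 * rng.normal(size=200)

M, lam, lr = 4, 0.5, 0.05                 # ensemble size, penalty weight, step
W = rng.normal(scale=0.1, size=(M, 5))    # one linear "network" per row

for epoch in range(400):
    F = X @ W.T                            # (n, M): individual outputs
    F_bar = F.mean(axis=1, keepdims=True)  # ensemble output
    # Correlation-penalized gradient: (F_i - d) - lam * (F_i - F_bar).
    # The second term is the diversity-inducing part; lam = 0 would give
    # M independent least-squares trainings.
    G = (F - d[:, None]) - lam * (F - F_bar)
    for i in range(M):
        W[i] -= lr * (G[:, i] @ X) / len(X)   # cooperative update per member

mse = np.mean(((X @ W.T).mean(axis=1) - d) ** 2)
```

Note that every member needs the current ensemble mean at every step, which is exactly the per-iteration communication cost that the output-corrected variant described next tries to reduce.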
A new diverse neural network ensemble method called output-corrected data (OCD) is proposed in this thesis. It differs from the ADL method in that, instead of modifying every component network's error function to incorporate error-correlation information, OCD modifies the training data itself, creating sets of output-corrected data that induce diverse learning when component networks are trained on them. The earlier ADL method can only assemble networks trained by back-propagation and demands prohibitively high communication bandwidth between component networks, which hinders parallel processing; moreover, every component network must be reprogrammed to include the error-correlation terms in its training objective, making third-party code difficult to reuse. These drawbacks significantly limit the practical application of ADL across different classifiers. Because it requires no recoding of the component networks, the new diverse learning method is simple to implement and can assemble heterogeneous networks, e.g., multilayer perceptrons and radial basis function networks. Another major advantage is that OCD significantly reduces the communication bandwidth among networks, making parallel processing more effective. The analysis confirms that OCD is an efficient neural network ensemble method for engineering applications.
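The target-correction idea can be sketched as follows. The corrected-target formula `d_corr = (d - lam * F_bar) / (1 - lam)` is one standard way to fold a negative-correlation penalty into the training data (cf. Chan and Kasabov's data-correction scheme); the linear members, bootstrap sampling, and the value of `lam` are illustrative assumptions, not details taken from the thesis. Each member now trains independently on plain MSE, and only the ensemble-mean vector is exchanged once per round.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy regression data (a stand-in for DGA features and fault targets).
X = rng.normal(size=(200, 5))
d = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + 0.1 * rng.normal(size=200)

M, lam = 4, 0.25          # ensemble size, diversity strength (kept < 0.5)

def fit_member(targets):
    """Train one member independently on a bootstrap sample (plain MSE)."""
    idx = rng.integers(0, len(X), len(X))
    return np.linalg.lstsq(X[idx], targets[idx], rcond=None)[0]

W = np.stack([fit_member(d) for _ in range(M)])   # round 0: ordinary bagging

for _ in range(5):
    F_bar = (X @ W.T).mean(axis=1)                # the ONLY shared quantity
    d_corr = (d - lam * F_bar) / (1.0 - lam)      # corrected output targets
    # Members retrain independently on the corrected data; no member sees
    # another member's parameters or per-step outputs.
    W = np.stack([fit_member(d_corr) for _ in range(M)])

mse = np.mean(((X @ W.T).mean(axis=1) - d) ** 2)
```

At the fixed point the average of the corrected-target fits recovers the original target (averaging F_i = (d - lam*F_bar)/(1 - lam) over members gives F_bar = d), so the ensemble converges to the same solution as the penalized loss while exchanging one vector per round instead of per gradient step.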
Starting from activeness and cooperation, and based on a negative-correlation diversity assessment, this thesis studies two approaches to generating the individual networks of an ensemble. Both were applied to fault diagnosis of power transformers and achieved good results.