Research on Mind Evolutionary Algorithm and Support Vector Machine Theory and Their Application to Coking Coal Blending Optimization
Abstract
The process of coking two or more coals blended in fixed proportions is called coal blending cokemaking. Because of the complexity of coal's composition, structure and properties, our understanding of coal and its coking process is limited. Existing models for predicting blended-coal quality and coke quality are tailored to the coals of a particular region or a particular coking enterprise, and are not generally applicable. For the coking plant studied in this thesis, these models cannot be used directly. The focus of this thesis is therefore to build the plant's own blended-coal caking index model and coke strength prediction model from the production data and blending experience accumulated over many years, and to apply them to the optimization of blending ratios. The main contents are as follows:
Estimating the blend's caking index G as the proportion-weighted sum of the single coals' G values gives large deviations; the caking index of a blend is not additive. In addition to the caking index G, this thesis introduces the volatile-matter index Vdaf of the single coals. The Vdaf of the coking coal and of the lean coal are each transformed nonlinearly by a Gaussian function; together with the additive G value of the blend, these form the three components of the blend-G prediction model. The seven parameters of this nonlinear model are fitted with the Mind Evolutionary Algorithm (MEA) to determine the regression model for the blend G. The model's prediction accuracy improves markedly, with relative error within ±6%.
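As a sketch, the three-component model described above can be written as a weighted additive G term plus two Gaussian corrections, with one weight and three parameters per Gaussian giving the seven fitted parameters. The function name and any parameter values supplied to it are illustrative assumptions, not the thesis's fitted model:

```python
import numpy as np

def predict_blend_g(proportions, g_values, vdaf_coking, vdaf_lean, params):
    """Blend caking index G as the sum of three components: a weighted
    additive G term plus Gaussian transforms of the coking-coal Vdaf
    and the lean-coal Vdaf.  `params` = (w, a1, b1, c1, a2, b2, c2)
    are the 7 parameters the thesis fits with MEA; any values passed
    here are illustrative only."""
    w, a1, b1, c1, a2, b2, c2 = params
    additive = np.dot(proportions, g_values)                 # sum of x_i * G_i
    gauss_coking = a1 * np.exp(-((vdaf_coking - b1) / c1) ** 2)
    gauss_lean = a2 * np.exp(-((vdaf_lean - b2) / c2) ** 2)
    return w * additive + gauss_coking + gauss_lean
```

With the Gaussian amplitudes set to zero the model reduces to the plain weighted additive estimate, which is exactly the baseline the thesis shows to be inadequate.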
Kernel principal component analysis (KPCA) is used to extract, at a cumulative contribution rate of 90%, two principal components from five blended-coal quality indices and three coal-preparation and coking process parameters. With these two components as inputs to a support vector machine, prediction models for the coke mechanical strengths M25 and M10 are built. Compared with ordinary principal component analysis (PCA) in both feature extraction and modelling, the KPCA-SVR coke strength models achieve clear dimensionality reduction, good generalization performance, and small prediction error, meeting the needs of actual production.
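The KPCA-to-SVR chain can be sketched with scikit-learn (an assumed tool, not one named by the thesis); the data here are random stand-ins for the 5 quality indices and 3 process parameters, and all hyperparameter values are illustrative:

```python
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 8))   # stand-ins for 5 quality + 3 process indicators
y = 2.0 * X[:, 0] + np.sin(X[:, 1]) + rng.normal(scale=0.1, size=60)  # stand-in for M25

model = make_pipeline(
    StandardScaler(),                                    # scale the raw indicators
    KernelPCA(n_components=2, kernel="rbf", gamma=0.1),  # keep 2 nonlinear components
    SVR(kernel="rbf", C=10.0, epsilon=0.1),              # regress M25 on them
)
model.fit(X, y)
m25_pred = model.predict(X)
```

Swapping `KernelPCA` for `PCA` in the same pipeline reproduces the comparison the thesis reports.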
In building the coke strength model with support vector regression (SVR), the training samples are screened: a granularity-based sample selection strategy for SVR is proposed. The training set is pretreated on different granularity levels in the kernel space. At the coarse level, abnormal samples (noise) are excluded; at the fine level, different reduction strategies are applied according to granule density. Within a suitable granularity range, a small sample subset represents the distribution of the whole training set, lowering the cost of learning while still yielding a good coke mechanical strength regression model and improving the efficiency of the SVR algorithm.
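A rough illustration of the two-level screening idea follows. The thesis works in the kernel space with its own granulation; the k-NN outlier score, the grid cells and all thresholds below are simplified stand-ins, not the thesis's method:

```python
import numpy as np

def granular_select(X, y, coarse_k=5, noise_quantile=0.95, cell=0.5):
    """Two-level sample screening (illustrative sketch).
    Coarse level: drop points whose mean distance to their k nearest
    neighbours falls in the top tail (treated as noise).
    Fine level: grid the remaining points and keep one representative
    per occupied cell, thinning dense regions."""
    # coarse: mean k-NN distance as an outlier score
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    knn = np.sort(d, axis=1)[:, 1:coarse_k + 1].mean(axis=1)
    keep = knn <= np.quantile(knn, noise_quantile)
    Xc, yc = X[keep], y[keep]
    # fine: one representative per occupied grid cell
    cells = np.floor(Xc / cell).astype(int)
    _, idx = np.unique(cells, axis=0, return_index=True)
    return Xc[idx], yc[idx]
```

The reduced subset is then what the SVR is trained on, which is where the efficiency gain comes from.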
In building the coke mechanical strength SVR model, the parameters of ε-SVR and of the kernel function are determined by an MEA-based optimization method. Compared with grid search, MEA reduces the search time substantially while giving similar optimization results.
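A compact MEA sketch in the spirit of this parameter search is given below; in practice the score function would be cross-validated SVR accuracy over (C, γ, ε). The simplified subgroup bookkeeping and all constants are assumptions, not the thesis's exact algorithm:

```python
import numpy as np

def mea_optimize(score, bounds, n_groups=5, group_size=10, iters=20, seed=0):
    """Minimal Mind Evolutionary Algorithm sketch for maximising `score`
    over box `bounds`: similartaxis = local search inside each subgroup,
    dissimilation = dispersing and re-seeding the weakest subgroup."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    centres = rng.uniform(lo, hi, size=(n_groups, lo.size))  # subgroup winners
    best, best_s = None, -np.inf
    for _ in range(iters):
        for g in range(n_groups):
            # similartaxis: sample the subgroup around its current winner
            cand = rng.normal(centres[g], 0.1 * (hi - lo),
                              size=(group_size, lo.size))
            cand = np.clip(cand, lo, hi)
            scores = np.array([score(c) for c in cand])
            if scores.max() > score(centres[g]):
                centres[g] = cand[scores.argmax()]
        cs = np.array([score(c) for c in centres])
        if cs.max() > best_s:
            best_s, best = cs.max(), centres[cs.argmax()].copy()
        # dissimilation: the weakest subgroup is dispersed and re-seeded
        centres[cs.argmin()] = rng.uniform(lo, hi)
    return best, best_s
```

Because each subgroup searches locally while dissimilation keeps injecting fresh regions, far fewer score evaluations are needed than an exhaustive grid over the same box.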
The final choice of blending ratio cannot be made without coking tests. This thesis designs a 20 kg iron-drum test in which the drum is placed inside an industrial coke oven and coked along with the charge, so that the coal in the drum is coked under the same conditions as in the industrial oven. The method requires little investment, is easy to operate, involves low labour intensity, and is flexible to adjust. Tests show that with a 23 mm gap between the coal cake and the drum wall, a bulk density of 1.2 t/m3 and a coking time of 24-48 hours, the mechanical strength of the drum coke correlates strongly with that of the industrial-oven coke; the two satisfy a one-variable linear regression equation, so the 20 kg drum test predicts the industrial oven's coke strength well.
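The drum-to-oven calibration is a one-variable linear regression, which can be sketched as follows; the strength values here are synthetic placeholders, not the thesis's measurements:

```python
import numpy as np

# Synthetic stand-in data (NOT the thesis's measurements): M25 of the
# 20 kg drum coke vs. the industrial-oven coke for the same blends.
drum_m25 = np.array([86.1, 87.4, 88.0, 88.9, 89.7, 90.5])
oven_m25 = np.array([87.0, 88.1, 88.9, 89.6, 90.3, 91.2])

# One-variable linear regression: oven = a * drum + b
a, b = np.polyfit(drum_m25, oven_m25, deg=1)
r = np.corrcoef(drum_m25, oven_m25)[0, 1]        # correlation strength

def predict_oven(m25_drum):
    """Estimate industrial-oven M25 from a 20 kg drum-test M25."""
    return a * m25_drum + b
```

A high correlation coefficient on real test data is what justifies reading the drum result as a proxy for the industrial oven.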
Considering both the cost of the blended coal and the revenue from coking products, an objective function for the blending-ratio optimization model is chosen that fully reflects maximization of the coking plant's profit. Given target coke quality and blend quality indices, the quality parameters and prices of the major coal classes available in the coal yard, and fixed coal-preparation and coking conditions, the MEA is used to optimize the blending ratio, with the blend caking index model and coke mechanical strength model built in this thesis embedded in the ratio calculation. MEA proves more efficient than the standard genetic algorithm (GA), and 20 kg iron-drum tests verify that the blending ratios it produces are sound and reasonable.
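The profit objective and quality constraints described above can be sketched as follows; the linear cost model, the coke yield, and all prices and thresholds are illustrative assumptions, and `g_pred`/`m25_pred` stand for the outputs of the prediction models built earlier:

```python
import numpy as np

def blend_objective(x, coal_prices, coke_price, coke_yield=0.75):
    """Profit per tonne of blended coal: coke revenue minus blend cost.
    x: blending proportions (a convex combination of coal classes)."""
    cost = np.dot(x, coal_prices)          # blend cost per tonne of coal
    revenue = coke_price * coke_yield      # coke revenue per tonne of coal
    return revenue - cost                  # quantity MEA maximises

def feasible(x, g_pred, m25_pred, g_min=75.0, m25_min=88.0):
    """Constraints: proportions form a convex combination, and the
    predicted blend G and coke M25 meet the given quality targets."""
    return (abs(x.sum() - 1.0) < 1e-6 and np.all(x >= 0)
            and g_pred >= g_min and m25_pred >= m25_min)
```

MEA (or GA) then searches over feasible `x` for the highest objective value, with the prediction models supplying `g_pred` and `m25_pred` for each candidate ratio.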
