Improving the Performance of a Portable Air-Quality-Monitoring Electronic Nose System Based on Neural Network Ensembles
Abstract
If any modernization process has successfully transformed human living conditions since its inception, it is surely industrialization. Like any such process, however, industrialization has also brought many negative effects. Environmental pollution, in particular indoor and outdoor air pollution, is one of them. Air quality, especially the quality of the indoor air in which we spend most of our time, can cause serious health problems. Consequently, there is growing interest in smart sensor systems for real-time air-quality monitoring. Electronic nose (E-nose) systems are regarded as a good alternative to existing techniques such as human expert panels and analytical methods based on gas chromatography or mass spectrometry. Moreover, E-nose systems can be built from low-cost, off-the-shelf metal oxide semiconductor gas sensors. Such sensors, however, are prone to drift and to interference from non-target gas analytes. Without drift compensation and interference removal or suppression by the pattern recognition algorithm, these characteristics can degrade E-nose performance. Robust signal processing algorithms that account for these factors are therefore among the most important parts of an E-nose system, and designing such algorithms is the main objective of this thesis.
     Although the concept of the E-nose system has been around for decades, many potential users still know little about it. The thesis therefore first reviews the history of E-nose systems, their key concepts and architecture (including the sampling and delivery system, sensor array, signal preprocessing, feature extraction, and pattern recognition), and their applications. Some currently available commercial E-nose systems are enumerated. Finally, current development directions and open problems of E-nose technology are pointed out.
     Calibrating an E-nose system requires initial data, obtained through experiments under controlled gas atmospheres. The thesis describes a self-made E-nose system together with the experimental setup and procedure used to generate the initial database. These data form the basis for data preprocessing and pattern recognition.
     Orthogonal signal correction (OSC) is a signal preprocessing method that has been successfully applied to E-nose systems. To investigate its effectiveness, the thesis studies OSC with two different multivariate regression models: the multilayer perceptron (MLP) and partial least squares (PLS). Experiments on six indoor air pollutants show that combining OSC with MLP is effective only under very strong background noise, whereas combining OSC with PLS works very well at any background-noise level. In cases requiring nonlinear recognition, however, MLP outperforms PLS.
     Artificial neural networks (ANNs) and support vector machines (SVMs) are pattern recognition algorithms widely used in E-nose systems. ANNs are based on empirical risk minimization, whereas SVMs are grounded in statistical learning theory and based on structural risk minimization. The thesis analyzes the basic principles of MLP-type ANNs and of SVMs in detail. Owing to its global search capability, a genetic algorithm is used to optimize the initial weights of the MLP and the hyper-parameters of the SVM, respectively. Experiments on a database of five indoor air pollutants show that, although both MLP and SVM models give satisfactory results, the SVM generalizes better, in line with the theoretical analysis. From the viewpoint of embedded applications, however, MLP models are computationally less complex than SVM models.
     There are many ways to improve the generalization performance of MLP neural networks, including regularization, cross-validation, training with added noise, and neural network ensembles; ensembles are the focus of this thesis. The success of ensemble methods can be explained through the bias-variance decomposition of error: ensembles can reduce variance and also bias. Ensemble learning refers to techniques that generate multiple base models with traditional machine learning methods and then combine them into a single ensemble model. In the generation stage, the goal is to create base models that are accurate yet differ in their predictions. This can be accomplished with three categories of methods: methods that modify the learning set (e.g., bagging, boosting), methods that modify the training algorithm (e.g., negative correlation learning), and selection-based methods (e.g., ambiguity-based methods, GASEN). In the combination stage, linear methods (e.g., simple averaging, weighted sums) and nonlinear methods (the median rule, "stacked generalization") are widely used. Another important alternative for creating a set of base models is the mixture of experts, but it lies outside the scope of this thesis.
     Owing to their good empirical results and theoretical support, bagging and boosting are the most widely used ensemble learning algorithms. Moreover, bagging has been found especially effective for unstable estimators such as support vector machines and artificial neural networks. The thesis proposes a new selection-based ensemble method that combines the variance inflation factor (VIF) as a diversity measure, a performance measure (the mean squared error or the mean absolute relative error of prediction), and a genetic algorithm to select an optimal number of base models from a pool of bagged neural networks. Experimental results show that the proposed method performs worse than comparable methods in only a few cases; moreover, it outperforms both the best base network and standard bagging. Further research on the rules governing VIF is expected to improve the method's performance significantly.
     Long- and short-term drift is one of the most serious problems facing gas sensors; without counteraction, it severely degrades E-nose performance. The thesis proposes a drift-tolerant ensemble method for online gas concentration estimation. Experimental results show that this method is not only effective but also attractive when compared with other ensemble methods.
If there is a modernization process that has successfully changed human living conditions since its inception, it is undoubtedly the industrialization process. However, like any process, industrialization has many negative aspects. Environmental pollution, more specifically outdoor as well as indoor air pollution, is one of these negative aspects. Consequently, there has been a resurgence of interest in the development of smart sensing systems for real-time air quality monitoring, especially in indoor environments, where we usually spend most of our time and from which serious health-related problems may emerge. Electronic nose (E-nose) systems are found to be a good alternative to existing techniques such as the use of human panels and analytical methods based on gas chromatography or mass spectrometry, to name a few. Moreover, these systems can be constructed using cost-effective off-the-shelf metal oxide semiconductor gas sensors. However, these sensors are prone to drift and interference from non-target gas analytes, which may jeopardize the performance of E-nose systems if drift counteraction and interference removal are not accounted for by the pattern recognition algorithm. Therefore, robust signal processing algorithms that consider these factors are of paramount importance in E-nose systems. Designing such signal processing algorithms is the main objective of this thesis work.
     Although E-nose systems have been around for decades, up to now many potential users do not know about these systems. Therefore, the history, key concepts, and architecture (including sampling and delivery systems, sensor array, signal preprocessing, feature extraction, and pattern recognition) of E-nose systems are first discussed, and then some of their applications are described. Furthermore, some available commercial E-nose systems are enumerated. Finally, current developments and problems associated with E-nose systems are pointed out.
     Calibration of E-nose systems requires some initial data sets that are mostly generated through several experiments under controlled atmospheric conditions. A self-made E-nose system is first introduced, and then the experimental setup and procedure used to generate the initial data sets are described. These data sets constitute a basis for data preprocessing and pattern recognition.
     Orthogonal signal correction (OSC) is a preprocessing technique that has been successfully applied in electronic nose systems. To investigate the effectiveness of OSC, an empirical study using two different multivariate regression methods, the multilayer perceptron (MLP) and partial least squares (PLS), was carried out. Experimental results on data sets of six indoor air pollutants show that the combination of OSC and MLP is effective only in the presence of very strong background noise, whereas the combination of OSC and PLS is very effective regardless of the level of background noise. However, the performance of MLP was better than that of PLS, which implies the need for nonlinear pattern recognition methods.
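As a concrete illustration of this pipeline, the sketch below implements a minimal single-component OSC step followed by a bare-bones PLS1 regression in NumPy. It is an illustrative sketch under simplifying assumptions (one OSC component, single response variable), not the thesis's implementation; the function names are hypothetical.

```python
import numpy as np

def osc_correct(X, y, n_components=1):
    """Simplified OSC: deflate from X its dominant variation after
    orthogonalizing that variation against y, so the removed part
    carries (ideally) no information about the target."""
    Xc = X - X.mean(axis=0)
    yc = (y - y.mean()).reshape(-1, 1)
    for _ in range(n_components):
        _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
        t = (Xc @ Vt[0]).reshape(-1, 1)          # leading score vector
        t = t - yc @ (yc.T @ t) / (yc.T @ yc)    # make score orthogonal to y
        p = Xc.T @ t / (t.T @ t)                 # corresponding loading
        Xc = Xc - t @ p.T                        # deflate X
    return Xc

def pls1_fit_predict(X, y, Xnew, n_components=2):
    """Bare-bones PLS1 regression (NIPALS-style deflation)."""
    Xm, ym = X.mean(axis=0), y.mean()
    Xc, yc = X - Xm, y - ym
    W, P, q = [], [], []
    for _ in range(n_components):
        w = Xc.T @ yc
        w /= np.linalg.norm(w)                   # weight vector
        t = Xc @ w
        tt = t @ t
        p = Xc.T @ t / tt                        # X loading
        qk = (yc @ t) / tt                       # y loading
        Xc = Xc - np.outer(t, p)                 # deflate X and y
        yc = yc - qk * t
        W.append(w); P.append(p); q.append(qk)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    B = W @ np.linalg.solve(P.T @ W, q)          # regression coefficients
    return (Xnew - Xm) @ B + ym
```

In the OSC-plus-PLS combination studied here, the corrected matrix returned by `osc_correct` simply replaces the raw sensor matrix in the regression step.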
     Artificial neural networks (ANNs) and support vector machines (SVMs) are pattern recognition methods widely used in E-nose systems. ANNs are based on empirical risk minimization, while SVMs are grounded in the framework of statistical learning theory, which is based on structural risk minimization. The basic principles of ANNs (with emphasis on the MLP) and SVMs are thoroughly discussed. Owing to its global search capability, a genetic algorithm was used to optimize the initial weights of the MLP and the hyper-parameters of the SVM, respectively. Experimental results on data sets of five indoor air pollutants show that, although both MLP and SVM models provide satisfactory results, the latter have better generalization performance, in line with the theoretical assumption. However, for embedded applications, MLP models involve less computational complexity than SVM models. This is the rationale for focusing on MLP models, provided their generalization performance can be improved.
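The genetic hyper-parameter search can be sketched as follows. To keep the sketch self-contained, a small RBF kernel ridge regression stands in for the SVM whose cross-validated error would normally serve as the fitness, and a basic truncation-selection GA with uniform crossover, Gaussian mutation, and elitism stands in for the thesis's GA; all names and parameter ranges below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

def rbf_kernel(A, B, gamma):
    """Gaussian (RBF) kernel matrix between row sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fitness(chrom, Xtr, ytr, Xva, yva):
    """Validation MSE of an RBF kernel ridge model; each chromosome
    encodes log10(gamma) and log10(lambda)."""
    gamma, lam = 10.0 ** chrom
    K = rbf_kernel(Xtr, Xtr, gamma)
    alpha = np.linalg.solve(K + lam * np.eye(len(Xtr)), ytr)
    pred = rbf_kernel(Xva, Xtr, gamma) @ alpha
    return np.mean((pred - yva) ** 2)

def ga_search(Xtr, ytr, Xva, yva, pop=16, gens=10):
    """Minimize the validation error over the two hyper-parameters."""
    P = rng.uniform([-3.0, -6.0], [2.0, 0.0], size=(pop, 2))
    for _ in range(gens):
        f = np.array([fitness(c, Xtr, ytr, Xva, yva) for c in P])
        elite = P[np.argsort(f)[: pop // 2]]               # truncation selection
        pa = elite[rng.integers(0, len(elite), pop)]
        pb = elite[rng.integers(0, len(elite), pop)]
        P = np.where(rng.random((pop, 2)) < 0.5, pa, pb)   # uniform crossover
        P += rng.normal(0.0, 0.2, P.shape)                 # Gaussian mutation
        P[0] = elite[0]                                    # elitism: keep the best
    f = np.array([fitness(c, Xtr, ytr, Xva, yva) for c in P])
    return P[np.argmin(f)]
```

The same loop applies unchanged to encoding MLP initial weights as chromosomes; only the fitness function changes.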
     There are many methods to improve the generalization performance of MLP neural networks, including regularization, cross-validation, training with jitter (noise), and ensemble methods. The latter is the focus of this thesis work. The success of ensemble methods can be explained by the bias-variance decomposition of error, which shows that an ensemble can reduce variance and also bias. Ensemble learning refers to techniques that generate multiple base models using traditional machine learning algorithms and combine them into an ensemble model. In the generation stage, the objective is to create base models that are sufficiently accurate yet diverse in their predictions. This can be done through three categories of methods: methods based on modification of the learning set (e.g., bagging, boosting), methods based on modification of the training algorithm (e.g., negative correlation learning), and methods based on selection (e.g., ambiguity-based methods, GASEN). In the combination stage, linear methods (e.g., simple averaging, weighted sum) and nonlinear methods (the median rule, "stacked generalization") are commonly used. Another important alternative for creating an ensemble of base models is the mixture of experts, which is beyond the scope of this thesis.
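The generation-plus-combination pipeline is simple to sketch for the bagging case. In the sketch below, ordinary least-squares base models stand in for the bagged MLPs (training real MLPs would obscure the resampling logic), and the combiner is the simple averaging mentioned above; the function name is hypothetical.

```python
import numpy as np

def bagged_predict(X, y, Xnew, n_models=25, seed=0):
    """Bagging: fit each base model on a bootstrap resample of the
    training set, then combine predictions by simple averaging.
    Least-squares base models stand in here for MLPs."""
    rng = np.random.default_rng(seed)
    n = len(X)
    Xb = np.column_stack([X, np.ones(n)])              # add a bias column
    Xn = np.column_stack([Xnew, np.ones(len(Xnew))])
    preds = []
    for _ in range(n_models):
        idx = rng.integers(0, n, size=n)               # sample with replacement
        w, *_ = np.linalg.lstsq(Xb[idx], y[idx], rcond=None)
        preds.append(Xn @ w)
    return np.mean(preds, axis=0)                      # simple-averaging combiner
```

Swapping the averaging line for `np.median(preds, axis=0)` gives the median-rule combiner from the nonlinear family.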
     Owing to their good empirical results and theoretical support, bagging and boosting are the most widely used ensemble learning algorithms. Moreover, bagging has been found to be more effective on unstable estimators (predictors) such as support vector machines and artificial neural networks, to name a few. A new selection-based ensemble method is proposed and discussed. The method combines the variance inflation factor (VIF) as a diversity metric, a performance measure (either the mean squared error or the mean absolute relative error of prediction), and a genetic algorithm to select an optimum number of base networks (models) from a pool of bagged neural networks. Results from two empirical studies show that the proposed method compares unfavorably with other similar methods in only a few cases. Moreover, this method performs better than the best base network and the standard bagging method. More research on the rules regarding VIF is expected to significantly improve the performance of this method.
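The VIF diversity metric can be computed from the base models' predictions alone. The sketch below follows the standard VIF definition, 1/(1 - R²), where R² comes from regressing one model's predictions on all the others; a large VIF flags a redundant (low-diversity) base model. This is an illustrative reading of the metric, not the thesis's full selection rule, which additionally couples VIF with a performance measure and a GA.

```python
import numpy as np

def vif_scores(P):
    """P: (n_samples, n_models) matrix of base-model predictions.
    Returns one VIF per model; VIF_i = 1 / (1 - R_i^2), with R_i^2 from
    regressing column i on the remaining columns (plus an intercept)."""
    n, m = P.shape
    vifs = np.empty(m)
    for i in range(m):
        yi = P[:, i]
        Xi = np.column_stack([np.delete(P, i, axis=1), np.ones(n)])
        coef, *_ = np.linalg.lstsq(Xi, yi, rcond=None)
        resid = yi - Xi @ coef
        r2 = 1.0 - (resid @ resid) / ((yi - yi.mean()) ** 2).sum()
        vifs[i] = 1.0 / max(1.0 - r2, 1e-12)           # guard against r2 == 1
    return vifs
```

Two near-identical base models both receive a very large VIF, while a genuinely diverse model stays close to 1, which is what makes the score usable as a pruning signal inside the GA.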
     Long- or short-term drift is one of the most serious problems associated with gas sensors. It can drastically affect the performance of electronic nose systems if no counteraction is performed. In the last empirical study, an ensemble method that can cope with the drift problem is proposed for online gas concentration estimation. Experimental results show that the method is not only effective but also attractive when compared with other ensemble methods.
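One generic way an ensemble can cope with drift during online estimation is to re-weight its base models by their recent errors, so that members degraded by sensor drift gradually fade out. The class below is a minimal sketch of that idea under an exponential-forgetting assumption; it illustrates the principle only and is not the specific algorithm proposed in the thesis.

```python
import numpy as np

class DriftWeightedEnsemble:
    """Combine base estimators with weights tied to recent accuracy:
    models whose responses drift away from the reference measurements
    are gradually down-weighted."""

    def __init__(self, models, forget=0.8):
        self.models = models                 # callables: sample -> estimate
        self.forget = forget                 # exponential forgetting factor
        self.err = np.zeros(len(models))     # discounted squared errors

    def predict(self, x):
        p = np.array([m(x) for m in self.models])
        w = 1.0 / (self.err + 1e-6)          # low recent error -> high weight
        return (w / w.sum()) @ p

    def update(self, x, y_true):
        """Call whenever a reference concentration becomes available."""
        p = np.array([m(x) for m in self.models])
        self.err = self.forget * self.err + (1 - self.forget) * (p - y_true) ** 2
```

The forgetting factor trades tracking speed against noise sensitivity: values near 1 react slowly to drift but smooth out measurement noise.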
引文
[1] John C. Leffingwel. Olfaction [EB/OL]. Leffingwell Reports, Vol.2(No.1), May,2002.
    [2] B. S. Austin, S. M. Greenfield, B. R. Weir, G. E. Anderson, and J. V. Behar. Modeling theindoor environment [J]. Environ. Sci. Technol., vol.26,(1992), p.851–858.
    [3] S. K. Brown et al. Concentrations of Volatile Organic Compounds in Indoor Air-A Review [J].Indoor Air, Vol.4, No.2,(1994), p.123-134.
    [4] World Health Organization, Indoor Air Pollutants: Exposure and Health Effects [EB/OL].EURO Reports and Studies78,(1983).
    [5] E. P. Horvath. Building-related illness and sick building syndrome: from the specific to thevague [J].Cleveland Clin. J. Med.64(1997).
    [6] J. Nicolas, Surveillance de l’environnement: Methodologie [M]. Edition AcademiaBruylant-coll. Pedasup–2nd edn,1998-ISBN2-87209-352-4,1998.
    [7] Wilfred Bourgeois et al. The use of sensor arrays for environmental monitoring: interests andlimitations [J]. J. Environ. Monit.5,(2003), p.852–860.
    [8] Van Harreveld, A. P.; Heeres, P.; Harssema, H. A review of20years of standardization of odorconcentration measurement by dynamic olfactometry in Europe [J]. Journal of the Air&WasteManagement Association49(6):705–715,1999.
    [9] P. Gostelow, S. A. Parsons and R. M. Stuetz. Odour measurements for sewage treatmentworks [J]. Water Res.35(3),579–597,2001.
    [10] G. Preti, T. S. Gittleman, P. B. Staudte and P. Luitweiler. Letting the nose lead the way:Malodorous components in drinking water [J]. Anal. Chem.,1993,65,699A–702A.
    [11] D. D. Lee and D. S. Lee. Environmental gas sensors [J]. IEEE Sens. J.,2001,1(3),214–224.
    [12] D. M. Wilson, S. Hoyt, J. Janata, K. Booksh and L. Obando. Chemical sensors for portable,handheld field instruments [J]. IEEE Sens. J.,2001,1(4),256–274.
    [13] U. Wolff, F. L. Dickert, G. K. Fischerauer, W. Greibl and C. C. W. Ruppel. SAW sensors forharsh environments [J]. IEEE Sens. J.,2001,1(1),4–13.
    [14] A. Hierlemann, U. Weimar, G. Kraus, M. Schweizer-Berberich and W. Gopel. Polymer-basedsensor arrays and multicomponent analysis for the detection of hazardous organic vapours inthe environment [J]. Sens. Actuators, B,1995,26–27,126–134.
    [15] H. Nagle, R. Gutierrez-Osuna, and S. Schiffman. The how and why of electronicnoses[J].IEEE Spectrum, vol.35, no.9, pp.22–31,1998.
    [16] M. Stuiver, Biophysics of the sense of smell [D]. PhD thesis, University of Groningen,Netherlands,1958.
    [17] H. DeVries and M. Stuiver. The absolute sensitivity of the human sense of smell [J]. SensoryCommunication, vol.10, no.9, pp.159–167,1961.
    [18] G. Ohloff. Chemistry of odor stimuli [J]. Ex-Perienta,1986.
    [19] H. Zwaardemaker and F. Hogewind. On spray electricity for and waterfall-electricity [A]. Proc.Acad. Sci. Amst., vol.22, pp.429–437,1920.
    [20] Hartman, J.D. A possible method for the rapid estimation of flavors in vegetables [A]. Proc.Amer. Soc. Hort. Sci.1954,64,335-342.
    [21] Rafael Castro, Mrinal Kr. Mandal, Peter Ajemba, and Mujtaba A. Istihad. An electronic nosefor multimedia applications [J]. IEEE Transactions on Consumer Electronics, Vol.49, No.4,2003, pp.1431-1437.
    [22] Persaud, K.C.; Dodd, G. Analysis of discrimination mechanisms in the mammalian olfactorysystem using a model nose [J]. Nature1982,299,352-355.
    [23] Ikegami, A.; Kaneyasu, M. Olfactory detection using integrated sensors[C]. Proceedings of the3rd international conference on solid-state sensors and actuators, New York, NY, USA,1985;pp.136-139.
    [24] Gardner, J.W.; Bartlett, P.N. A brief history of electronic noses [J]. Sens. Actuat. B: Chem.1994,18,211-220.
    [25] M. Trincavelli, S. Coradeschi, and A. Loutfi. Odour classification system for continuousmonitoring applications [J]. Sensors and Actuators B: Chemical, vol.139, no.2, pp.265–273,2009.
    [26] Harper, W.J. Strengths and weaknesses of the electronic nose [M]. Kluwer Academic/PleniumPublishers, New York pp59-71,2001.
    [27] J.W. Gardner, P.N. Bartelet. Electronic nose: Principles and Applications, vol.1–4[M]. OxfordUniversity Press, Oxford,1999.
    [28] R. Gutierrez-Osuna, H. T. Nagle, and S. Schiffman, Transient response analysis of anelectronic nose using multi-exponential models [J]. Sens. Actuators B, vol.61, pp.170–182,1999.
    [29] Gardner J W, Hines E L and Tang H C, Detection of vapours&odours from a multisensorarray using pattern recognition techniques. Part2. Artificial neural networks [J]. Sens.Actuators B, vol.9, pp.9–15,1992.
    [30] J.W. Gardner, Detection of Vapours and Odours from a Multisensor Array Using PatternRecognition: Part1: Principal Component and Cluster Analysis [J]. Sensors and Actuators B,vol.4, pp.109-115,1991.
    [31] Schiffman S.S Nagle H.T Gardner J.W. Pearce. T.C. Handbook of Machine Olfaction-Electronic Nose Technology [M]. Wiley-VCH,2003.
    [32] A. Szczurek, M. Maciejewska, B. Flisowska-Wiercik, L. Bodzoj. The stop-flow mode ofoperation applied to a single chemiresistor [J]. Sensors and Actuators B, vol.148pp.522–530,2010.
    [33] Radu Ionescu and Eduard Llobet. Wavelet transform-based fast feature extraction fromtemperature modulated semiconductor gas sensors [J]. Sensors and Actuators B: Chemical,81(2-3):289–295,2002.
    [34] Satoshi Nakata, Sumiko Akakabe, Mie Nakasuji, and Kenichi Yoshikawa. Gas sensing basedon a nonlinear response: Discrimination between hydrocarbons and quantification ofindividual components in a gas mixture [J]. Analytical Chemistry,68(13):2067–2072,1996.
    [35] Marco Trincavelli and Silvia Coradeschi and Amy Loutfi. Odour classification system forcontinuous monitoring applications [J]. Sensors and Actuators B: Chemical,139(2):265–273,2009.
    [36] Zou Xiao-bo, Zhao Jiewen, Wu Shou-yi. The study of gas sensor array signal processing withnew genetic algorithms [J]. Sensors and Actuators B87:437–441,2002.
    [37] R. Gutierrez-Osuna. Pattern analysis for machine olfaction: a review [J]. Sensors Journal,IEEE,2(3):189–202,2002.
    [38] J. Lozano, J. P. Santos, T. Arroyo, M. Aznar, J. M. Cabellos, M. Gil, M. del Carmen Horrillo.Correlating e-nose responses to wine sensorial descriptors and gas chromatography–massspectrometry profiles using partial least squares regression analysis [J]. Sensors and ActuatorsB127(2007)267–276.
    [39] Jae Ho Sohn, Michael Atzeni, Les Zeller, Giovanni Pioggia. Characterisation of humiditydependence of a metal oxide semiconductor sensor array using partial least squares [J].Sensors and Actuators B131(2008)230–235.
    [40] G. Huyberechts. Simultaneous quantification of carbon monoxide and methane in humid airusing a sensor array and an artificial neural networks [J]. Sens. Actuators B45(1997)123–130.
    [41] B. Yea, T. Osaki, K. Sugahara, The concentration-estimation of inflammable gases with asemiconductor gas sensor utilizing neural networks and fuzzy inference [J]. Sens. Actuators B41(1997)121–129.
    [42] Z. Haoxian, M.O. Balaban, J.C. Principe, Improving pattern recognition of electronic nosedata with time-delay neural networks [J]. Sens. Actuators B: Chem.96(1–2)(2003)385–389.
    [43] A.K. Srivastava. Detection of volatile organic compounds (VOCs) using SnO2gas-sensorarray and artificial neural network [J]. Sens. Actuators B: Chem.96(1–2)(2003)24–37.
    [44] D. Gao, et al. Simultaneous estimation of classes and concentrations of odors by an electronicnose using combinative and modular multilayer perceptrons [J]. Sensors and Actuators B107(2005)773–781.
    [45] G. Daqi, C. Wei. Simultaneous estimation of odor classes and concentrations using anelectronic nose with function approximation model ensembles [J]. Sens. Actuators, B Chem.120(2007)584–594.
    [46] L. Zhang et al. Gases concentration estimation using heuristics and bio-inspired optimizationmodels for experimental chemical electronic nose [J]. Sens. Actuators, B Chem.160(2011)760–770.
    [47] Gao Daqi et al. Performance evaluation of multilayer perceptrons for discriminating andquantifying multiple kinds of odors with an electronic nose [J]. Neural Networks33(2012)204–215.
    [48] Gao Daqi et al., Quantitative analysis of multiple kinds of volatile organic compounds usinghierarchical models with an electronic nose [J]. Sensors and Actuators B161(2012)578–586.
    [49] Alphus D. Wilson, Manuela Baietto. Applications and Advances in Electronic-NoseTechnologies [J]. Sensors9,(2009)5099-5148.
    [50] Xiu-Ying Tian, et al. Rapid Classification of Hairtail Fish and Pork Freshness Using anElectronic Nose Based on the PCA Method [J]. Sensors2012,12,260-277.
    [51] Manuela O’Connell, et al. A practical approach for fish freshness determinations using aportable electronic nose [J]. Sensors and Actuators B80(2001)149–154.
    [52] ólalfsdóttir et al. Application of an Electronic Nose To Predict Total Volatile Bases in Capelin(Mallotus villosus) for Fishmeal Production [J]. J. Agric. Food Chem.2000,48,2353-2359.
    [53] Berdagué, J.L.; Talou, T. Examples of applications for meat products of semiconductor gassensors [J]. Sci. Alim.1993,13,141-148.
    [54] Vestergaard JS, Martens M, Turkki P. Application of an electronic nose system for predictionof sensory quality changes of a meat product (pizza topping) during storage [J]. Food Scienceand Technology,2007,40(6):1095-1101.
    [55] Vernat-Rossi, V.; Vernat, G.; Berdagué, J.L. Discrimination of agroalimentary products by gassensors with semiconductors functioning with ambient air of the laboratory. Variousapproaches of signal treatment [J]. Analysis1996,24,309-315.
    [56] Lakshmi P. Pathange, Parameswarakumar Mallikarjunan, Richard P. Marini, Sean O., andDavid Vaughan. Non-destructive evaluation of apple maturity using an electronic nose system[J]. Journal of Food Engineering,77(4)2006,1018–1023.
    [57] Echeverria, G.; Graell, J.; Lopez, M.L.; Brezmes, J.; Correig, X. Volatile production in "Fuji"apples stored under different atmospheres measured by headspace/gas chromatography andelectronic nose [J]. Acta Hort.2005,682,1465-1470.
    [58] Costa, G.; Noferini, M.; Montefiori, M.; Brigati, S. Non-destructive assessment methods ofkiwifruit quality [J]. Acta. Hort.2003,610,179-189.
    [59] P. E Keller. Electronic noses and their applications [C]. IEEE Technical ApplicationsConference and Workshops Northcon95, page116,1995.
    [60] Marco Trincavelli, Silvia Coradeschi, Amy Loutfi, Bo S derqu st, and Per Thunberg, Directidentification of bacteria in blood culture samples using an electronic nose. BiomedicalEngineering [J]. IEEE Transactions on,57(12):2884–2890,2010.
    [61] T Dewettinck, K Van Hege, and W Verstraete. The electronic nose as a rapid sensor forvolatile compounds in treated domestic wastewater [J]. Water Research,35(10):2475–2483,2001.
    [62] R.E Baby and M Cabezas and E.N Wals e de Reca. Electronic nose: a useful tool formonitoring environmental contamination [J]. Sensors and Actuators B: Chemical,69(3)214–218,2000.
    [63] Edward J. Staples, The First Quantitatively Validated Electronic Nose for EnvironmentalTesting of Air, Water, and Soil [J]. ACS National, March26-30,2000.
    [64] S. Zampolli et al., An electronic nose based on solid state sensor arrays for low-cost indoor airquality monitoring applications [J]. Sensors and Actuators B101(2004)39–46.
    [65] Sironi, S.; Capelli, L.; Centola, P.; Del Rosso, R.; Grande, M., II. Continuous monitoring ofodours from a composting plant using electronic noses [J]. Waste Manag.2007,27,389–397.
    [66] Micone, P.G.; Guy, C. Odour quantification by a sensor array: An application to landfill gasodours from two different municipal waste treatment works [J]. Sens. Actuators B Chem.2007,120,628–637.
    [67] Licinia Dentoni, Laura Capelli, Selena Sironi, Renato Del Rosso, Sonia Zanetti and MatteoDella Torre, Development of an Electronic Nose for Environmental Odour Monitoring[J].Sensors2012,12,14363-14381.
    [68] Frank R ck, Nicolae Barsan, and Udo Weimar, Electronic Nose: Current Status and FutureTrends [J].Chem. Rev.2008,108,705-725.
    [69] E. Comini, G. Faglia, and G. Sberveglieri, Solid State Gas Sensing [M]. Springer, New York,NY, USA,2008.
    [70] M. Padilla, A. Perera, I. Montoliu, A. Chaudry, K. Persaud, and S.Marco. Drift compensationof gas sensor array data by Orthogonal Signal Correction [J]. Chemometrics and IntelligentLaboratory Systems, vol.100, no.1, pp.28–35,2010.
    [71] S. Di Carlo, M. Falasconi, E. Sanchez, A. Scionti, G. Squillero, and A. Tonda. Increasingpattern recognition accuracy for chemical sensing by evolutionary based drift compensation[J]. Pattern Recognition Letters, vol.32, no.13, pp.1594–1603,2011.
    [72] O. Tomic, T. Ekl v, K. Kvaal, and J. E. Haugen. Recalibration of a gas-sensor array systemrelated to sensor replacement [J]. Analytica Chimica Acta, vol.512, no.2, pp.199–206,2004.
    [73] Cometto-Mu iz JE, Cain WS, Abraham MH. Nasal pungency and odor of homologousaldehydes and carboxylic acids [J]. Exp Brain Res.;118:180–188,1998.
    [74] Dionne B.C., Rounbehler D.P., Achter E.K., Hobbs J.R., Fine D.H., Vapor pressure ofexplosives [J]. J. Energ. Mater.,4,447–472,1986.
    [75] Buser, H.-R. J. Identification of2,4,6-trichloroanisol as a potent compound causing cork taintin wine [J]. J Agric. Food Chem.30:359-362,1982.
    [76] Rapp A. Volatile flavor of wine; Correlation between instrumental analysis and sensoryperception [J]. J.Nahrung,42(6):351-363,1998.
    [77] C. Zanchettin, T.B. Ludermir. Wavelet filter for noise reduction and signal compression in anartificial nose [J]. Applied Soft Computing7(2007)246–256.
    [78] Silvano Dragonieri, et al. An electronic nose in the discrimination of patients with non-smallcell lung cancer and COPD [J]. Lung Cancer64,166–170,2009.
    [79] M. Zuppa et al. Drift counteraction with multiple self-organising maps for an electronic nose[J]. Sensors and Actuators B98(2004)305–317.
    [80] Yong Yin, Huichun Yu, Hongshun Zhang. A feature extraction method based on waveletpacket analysis for discrimination of Chinese vinegars using a gas sensors array [J]. Sensorsand Actuators B134(2008)1005–1009.
    [81] E. Llobet, J. Brezmes, R. Ionescu, X. Vilanova, S. Al-Khalifa, J.W. Gardner, N. Barsan, X.Correig. Wavelet transform and fuzzy ARTMAP based pattern recognition for fast gasidentification using a micro-hotplate gas sensor [J]. Sensors and Actuators B,83,(2002)238-244.
    [82] C. Distante et al. On the study of feature extraction methods for an electronic nose [J]. Sensorsand Actuators B87(2002)274–288.
    [83] Wold, S., Antti, H., Lindgren, F., Ohman, J.,1998. Orthogonal signal correction ofnear-infrared spectra [J]. Chemometrics and Intelligent Laboratory Systems,44:175-185.
    [84] Sj blom, J., Svensson, O., Josefson, M., Kullberg, H., Wold, S.,1998. An evaluation oforthogonal signal correction applied to calibration transfer of near infrared spectra [J].Chemometrics and Intelligent Laboratory Systems,44:229-244.
    [85] QU Hai-bin, OU Dan-lin, CHENG Yi-yu. Background correction in near-infrared spectra ofplant extracts by orthogonal signal correction [J]. J. Zhejiang Univ. SCI20056B(8):838-843.
    [86] FEARN, T., On orthogonal signal correction. Chemometrics and Intelligent LaboratorySystems,50,47-52,2000.
    [87] FEUDALE, R. N., TAN, H.&BROWN, S. D. Piecewise orthogonal signal correction.[J].Chemometrics and Intelligent Laboratory Systems,63,129-138,2002.
    [88] WESTERHUIS, J. A., DE JONG, S.&SMILDE, A. K., Direct orthogonal signal correction.[J]. Chemometrics and Intelligent Laboratory Systems,56,13-25,2001.
    [89] K. Hornik, M. Stincombe, and H. White. Universal Approximation of an Unknown Mappingand its Derivatives using Multilayer Feedforward Networks [J]. Neural Networks3,pp.211-223,1990.
    [90] W. S. McCulloch and W. Pitts. A logical calculus of the ideas immanent in nervous activity [J].Bulletin of Mathematical Biophysics,5:115-133,1943.
    [91] F. Rosenblatt. The perceptron: A probabilistic model for information storage and organisationin the brain [J]. Psychological Review,65:386-408,1958.
    [92] M. L. Minsky and S. A. Papert. Perceptrons [J]. MIT Press, Cambridge, MA, US,1969.
    [93] S. Grossberg. Adaptive pattern classification and universal recording:1. Parallel developmentand coding of neural detectors [J]. Biological Cybernetics,23:121-134,1976.
    [94] S. Grossberg. Adaptive pattern classification and universal recording:2. feedback, expectation,olfaction, illusions [J]. Biological Cybernetics,23:187-202,1976.
    [95] T. Kohonen. Self-organised formation of topologically correct feature maps [J]. BiologicalCybernetics,43:59-69,1982.
    [96] T. Kohonen. Clustering, taxonomy, and topological maps of patterns [C]. In Proceedings ofthe6th International Conference on Pattern Recognition, pages114-128, Munich, Germany,1982.
    [97] D. E. Rumelhart, G. E. Hinton, and R. J. Williams. Learning internal representations by errorpropagation [M]. Parallel Distributed Processing: Explorations in the Microstructure ofCognition, volume1, pages318-362. MIT Press, Cambridge, MA, US,1986.
    [98] Broomhead D.S. and Lowe D. Multivariable Functional Interpolation and Adaptive Networks[J]. Complex Systems, v.2. pp.312-355,1988.
    [99] T. Nakamoto, K. Fukunishi, and T. Morizumi. Identification capability of odor sensor usingquartz-resonator array and neural network pattern recognition [J]. Sensors and Actuators, B1,pages473-476,1990.
    [100] H. Sundgren, F. Winquist, I. Lukkari, and I. Lundstr m. Artificial neural networks and gassensor arrays: quantification of individual components in a gas mixture [J]. MeasurementScience and Technology,2:464-469,1991.
    [101] Basheer and Hajmeer. Artificial neural networks: fundamentals, computing, design, andapplication [J]. J MICROBIOL METH, vol.43, no.1, pp.3-31,2000.
    [102] Haykin, S., Neural Networks: A Comprehensive Foundation [M].2nd ed., New York: PrenticeHall,1998.
    [103] Holland, J. H. Adaptation in Natural and Artificial Systems: An Introductory Analysis withApplications to Biology, Control, and Artificial Intelligence [M]. Ann Arbor, MI: University ofMichigan Press,1975.
    [104] Goldberg, D. E.: Genetic Algorithms in Search, Optimization, and Machine Learning [M].Addison-Wesley Pub. Co.,1989.
    [105] Davis, L. Applying algorithms to epistatic domains[C]. In: Proc. Int. Joint Conf. on ArtificalIntelligence, pp.162–164,1985.
    [106] Goldberg, D. E. and Lingle, R. Alleles, loci, and the TSP[C]. In: Proc.1st Int. Conf. onGenetic Algorithms, pp.154–159,1985.
    [107] Oliver, J. M., Smith, D. J. and Holland, J. R. C. A study of permutation crossover operators onthe travelling salesman problem [C]. In: Proc.2nd Int. Conf. on Genetic Algorithms, pp.224–230,1987.
    [108] Hiroaki Kitano. Empirical Studies on the Speed of Convergence of Neural Network Trainingusing Genetic Algorithms [C]. In: Eighth National Conference on Artificial Intelligence, Vol.II, pp789-795, AAAI, MIT Press,1990.
    [109] U. Seiffert. Multiple layer perceptron training using genetic algorithms[C]. In Proceedings ofthe European Symposium on Artificial Neural Networks, ESANN’2001, pages159–164,2001.
    [110] D. Montana and L. Davis. Training Feedforward Neural Networks Using GeneticAlgorithms[C]. Proceedings of the International Joint Conference on Artificial Intelligence,1989.
    [111] H.Sudarsana Rao, Vaishali G. Ghorpade, A. Mukherjee. A genetic algorithm based backpropagation network for simulation of stress–strain response of ceramic-matrix-composites [J].Computers and Structures84(2006)330–339.
    [112] Yu-Tzu Chang et al. Optimization the Initial Weights of Artificial Neural Networks viaGenetic Algorithm Applied to Hip Bone Fracture Prediction [J]. Hindawi PublishingCorporation Advances in Fuzzy Systems, Volume2012(2012).
    [113] Subhra Rani Patra, R. Jehadeesan, S. Rajeswari, S. A.V. Satya Murty, M. Sai Baba.Development of Genetic Algorithm based Neural Network model for parameter estimation ofFast Breeder Reactor Subsystem [J]. International Journal of Soft Computing and Engineering(IJSCE) ISSN:2231-2307, Volume-2, Issue-4, September2012.
    [114] V. Vapnik. The Nature of Statistical Learning Theory [M]. Springer-Verlag,1995.
    [115] V. Vapnik, Statistical Learning Theory [M]. Wiley, New York,1998.
    [116] Vapnik and A. Tscherwonenkis. The necessary and sufficient conditions for consistency of themethod of empirical risk minimization [J]. Pattern Recognition and Image Analysis1:284-305,1991.
    [117] O. Gualdron,, J. Brezmes, E. Llobet, A. Amari, X. Vilanova, B. Bouchikhi, and X. Correig.Variable selection for support vector machine based multisensor systems [J]. Sensors andActuators B,122:259-268,2007.
    [118] Marla L. Frank, Matthew D. Fulkerson, Bruce R. Patton, and Prabir K. Dutta. TiO2-basedsensor arrays modeled with nonlinear regression analysis for simultaneously determining COand O2concentrations at high temperatures [J]. Sensors and Actuators B,87:471-479,2002.
    [119] O. Chapelle and V. Vapnik. Model selection for support vector machines [J].Advances inNeural Information Processing Systems12. Cambridge, Mass: MIT Press,2000.
    [120] O. Chapelle, V. Vapnik and Y. Bengio. Model Selection for Small Sample Regression [J].Machine Learning,48(1):9-13, Jul.2001.
    [121] O. Chapelle, V. Vapnik, O. Bousquet, and S. Mukherjee. Choosing multiple parameters forsupport vector machines [J]. Machine Learning,2002.
    [122] H. Fr hlich, O. Chapelle, B. Sch lkopf. Feature Selection for Support Vector Machines UsingGenetic Algorithms [J]. International Journal on Artificial Intelligence Tools13(4):791-800,2004.
    [123] J. A. K. Suykens and J. Vandewalle. Least squares support vector machine classifiers [J].Neural Processing Letters, Vol.9(3):293-300,1999.
    [124] J. A. K. Suykens. Nonlinear Modeling and Support Vector Machines [C].IEEEInstrumentation and Measurement Technology Conference, Budapest, Hungary.2001.
    [125] J. A. K. Suykens, V. T. Gestel, J. De Brabanter, B. De Moor, J. Vandewalle.[M]“LeastSquares Support Vector Machines”, World Scientific,2002.
    [126] J. A. K. Suykens, L. Lukas, and J. Vandewalle. Sparse least squares support vector machineclassifiers [C]. ESANN'2000European Symposium on Artificial Neural Networks, pp.37–42.2000.
    [127] J. A. K. Suykens, L. Lukas, and J. Vandewalle. Sparse approximation using least squaressupport vector machines [C]. IEEE International Symposium on Circuits and SystemsISCAS'2000,2000.
    [128] L. Hoegaerts, J.A.K. Suykens, J. Vandewalle and B. De Moor. A Comparison of PruningAlgorithms for Sparse Least Squares Support Vector Machines [C]. ICONIP2004:pp.1247-1253.
    [129] Mardia, K., Kent, J., and Bibby, J. Multivariate Analysis [M]. Academic Press, London, 1979.
    [130] Helland, I. On the structure of partial least squares regression [J]. Commun. Statist. Simul., 17: 581-607, 1988.
    [131] Garthwaite, P. H. An interpretation of partial least squares [J]. JASA, 89(425): 122-127, 1994.
    [132] Geladi, P. and Kowalski, B. Partial least squares regression: A tutorial [J]. Analyt. Chim. Acta, 185: 1-17, 1986.
    [133] Frank, I. and Friedman, J. A statistical view of some chemometrics regression tools [J]. Technometrics, 35(2): 109-148, 1993.
    [134] Brooks, R. and Stone, M. Joint continuum regression for multiple predictands [J]. JASA, 89(428): 1374-1377, 1994.
    [135] Breiman, L. and Friedman, J. Predicting multivariate responses in multiple linear regression [J]. J. R. Statist. Soc. B, 59(1): 3-54, 1997.
    [136] Breiman, L. Bagging predictors [J]. Machine Learning, 24(2): 123-140, 1996a.
    [137] Džeroski, S. and Ženko, B. Is combining classifiers with stacking better than selecting the best one? [J]. Machine Learning, 54(3): 255-273, 2004.
    [138] Chandra, A. and Yao, X. Evolving hybrid ensembles of learning machines for better generalization [J]. Neurocomputing, 69(7-9): 686-700, 2006.
    [139] Chawla, N. V., Hall, L. O., Bowyer, K. W. and Kegelmeyer, W. P. Learning ensembles from bites: A scalable and accurate approach [J]. Journal of Machine Learning Research, 5: 421-451, 2004.
    [140] Kuncheva, L. I. and Whitaker, C. J. Measures of diversity in classifier ensembles and their relationship with the ensemble accuracy [J]. Machine Learning, 51(2): 181-207, 2003.
    [141] Valentini, G. and Dietterich, T. G. Bias-variance analysis of support vector machines for the development of SVM-based ensemble methods [J]. Journal of Machine Learning Research, 5: 725-775, 2004.
    [142] W. G. Baxt. Improving the accuracy of an artificial neural network using multiple differently trained networks [J]. Neural Computation, 4: 772-780, 1992.
    [143] A. Krogh, J. Vedelsby. Neural network ensembles, cross validation, and active learning [C]. In: Proc. NIPS, 1994, pp. 231-238.
    [144] P. W. Munro, B. Parmanto. Competition among networks improves committee performance [C]. In: Proc. NIPS, 1996, pp. 592-598.
    [145] R. J. Mammone. Artificial Neural Networks for Speech and Vision [M]. Chapman & Hall, New York, 1993, pp. 126-142.
    [146] Hansen, L. K. and Salamon, P. Neural network ensembles [J]. IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 12, no. 10, pp. 993-1001, 1990.
    [147] S. Geman, E. Bienenstock, and R. Doursat. Neural networks and the bias-variance dilemma [J]. Neural Computation, 4(1): 1-58, 1992.
    [148] L. Breiman. Bias, variance and arcing classifiers [EB/OL]. Technical Report TR460, Statistics Department, University of California, Berkeley, CA, 1996b.
    [149] E. Kong and T. G. Dietterich. Error-correcting output coding corrects bias and variance [C]. In: The 12th International Conference on Machine Learning, pages 313-321, San Francisco, CA, 1995. Morgan Kaufmann.
    [150] R. E. Schapire, Y. Freund, P. Bartlett, and W. Lee. Boosting the margin: A new explanation for the effectiveness of voting methods [J]. The Annals of Statistics, 26(5): 1651-1686, 1998.
    [151] P. Domingos. A Unified Bias-Variance Decomposition for Zero-One and Squared Loss [C]. In: Proceedings of the Seventeenth National Conference on Artificial Intelligence, pages 564-569, Austin, TX, 2000c. AAAI Press.
    [152] Hastie, Tibshirani, Friedman. The Elements of Statistical Learning [M]. Springer, 2001.
    [153] P. S. de Laplace. Deuxième supplément à la théorie analytique des probabilités [M]. 1818. Reprinted (1847) in Oeuvres Complètes de Laplace, vol. 7 (Paris: Gauthier-Villars), 531-580.
    [154] Dietterich, T. G. Machine learning research: Four current directions [J]. AI Magazine, 18(4): 97-136, 1997.
    [155] Niall Rooney, David Patterson, Sarab Anand, and Alexey Tsymbal. Dynamic integration of regression models [C]. In: International Workshop on Multiple Classifier Systems, 2004, LNCS vol. 3181, pp. 164-173, Springer.
    [156] Zhi-Hua Zhou, Jianxin Wu, and Wei Tang. Ensembling neural networks: many could be better than all [J]. Artificial Intelligence, vol. 137, pp. 239-263, 2002.
    [157] Gonzalo Martínez-Muñoz and Alberto Suárez. Pruning in ordered bagging ensembles [C]. In: International Conference on Machine Learning, 2006, pp. 609-616.
    [158] David W. Opitz, Jude W. Shavlik. Generating Accurate and Diverse Members of a Neural-Network Ensemble [J]. Advances in Neural Information Processing Systems 8, pp. 535-541, 1995.
    [159] João Mendes-Moreira, Carlos Soares, Alípio Mário Jorge, Jorge Freire de Sousa. Ensemble approaches for regression: A survey [J]. ACM Comput. Surv., 45(1): Article 10, 2012.
    [160] Ludmila I. Kuncheva. Switching between selection and fusion in combining classifiers: an experiment [J]. IEEE Transactions on Systems, Man, and Cybernetics-Part B, vol. 32, no. 2, pp. 146-156, 2002.
    [161] Leo Breiman. Using iterated bagging to debias regressions [J]. Machine Learning, vol. 45, no. 3, pp. 261-277, 2001.
    [162] S. B. Kotsiantis and P. E. Pintelas. Selective averaging of regression models [J]. Annals of Mathematics, Computing & Teleinformatics, vol. 1, no. 3, pp. 65-74, 2005.
    [163] Michael LeBlanc and Robert Tibshirani. Combining estimates in regression and classification [J]. Journal of the American Statistical Association, vol. 91, pp. 1641-1650, 1996.
    [164] Leo Breiman. Stacked regressions [J]. Machine Learning, vol. 24, pp. 49-64, 1996.
    [165] Christopher J. Merz and Michael J. Pazzani. A principal components approach to combining regression estimates [J]. Machine Learning, vol. 36, pp. 9-32, 1999.
    [166] Naonori Ueda and Ryohei Nakano. Generalization error of ensemble estimators [C]. In: IEEE International Conference on Neural Networks, 1996, vol. 1, pp. 90-95.
    [167] Brown, G. Diversity in Neural Network Ensembles [D]. PhD thesis, University of Birmingham, Birmingham, UK, 2004.
    [168] L. I. Kuncheva and R. K. Kountchev. Generating classifier outputs of fixed accuracy and diversity [J]. Pattern Recognition Letters, 23: 593-600, 2002.
    [169] L. Breiman. Bagging predictors [J]. Machine Learning, 24: 123-140, 1996.
    [170] Bauer, E. and Kohavi, R. An empirical comparison of voting classification algorithms: Bagging, Boosting, and Variants [J]. Machine Learning, 36: 105-139, 1999.
    [171] Rozita A. Dara and Mohamed S. Kamel. Sharing training patterns among multiple classifiers [C]. In: 5th International Workshop, MCS 2004, Cagliari, Italy, June 9-11, 2004, Proceedings, volume 3077 of Lecture Notes in Computer Science, Springer, pp. 243-252.
    [172] Freund, Y. and R. E. Schapire. Experiments with a new boosting algorithm [C]. In: Proceedings of the Thirteenth International Conference on Machine Learning, pp. 148-156, 1996.
    [173] Drucker, H. Improving Regressors using Boosting Techniques [C]. In: Douglas H. Fisher, Jr. (Ed.), Proc. of the 14th Int. Conf. on Machine Learning, pp. 107-115, 1997. Morgan Kaufmann.
    [174] Liu, Y. and Yao, X. Ensemble Learning via Negative Correlation [J]. Neural Networks, 12: 1399-1404, 1999.
    [175] Islam, M. M., Yao, X. and Murase, K. A Constructive Algorithm for Training Cooperative Neural Network Ensembles [J]. IEEE Transactions on Neural Networks, 14: 820-834, 2003.
    [176] Bruce E. Rosen. Ensemble learning using decorrelated neural networks [J]. Connection Science - Special Issue on Combining Artificial Neural Networks: Ensemble Approaches, 8(3-4): 373-384, 1996.
    [177] Melville, P. and Mooney, R. J. Constructing diverse classifier ensembles using artificial training examples [C]. In: Proceedings of the 18th International Joint Conference on Artificial Intelligence, pp. 505-510, Mexico, 2003.
    [178] Zhi-Hua Zhou et al. Genetic Algorithm based Selective Neural Network Ensemble [C]. In: Proceedings of the 17th International Joint Conference on Artificial Intelligence, Seattle, WA, 2001, vol. 2, pp. 797-802.
    [179] Zhi-Hua Zhou et al. Ensembling Neural Networks: Many Could Be Better Than All [J]. Artificial Intelligence, 137(1-2): 239-263, 2002.
    [180] L. Kuncheva, C. Whitaker, C. Shipp, and R. Duin. Limits on the majority vote accuracy in classifier fusion [J]. Pattern Analysis and Applications, 6(1): 22-31, April 2003.
    [181] K. Tumer and J. Ghosh. Order statistics combiners for neural classifiers [C]. World Congress on Neural Networks, Vol. I: 31-34, July 1995.
    [182] W. B. Langdon and B. F. Buxton. Genetic programming for combining classifiers [C]. In: Proceedings of the Genetic and Evolutionary Computation Conference (GECCO-2001), pages 66-73, San Francisco, California, USA, 7-11 July 2001. Morgan Kaufmann.
    [183] John R. Koza. Genetic Programming: On the Programming of Computers by Means of Natural Selection [M]. MIT Press, 1992.
    [184] Xiao-Hua Zhou, Donna K. McClish, and Nancy A. Obuchowski. Statistical Methods in Diagnostic Medicine [M]. Wiley Europe, July 2002. ISBN: 0-471-34772-8.
    [185] Volker Tresp. A Bayesian committee machine [J]. Neural Computation, 12(11): 2719-2741, 2000.
    [186] M. Costa, E. Filippi, and E. Pasero. Artificial neural network ensembles: a Bayesian standpoint [C]. In: M. Marinaro and R. Tagliaferri, editors, Proceedings of the 7th Italian Workshop on Neural Nets, pages 39-57. World Scientific, 1995.
    [187] E. Mandler and J. Schuermann. Combining the Classification Results of independent classifiers based on the Dempster/Shafer theory of evidence [M]. In: Pattern Recognition and Artificial Intelligence, North Holland, Amsterdam, pages 381-393, 1988.
    [188] Sarunas Raudys and Fabio Roli. The behavior knowledge space fusion method: Analysis of generalization error and strategies for performance improvement [C]. In: Proc. Int. Workshop on Multiple Classifier Systems (LNCS 2709), pages 55-64, Guildford, Surrey, June 2003. Springer.
    [189] Wolpert, D. H. Stacked generalization [J]. Neural Networks, 5: 241-259, 1992.
    [190] Leila Douha, Nabil Benoudjit, Farid Melgani. A robust regression approach for spectrophotometric signal analysis [J]. J. Chemometrics, 26: 400-405, 2012.
    [191] R. A. Jacobs, M. I. Jordan, S. J. Nowlan, and G. E. Hinton. Adaptive mixtures of local experts [J]. Neural Computation, 3(1): 79-87, 1991.
    [192] R. A. Jacobs and M. A. Tanner. Mixtures of X. In: Combining Artificial Neural Nets [M]. Springer-Verlag, London, 1999.
    [193] Michael I. Jordan and Robert A. Jacobs. Hierarchical mixtures of experts and the EM algorithm [J]. Neural Computation, 6: 181-214, 1994.
    [194] Seniha Esen Yuksel, Joseph N. Wilson, Paul D. Gader. Twenty Years of Mixture of Experts [J]. IEEE Transactions on Neural Networks and Learning Systems, Vol. 23, Issue 8, pp. 1177-1193, 2012.
    [195] X. Wang, P. Whigham, D. Deng and M. Purvis. Time-line hidden Markov experts for time series prediction [J]. Neural Inf. Process., Lett. Rev., vol. 3, no. 2, pp. 39-48, 2004.
    [196] L. Cao. Support vector machines experts for time series forecasting [J]. Neurocomputing, vol. 51, pp. 321-339, 2003.
    [197] C. A. M. Lima, A. L. V. Coelho and F. J. Von Zuben. Hybridizing mixtures of experts with support vector machines: Investigation into nonlinear dynamic systems identification [J]. Inf. Sci., vol. 177, no. 10, pp. 2049-2074, 2007.
    [198] A. Bermak and D. Martinez. A Compact 3D VLSI Classifier using Bagging Threshold Network Ensembles [J]. IEEE Transactions on Neural Networks, vol. 14, no. 5, pp. 1097-1109, 2003.
    [199] M. Shi, S. B-Belhouari and A. Bermak. Quantization errors in committee machine for gas sensor application [C]. IEEE International Symposium on Circuits and Systems, ISCAS 2005, vol. 3, pp. 1911-1914, Kobe, Japan, 2005.
    [200] Shi M., Brahim-Belhouari S., Bermak A. and Martinez D. Committee machine for odor discrimination in gas sensor array [C]. In: Proceedings of the 11th International Symposium on Olfaction and Electronic Nose (ISOEN), pp. 74-76, Barcelona, 13-15 April 2005.
    [201] Vitor Hirayama, Francisco J. Ramirez-Fernandez, Walter J. Salcedo. Committee machine for LPG calorific power classification [J]. Sensors and Actuators B, 116: 62-65, 2006.
    [202] Evandro Bona, et al. Optimized Neural Network for Instant Coffee Classification through an Electronic Nose [J]. International Journal of Food Engineering, Vol. 7, Iss. 6, Article 6, 2011.
    [203] A. Vergara et al. Chemical gas sensor drift compensation using classifier ensembles [J]. Sensors and Actuators B, 166-167: 320-329, 2012.
    [204] Amir Amini, Mohammad Ali Bagheri, Gholam Ali Montazer. Improving gas identification accuracy of a temperature-modulated gas sensor using an ensemble of classifiers [J]. Sensors and Actuators B, 2012 (in press).
    [205] Greene, W. H. Econometric Analysis, 2nd Ed. [M]. Macmillan, New York, 1993.
    [206] Robert M. O'Brien. A Caution Regarding Rules of Thumb for Variance Inflation Factors [J]. Qual. Quant., 41: 673-690, 2007.
    [207] Kutner, Nachtsheim, Neter. Applied Linear Regression Models, 4th ed. [M]. McGraw-Hill/Irwin, 2004.
    [208] D. W. Opitz, J. W. Shavlik. Actively searching for an effective neural network ensemble [J]. Connection Science, 8(3-4): 337-353, 1996.
    [209] R. W. Kennard, L. A. Stone. Computer aided design of experiments [J]. Technometrics, Vol. 11, pp. 137-148, 1969.
    [210] A. Sharkey and N. Sharkey. Combining diverse neural networks [J]. The Knowledge Engineering Review, Vol. 12, No. 3, pp. 231-247, 1997.
    [211] L. Didaci, G. Giacinto, F. Roli, G. L. Marcialis. A study on the performances of dynamic classifier selection based on local accuracy estimation [J]. Pattern Recognition, Vol. 38, No. 11, pp. 2188-2191, 2005.
    [212] Woods, K. Combination of multiple classifiers using local accuracy estimates [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 19, No. 4, pp. 405-410, 1997.
