Research on Dimension Reduction and Visual Clustering Methods for Complex Simulation Data
Abstract
As science and technology advance, simulation systems are becoming ever more complex, and the data they produce are correspondingly high-dimensional, rapidly growing in volume, and subject to uncertainties such as randomness and human factors. Classical statistical theory runs into a series of problems when analyzing such data. With the progress of computer hardware and the rise of data mining, the analysis of complex simulation data based on data mining techniques has gradually drawn researchers' attention. Building on visual data mining, this thesis studies visual clustering and related problems for large-scale, high-dimensional simulation data with complex interrelationships, a topic of both theoretical and engineering significance.
     When features are selected by expert estimation before complex simulation data are visualized, differences among individual experts and the characteristics of the data itself can be overlooked. To address this, a combined subjective-objective estimation method based on a fuzzy comprehensive evaluation model is proposed. An expert fuzzy evaluation matrix is first constructed, with expert weights assigned according to each expert's standing in the field, to give a subjective fuzzy comprehensive evaluation; the information entropy of each attribute is then computed from the data itself to give an objective evaluation; finally, the subjective and objective evaluations are blended in adjustable proportions to determine the importance of each attribute.
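     The following is a minimal sketch of the subjective-objective weighting step, assuming a toy data matrix, placeholder expert weights, and an assumed mixing proportion alpha; the construction of the fuzzy evaluation matrix itself is not reproduced here.

    import numpy as np

    def entropy_weights(X):
        """Objective attribute weights from information entropy (entropy-weight method)."""
        X = np.asarray(X, dtype=float)
        n = X.shape[0]
        P = X / X.sum(axis=0, keepdims=True)        # column-wise normalization
        logP = np.where(P > 0, np.log(P), 0.0)      # treat 0*log(0) as 0
        e = -(P * logP).sum(axis=0) / np.log(n)     # entropy of each attribute
        d = 1.0 - e                                 # degree of divergence
        return d / d.sum()

    def combined_weights(subjective, X, alpha=0.5):
        """Blend subjective (expert) and objective (entropy) weights in proportion alpha."""
        objective = entropy_weights(X)
        w = alpha * np.asarray(subjective, dtype=float) + (1.0 - alpha) * objective
        return w / w.sum()

    # Toy example: 6 simulation runs, 4 attributes, with assumed expert weights.
    X = np.random.default_rng(0).uniform(1.0, 10.0, size=(6, 4))
    expert_w = [0.4, 0.3, 0.2, 0.1]                 # placeholder subjective evaluation
    print(combined_weights(expert_w, X, alpha=0.6))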
     For the dimension reduction needed before complex simulation data can be visualized, the commonly used manifold learning methods are analyzed, and the local tangent space alignment algorithm (LTSA) is proved to be essentially consistent with kernel principal component analysis (KPCA). On this basis, a kernel-based LTSA algorithm is proposed for reducing the dimensionality of incremental simulation data. Experiments verify that the improved algorithm achieves the same reduction quality as LTSA while running more efficiently.
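     For reference, a minimal sketch applying the standard LTSA and KPCA implementations from scikit-learn to a synthetic manifold; it illustrates only the two baseline algorithms whose equivalence the chapter discusses, not the kernel-based incremental variant proposed in the thesis (the swiss-roll data and all parameter values are assumptions).

    import numpy as np
    from sklearn.datasets import make_swiss_roll
    from sklearn.manifold import LocallyLinearEmbedding
    from sklearn.decomposition import KernelPCA

    # Synthetic 3-D manifold standing in for high-dimensional simulation data.
    X, _ = make_swiss_roll(n_samples=1000, noise=0.05, random_state=0)

    # Standard LTSA: local tangent space alignment down to 2 dimensions.
    ltsa = LocallyLinearEmbedding(n_neighbors=12, n_components=2, method="ltsa")
    Y_ltsa = ltsa.fit_transform(X)

    # Kernel PCA with an RBF kernel as the comparison baseline.
    kpca = KernelPCA(n_components=2, kernel="rbf", gamma=0.01)
    Y_kpca = kpca.fit_transform(X)

    print(Y_ltsa.shape, Y_kpca.shape)   # (1000, 2) (1000, 2)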
     Because dimension reduction of complex simulation data requires the target dimensionality to be supplied in advance, an improved maximum likelihood estimator is adopted for intrinsic dimension estimation. The shortcomings of the standard maximum likelihood method are analyzed first; geodesic distance is then used in place of Euclidean distance to avoid selecting incorrect nearest neighbors, and a density-based correction replaces the simple average of the local estimates so that the result is no longer unduly influenced by outlying values.
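     A minimal sketch of the baseline Levina-Bickel maximum likelihood estimator of intrinsic dimension; it still uses Euclidean nearest neighbors and a plain average of the local estimates, which are exactly the two steps the thesis replaces with geodesic distances and a density correction (the data set, k, and random seed are assumptions).

    import numpy as np
    from sklearn.neighbors import NearestNeighbors

    def mle_intrinsic_dimension(X, k=10):
        """Levina-Bickel MLE of intrinsic dimension using k Euclidean nearest neighbors."""
        nbrs = NearestNeighbors(n_neighbors=k + 1).fit(X)
        dist, _ = nbrs.kneighbors(X)          # column 0 is the point itself (distance 0)
        dist = dist[:, 1:]                    # keep the k true neighbors
        # Local estimate at each point: inverse mean log-ratio of neighbor distances.
        log_ratio = np.log(dist[:, -1][:, None] / dist[:, :-1])
        m_local = (k - 1) / log_ratio.sum(axis=1)
        # The baseline estimator simply averages the local estimates.
        return m_local.mean()

    # Toy check: a 2-D plane embedded in 5-D space should give an estimate near 2.
    rng = np.random.default_rng(0)
    Z = rng.normal(size=(2000, 2))
    X = Z @ rng.normal(size=(2, 5))
    print(mle_intrinsic_dimension(X, k=10))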
     Two visual clustering methods are proposed for complex simulation data. In the method based on an improved radar chart, the traditional radar chart is first modified so that, to emphasize the data characteristics, polar angles are determined by attribute weights and polar radii by attribute values. Because k-means may miss the optimal solution when its initial centers are chosen at random, an optimized initialization algorithm is proposed; and because the number of clusters usually cannot be specified in advance, an iterative procedure with expert supervision and intervention is introduced. In the method based on the self-organizing map (SOM), the neuron mapping on the traditional rectangular or hexagonal grid is replaced by a radar-chart mapping, which resolves the inability of the standard SOM mapping to reflect the true distances between data points; by adding a lateral contraction force when reconstructing the weight vectors, the convergence of the mapped points is accelerated; and an adaptive learning rate, corrected by a function that decreases monotonically with the distance from the winning neuron to its neighboring neurons, is proposed to improve stability and shorten convergence time. Experiments show that the resulting algorithm is more efficient and more robust.
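     A minimal sketch combining a weighted radar-chart mapping with k-means: the angular span of each attribute is proportional to an assumed weight and the radius is given by the attribute value, while scikit-learn's k-means++ initialization stands in for the optimized-initial-center and SOM-based variants developed in the thesis (all data, weights, and parameters are placeholders).

    import numpy as np
    import matplotlib.pyplot as plt
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    X = rng.uniform(size=(60, 5))                        # 60 simulation records, 5 attributes
    weights = np.array([0.30, 0.25, 0.20, 0.15, 0.10])   # assumed attribute weights

    # k-means with k-means++ initialization (stand-in for the optimized initial centers).
    labels = KMeans(n_clusters=3, init="k-means++", n_init=10, random_state=0).fit_predict(X)

    # Improved radar chart: polar angle accumulates with attribute weight, radius = value.
    bounds = 2 * np.pi * np.concatenate(([0.0], np.cumsum(weights)))  # sector boundaries
    theta = bounds[:-1] + np.diff(bounds) / 2                         # one spoke per attribute

    fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
    for x, lab in zip(X, labels):
        r = np.append(x, x[0])                           # close the polygon
        t = np.append(theta, theta[0])
        ax.plot(t, r, color=plt.cm.tab10(int(lab)), alpha=0.4)
    ax.set_xticks(theta)
    ax.set_xticklabels([f"attr{i+1}" for i in range(len(weights))])
    plt.savefig("radar_clusters.png", dpi=150)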
     The thesis enriches the methodology of high-dimensional dimension reduction and visual data mining, and provides new technical support for the analysis of complex simulation data.