SOM and Feature Weights Based Method for Dimensionality Reduction in Large Gauss Linear Models
  • Authors: Fernando Pavón (7), Jesús Vega (8), Sebastián Dormido Canto (9)

    7. GAMCO S.L., Alcalá 20, 28014 Madrid, Spain
    8. Laboratorio Nacional de Fusión, CIEMAT, Avenida Complutense 40, 28040 Madrid, Spain
    9. Departamento de Informática y Automática, UNED, 28040 Madrid, Spain
  • Keywords: Linear regression ; Feature selection ; SOM ; Artificial neural networks ; Weighted Euclidean distance
  • Series: Lecture Notes in Computer Science
  • Year: 2015
  • Volume: 9047
  • Issue: 1
  • Pages: 376-385
  • Book title: Statistical Learning and Data Sciences
  • ISBN: 978-3-319-17090-9
  • Category: Computer Science
  • Subjects: Artificial Intelligence and Robotics
    Computer Communication Networks
    Software Engineering
    Data Encryption
    Database Management
    Computation by Abstract Devices
    Algorithm Analysis and Problem Complexity
  • Publisher: Springer Berlin / Heidelberg
  • ISSN:1611-3349
Abstract
Discovering the most important variables is a crucial step for accelerating model building without losing the potential predictive power of the data. In many practical problems it is necessary to identify the dependent variables and those that are redundant. This paper presents an automatic method for discovering the most important signals or features for building data-driven models. The method was designed for very high-dimensional input spaces in which many variables are independent while many others are combinations of the independent ones. It is based on the SOM neural network and a feature-weighting scheme very similar to Linear Discriminant Analysis (LDA), with some modifications.
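The abstract does not give the algorithm itself. As a rough illustration of the general idea only (not the authors' exact method), the sketch below derives per-feature weights for synthetic linear data and plugs them into a weighted Euclidean distance used by a tiny one-dimensional SOM. The data, the use of absolute least-squares coefficients as a stand-in for the LDA-like weighting, and all names are assumptions for illustration.

```python
# Hypothetical sketch: feature weighting + weighted-Euclidean SOM.
# Not the paper's exact algorithm; |least-squares coefficients| stand in
# for its LDA-like feature-weighting scheme.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic high-dimensional linear data: only the first 3 of 20 inputs matter.
n, d = 500, 20
X = rng.standard_normal((n, d))
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + 0.5 * X[:, 2] + 0.1 * rng.standard_normal(n)

# Feature weights from a linear fit, normalized to sum to 1.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
w = np.abs(beta) / np.abs(beta).sum()

def weighted_sq_dist(x, codebook, w):
    """Weighted squared Euclidean distance from x to every SOM unit."""
    return ((codebook - x) ** 2 * w).sum(axis=1)

# Tiny 1-D SOM trained online with the weighted metric.
units = 10
codebook = rng.standard_normal((units, d))
for t in range(2000):
    x = X[rng.integers(n)]
    bmu = np.argmin(weighted_sq_dist(x, codebook, w))    # best-matching unit
    lr = 0.5 * np.exp(-t / 1000)                         # decaying learning rate
    sigma = max(1e-3, 3.0 * np.exp(-t / 1000))           # decaying neighbourhood radius
    h = np.exp(-((np.arange(units) - bmu) ** 2) / (2 * sigma ** 2))
    codebook += lr * h[:, None] * (x - codebook)         # neighbourhood update

# Features with the largest weights are the candidates to keep.
print("feature weights:", np.round(w, 3))
print("top features:", np.argsort(w)[::-1][:3])
```

In this toy setting the three informative inputs receive clearly dominant weights, so the weighted metric effectively lets the SOM ignore the redundant dimensions, which is the effect the paper pursues.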
