Consistent feature selection and its application to face recognition
  • Authors: Feng Pan (1)
    Guangwei Song (1)
    Xiaobing Gan (1)
    Qiwei Gu (1)
  • Keywords: Feature selection; Pattern recognition; Laplacian matrix; Eigen-decomposition
  • Journal: Journal of Intelligent Information Systems
  • Publication year: 2014
  • Publication date: October 2014
  • Volume: 43
  • Issue: 2
  • Pages: 307-321
  • Full text size: 797 KB
  • Author affiliations:
    1. College of Management, Shenzhen University, Shenzhen, 518060, Guangdong, China
  • ISSN: 1573-7675
Abstract
In this paper we consider feature selection for face recognition using both labeled and unlabeled data. We introduce a weighted feature space in which the global separability between different classes is maximized and the local similarity of neighboring data points is preserved. By integrating the global and local structures, a general optimization framework is formulated. We propose a simple solution to this problem that avoids the matrix eigen-decomposition procedure, which is often computationally expensive. Experimental results demonstrate the efficacy of our approach and confirm that using labeled and unlabeled data together does help feature selection when only a small number of labeled samples is available.
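The framework described above combines a global, label-driven separability criterion with a local, unlabeled-data-driven similarity criterion, and avoids eigen-decomposition. As a rough illustration only (not the authors' algorithm), the sketch below scores each feature with a Fisher-score-like separability term computed on the labeled samples plus a Laplacian-score-like smoothness term computed on a k-NN graph over all samples; the function names, the graph construction, and the mixing weight alpha are assumptions made for this example.

```python
import numpy as np

def knn_affinity(X, k=5, sigma=1.0):
    # Heat-kernel weights between each sample and its k nearest neighbours,
    # built from all samples (labeled and unlabeled), then symmetrized.
    sq = (X ** 2).sum(axis=1)
    d2 = np.maximum(sq[:, None] + sq[None, :] - 2.0 * X @ X.T, 0.0)
    W = np.zeros_like(d2)
    for i in range(X.shape[0]):
        nbrs = np.argsort(d2[i])[1:k + 1]              # skip the point itself
        W[i, nbrs] = np.exp(-d2[i, nbrs] / (2.0 * sigma ** 2))
    return np.maximum(W, W.T)

def fisher_score(Xl, yl):
    # Global separability on the labeled subset: between-class variance
    # over within-class variance, computed independently for each feature.
    mu = Xl.mean(axis=0)
    sb = np.zeros(Xl.shape[1])
    sw = np.zeros(Xl.shape[1])
    for c in np.unique(yl):
        Xc = Xl[yl == c]
        sb += Xc.shape[0] * (Xc.mean(axis=0) - mu) ** 2
        sw += ((Xc - Xc.mean(axis=0)) ** 2).sum(axis=0)
    return sb / (sw + 1e-12)

def locality_score(X, W):
    # Laplacian-score-style smoothness of each feature over the graph:
    # f^T L f / f^T D f, where smaller values mean the feature varies
    # little between neighbouring points.
    D = W.sum(axis=1)
    L = np.diag(D) - W
    num = np.einsum('if,ij,jf->f', X, L, X)
    den = np.einsum('if,i,if->f', X, D, X) + 1e-12
    return num / den

def rank_features(X, labeled_mask, y, alpha=0.5, k=5):
    # Mix the two criteria with weight alpha (an illustrative choice) and
    # rank features; per-feature scoring needs no matrix eigen-decomposition.
    fs = fisher_score(X[labeled_mask], y[labeled_mask])
    ls = locality_score(X, knn_affinity(X, k=k))
    score = (alpha * fs / (fs.max() + 1e-12)
             + (1.0 - alpha) * (1.0 - ls / (ls.max() + 1e-12)))
    return np.argsort(-score)                          # best features first

# Toy usage: 60 samples, 20 features, only 10 samples carry usable labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 20))
y = np.repeat([0, 1], 30)
labeled_mask = np.zeros(60, dtype=bool)
labeled_mask[:5] = labeled_mask[30:35] = True
top_features = rank_features(X, labeled_mask, y, alpha=0.6)[:10]
```

In practice the number of selected features, k, sigma, and alpha would be tuned on validation data.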
