Face gender recognition based on multi-layer feature fusion convolution neural network with adjustable supervisory function
  • Authors: Shi Xuechao; Zhou Yatong; Chi Yue
  • Keywords: face gender recognition; multi-layer feature fusion; convolution neural network (CNN); deep learning
  • Journal: Application Research of Computers (计算机应用研究; CNKI code JSYJ)
  • Affiliation: Tianjin Key Laboratory of Electronic Materials & Devices, School of Electronics & Information Engineering, Hebei University of Technology
  • Online publication date: 2018-02-09
  • Year: 2019
  • Issue: Vol. 36, No. 329
  • Funding: National Natural Science Foundation of China (61401307); Hebei Province Science and Technology Research and Development Project (11213565); Hebei Province Funding Project for Returned Overseas Scholars (CL201707)
  • Language: Chinese
  • Record ID: JSYJ201903061
  • Issue no.: 03
  • CN: 51-1196/TP
  • Pages: 307-311 (5 pages)
Abstract
To further improve the accuracy of gender recognition, this paper proposes L-MFCNN, a convolutional neural network model that combines multi-layer feature fusion with an adjustable supervisory loss, and applies it to face gender recognition. Unlike a traditional convolutional neural network (CNN), L-MFCNN fuses the feature outputs of several shallow intermediate convolutional layers with the output of the final convolutional layer, so it exploits not only the global semantic information of the deep layers but also the local texture detail captured by the shallow layers, making gender recognition more accurate. In addition, L-MFCNN adopts the large-margin softmax loss, with its adjustable-margin supervisory mechanism, as the output layer; tuning the margin effectively guides the training of the deep convolutional network so that intra-class distances within the same gender shrink and inter-class distances between different genders grow, yielding better recognition performance. Gender recognition experiments on several face datasets show that L-MFCNN achieves higher accuracy than other traditional convolutional network models, offering a new direction for future research on face gender recognition.
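The multi-layer fusion idea in the abstract can be sketched as follows. This is an illustrative NumPy sketch, not the paper's exact L-MFCNN configuration: the channel counts and spatial sizes are assumed, random arrays stand in for the conv-stage feature maps, and global average pooling is one common way to bring differently-sized maps to a common shape before concatenation.

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-ins for the feature maps of three conv stages, shape (channels, H, W):
# shallow stages keep fine local texture, the deep stage carries global semantics.
f1 = rng.standard_normal((16, 32, 32))   # shallow stage
f2 = rng.standard_normal((32, 16, 16))   # intermediate stage
f3 = rng.standard_normal((64, 8, 8))     # final (deep) stage

def gap(f):
    # Global average pooling: collapse each channel's spatial map to one value,
    # so maps of different H x W all reduce to a (channels,) vector.
    return f.mean(axis=(1, 2))

# Fuse shallow, intermediate, and deep features into one descriptor
# that a gender classifier (e.g. a fully connected layer) would consume.
fused = np.concatenate([gap(f1), gap(f2), gap(f3)])
print(fused.shape)  # (112,)
```

The fused 112-dimensional vector (16 + 32 + 64 channels) is what would feed the classification head, so the decision sees both texture detail and semantics.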
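The large-margin softmax loss mentioned above (the L-Softmax of Liu et al., 2016) rewrites the target-class logit ||W_y||·||x||·cos(θ) as ||W_y||·||x||·cos(m·θ), so a larger margin parameter m demands a smaller angle between a sample and its class weight vector. A minimal sketch of that mechanism, assuming m·θ stays within [0, π] (the published method uses a piecewise extension beyond that range):

```python
import numpy as np

def l_softmax_logits(x, W, y, m=2):
    # Standard logits are inner products x @ w_c = ||x||*||w_c||*cos(theta_c).
    # For the target class y only, replace cos(theta) with cos(m*theta),
    # shrinking the target logit and thus enforcing an angular margin.
    logits = x @ W
    w_y = W[:, y]
    scale = np.linalg.norm(x) * np.linalg.norm(w_y)
    theta = np.arccos(np.clip(logits[y] / scale, -1.0, 1.0))
    out = logits.copy()
    out[y] = scale * np.cos(m * theta)  # valid while m*theta <= pi
    return out

def softmax_ce(logits, y):
    # Numerically stable softmax cross-entropy for a single sample.
    z = logits - logits.max()
    return -(z[y] - np.log(np.exp(z).sum()))

# Toy 2-class example: class-0 weight aligned with x, class-1 weight at 45 degrees.
x = np.array([1.0, 0.0])
W = np.column_stack([[1.0, 0.0], np.array([1.0, 1.0]) / np.sqrt(2.0)])
plain = softmax_ce(l_softmax_logits(x, W, y=1, m=1), 1)   # m=1 reduces to softmax
margin = softmax_ce(l_softmax_logits(x, W, y=1, m=2), 1)  # m=2 shrinks the target logit
print(plain, margin)
```

With m=1 the function reduces to the ordinary softmax logits; with m=2 the same sample incurs a larger loss, which during training pushes same-class features closer to their weight vector (tighter intra-class spread) and apart from other classes.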
