Abstract
Image saliency detection extracts the visually salient regions of an image and is an active research topic in computer vision. This paper proposes a saliency detection method that combines color features and contrast features. First, a color function is constructed in the HSV color space to obtain the image's color features. Next, the image is preprocessed with the SLIC superpixel segmentation algorithm, and saliency is computed from the contrast features of the superpixel blocks. Finally, the saliency map that fuses the color and contrast features is refined with a guided filter to produce the final saliency map. The proposed algorithm is evaluated on the public MSRA-1000 dataset and compared with six other algorithms. Experimental results show that, by combining pixel-level and superpixel-block information, the proposed algorithm detects salient regions with more complete contours and outperforms the other methods.
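The two-cue pipeline the abstract describes can be sketched in a few lines of NumPy: a pixel-level HSV color cue plus a block-level global-contrast cue, averaged into one saliency map. This is an illustrative toy, not the paper's implementation: a regular grid stands in for SLIC superpixels, the guided-filter refinement step is omitted, and all function names and the test image are assumptions made here.

```python
import colorsys

import numpy as np


def color_saliency(img):
    # Pixel-level cue: distance of each pixel's HSV color from the mean HSV color.
    h, w, _ = img.shape
    hsv = np.array([[colorsys.rgb_to_hsv(*img[i, j]) for j in range(w)]
                    for i in range(h)])
    mean = hsv.reshape(-1, 3).mean(axis=0)
    sal = np.linalg.norm(hsv - mean, axis=2)
    return sal / (sal.max() + 1e-12)


def block_contrast_saliency(img, block=4):
    # Block-level cue: a block's saliency is the sum of its color distances
    # to all other blocks (a regular grid stands in for SLIC superpixels).
    h, w, _ = img.shape
    means, coords = [], []
    for i in range(0, h, block):
        for j in range(0, w, block):
            means.append(img[i:i + block, j:j + block].reshape(-1, 3).mean(axis=0))
            coords.append((i, j))
    means = np.asarray(means)
    sal = np.zeros((h, w))
    for k, (i, j) in enumerate(coords):
        sal[i:i + block, j:j + block] = np.linalg.norm(means - means[k], axis=1).sum()
    return sal / (sal.max() + 1e-12)


# Toy image: a red square on a green background.
img = np.full((16, 16, 3), (0.1, 0.3, 0.1))
img[4:12, 4:12] = (0.9, 0.1, 0.1)

# Fuse the two cues by simple averaging (the paper additionally
# refines the fused map with a guided filter before output).
fused = 0.5 * color_saliency(img) + 0.5 * block_contrast_saliency(img)
```

The distinct red square scores high under both cues, so the fused map peaks inside it; in the full method the guided filter would then sharpen the region's contour against the original image.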