Label noise filtering based on the data distribution
  • Original title (Chinese): 基于数据分布的标签噪声过滤
  • Authors: CHEN Qingqiang (陈庆强); WANG Wenjian (王文剑); JIANG Gaoxia (姜高霞)
  • Affiliations: School of Computer and Information Technology, Shanxi University; Key Laboratory of Computational Intelligence and Chinese Information Processing of Ministry of Education, Shanxi University
  • Keywords: label noise; noise filtering; robust modeling; data distribution
  • Journal: Journal of Tsinghua University (Science and Technology) (清华大学学报(自然科学版)); abbreviated QHXB
  • Publication date: 2018-12-21
  • Year: 2019
  • Volume: 59
  • Issue: 04
  • Pages: 17-24 (8 pages)
  • Article ID: QHXB201904003
  • CN: 11-2223/N
  • Language: Chinese
  • Funding: National Natural Science Foundation of China (61673249); Research Fund for Returned Overseas Scholars of Shanxi Province (2016-004); CERNET Next Generation Internet Technology Innovation Project (NGII20170601)
Abstract
In supervised learning, label noise has a considerable impact on model construction. Existing approaches fall mainly into two categories, filtering based on model predictions and robust modeling, but they suffer from poor filtering quality or low filtering efficiency. To address this problem, this paper proposes a label noise filtering method based on the data distribution. For each sample in the data set, the region formed by the sample and its neighbors is divided into a high-density or a low-density region according to the distribution of the samples in its neighborhood, and different noise filtering rules are then applied to the different regions. Compared with existing methods, this approach starts from the data distribution, which makes the filtering more targeted, improves the filtering quality, and avoids over-filtering; in addition, the noisy data are handled with filtering rules rather than by building a noise prediction model, which improves the filtering efficiency. Experiments on 15 UCI standard multi-class data sets show that the method achieves good noise detection efficiency and classification accuracy when the noise rate is below 30%.
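This record gives only a high-level description of the method: the exact neighborhood construction, the criterion separating high-density from low-density regions, and the per-region filtering rules are not reproduced here. The Python sketch below is therefore only an illustrative approximation of the idea in the abstract; the neighborhood size k, the median-based density split, and the majority / near-unanimity disagreement thresholds are assumptions chosen for the example, not the authors' actual rules.

```python
# Illustrative sketch only (not the paper's exact algorithm): k, the
# median-based density split, and the 0.5 / 0.9 disagreement thresholds
# are assumed values chosen to show the overall structure of a
# distribution-aware, rule-based label noise filter.
import numpy as np
from sklearn.neighbors import NearestNeighbors

def filter_label_noise(X, y, k=10):
    """Return a boolean mask marking samples suspected of label noise."""
    X, y = np.asarray(X, dtype=float), np.asarray(y)
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X)
    dist, idx = nn.kneighbors(X)          # column 0 is each point itself
    dist, idx = dist[:, 1:], idx[:, 1:]

    # Local density: inverse of the mean distance to the k nearest neighbors;
    # samples at or above the median density are treated as "high density".
    density = 1.0 / (dist.mean(axis=1) + 1e-12)
    high_density = density >= np.median(density)

    suspected = np.zeros(len(y), dtype=bool)
    for i in range(len(y)):
        # Fraction of neighbors whose label disagrees with this sample's label.
        disagree = np.mean(y[idx[i]] != y[i])
        if high_density[i]:
            # Dense region: a simple majority disagreement flags the label.
            suspected[i] = disagree > 0.5
        else:
            # Sparse region: demand near-unanimous disagreement so that
            # legitimate boundary or minority-class samples are not removed.
            suspected[i] = disagree >= 0.9
    return suspected
```

In use, the flagged samples would typically be dropped before training, e.g. X_clean, y_clean = X[~mask], y[~mask]. Because the sketch relies only on neighborhood queries and counting rules, no noise prediction model needs to be trained, which mirrors the efficiency argument made in the abstract.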
