Multi-strategy Adaptive Large-scale Ontology Mapping Algorithm (多策略自适应大规模本体映射算法)
  • Title (English): Multi-strategy Adaptive Large-scale Ontology Mapping Algorithm
  • Authors: JIANG Meng; YU Minggang; WANG Zhixue
  • Affiliation: College of Command and Control Engineering, Army Engineering University of PLA
  • Keywords: big data; large-scale ontology mapping; modularity; local confidence; self-adaption
  • Journal: Computer Engineering (计算机工程), CNKI code JSJC
  • Publication date: 2019-03-15
  • Year: 2019
  • Issue: Vol. 45, No. 498 (Issue 03)
  • Fund: National Natural Science Foundation of China (61802428)
  • Language: Chinese
  • Pages: 20-25 (6 pages)
  • CN: 31-1289/TP
  • Record ID: JSJC201903003
Abstract

Large-scale ontology mapping in the context of big data suffers from high time complexity and low efficiency and accuracy. To address this, a multi-strategy adaptive large-scale ontology mapping algorithm based on modularity and local confidence is proposed. The algorithm clusters and modularizes each ontology internally, uses an information-retrieval strategy to discover highly similar correlated sub-ontologies between modules, computes the local confidence of each mapping strategy over the correlated sub-ontologies, and, when combining mapping results, adaptively adjusts each strategy's weight according to its local confidence. On this basis, a heuristic greedy strategy extracts the mapping results, which are then corrected using mapping rules. Experimental results show that, compared with the Falcon and ASMOV methods, the proposed algorithm achieves higher recall, precision, and F-measure.
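The confidence-weighted combination and greedy extraction steps described in the abstract can be sketched roughly as follows. This is a minimal illustration only: the local-confidence formula, the threshold values, and all function names are assumptions for exposition, not the paper's actual definitions.

```python
# Hypothetical sketch: several mapping strategies each yield a similarity
# matrix over a pair of correlated sub-ontologies; each strategy's weight is
# derived from its local confidence, and a greedy pass extracts a 1:1
# alignment above a threshold.
from typing import Dict, List, Tuple

Matrix = List[List[float]]

def local_confidence(sim: Matrix, threshold: float = 0.5) -> float:
    """Illustrative proxy for local confidence: the fraction of candidate
    pairs the strategy scores decisively (far from the threshold)."""
    scores = [s for row in sim for s in row]
    if not scores:
        return 0.0
    decisive = sum(1 for s in scores if abs(s - threshold) > 0.25)
    return decisive / len(scores)

def combine(strategies: Dict[str, Matrix]) -> Matrix:
    """Sum the strategies' similarity matrices, each weighted by its
    normalized local confidence (the adaptive weight adjustment)."""
    weights = {name: local_confidence(m) for name, m in strategies.items()}
    total = sum(weights.values()) or 1.0
    first = next(iter(strategies.values()))
    rows, cols = len(first), len(first[0])
    combined = [[0.0] * cols for _ in range(rows)]
    for name, m in strategies.items():
        w = weights[name] / total
        for i in range(rows):
            for j in range(cols):
                combined[i][j] += w * m[i][j]
    return combined

def greedy_extract(sim: Matrix, threshold: float = 0.5) -> List[Tuple[int, int]]:
    """Heuristic greedy extraction: repeatedly take the highest-scoring
    still-unmatched pair above the threshold, yielding a 1:1 alignment."""
    candidates = sorted(
        ((sim[i][j], i, j) for i in range(len(sim)) for j in range(len(sim[0]))),
        reverse=True,
    )
    used_i, used_j, alignment = set(), set(), []
    for score, i, j in candidates:
        if score >= threshold and i not in used_i and j not in used_j:
            alignment.append((i, j))
            used_i.add(i)
            used_j.add(j)
    return alignment
```

A strategy whose scores cluster near the decision threshold gets a lower local confidence and therefore contributes less to the combined matrix; the subsequent rule-based correction step from the abstract is omitted here.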
References
[1] WANG S, KANG D Z, JIANG D Y. Survey of ontology mapping [J]. Computer Science, 2017, 44(9): 1-10.
[2] SHVAIKO P, EUZENAT J. Ontology matching: state of the art and future challenges [J]. IEEE Transactions on Knowledge and Data Engineering, 2013, 25(1): 158-176.
[3] DARAIO C, LENZERINI M, LEPORELLI C, et al. The advantages of an ontology-based data management approach: openness, interoperability and data quality [J]. Scientometrics, 2016, 108(1): 441-455.
[4] JIANG Z, YAO X M, LIN L F. Ontology mapping method based on feature adaptation [J]. Journal of Zhejiang University (Engineering Science), 2014, 48(1): 76-84.
[5] ALANI H, SAAD S. Schema matching for large-scale data based on ontology clustering method [J]. International Journal on Advanced Science, Engineering and Information Technology, 2017, 7(5): 1790-1797.
[6] SUN Y F, MA L L, GUO X M, et al. Large-scale ontology mapping method based on modularization [J]. Pattern Recognition and Artificial Intelligence, 2016, 29(5): 410-416.
[7] SUN Y F, MA L L, ZHOU R Z. An adaptive multi-strategy ontology mapping method [J]. Journal of Naval University of Engineering, 2016, 28(2): 75-82.
[8] YURUK N, METE M, XU X W, et al. AHSCAN: agglomerative hierarchical structural clustering algorithm for networks [C]//Proceedings of the IEEE/ACM International Conference on Advances in Social Network Analysis and Mining. Washington D.C., USA: IEEE Press, 2009: 72-77.
[9] YANG Y H, DU J P, PING Y. Ontology-based intelligent information retrieval system [J]. Journal of Software, 2015, 26(7): 1675-1687.
[10] LI W Q, SUN X, ZHANG C Y, et al. A semantic similarity measure between ontology concepts [J]. Acta Automatica Sinica, 2012, 38(2): 229-235.
[11] SEDDIQUI M H, AONO M. An efficient and scalable algorithm for segmented alignment of ontologies of arbitrary size [J]. Web Semantics: Science, Services and Agents on the World Wide Web, 2009, 7(4): 344-356.
[12] CALDAROLA E G, RINALDI A M. A multi-strategy approach for ontology reuse through matching and integration techniques [M]. Berlin, Germany: Springer, 2016.
[13] FARIA D, PESQUITA C, SANTOS E, et al. The AgreementMakerLight ontology matching system [EB/OL]. [2018-08-25]. http://disi.unitn.it/~p2p/RelatedWork/Matching/Feriae_AgreementMakerLight13.pdf.
[14] HU W, QU Y Z. Falcon-AO: a practical ontology matching system [J]. Web Semantics: Science, Services and Agents on the World Wide Web, 2008, 6(3): 237-239.
[15] CHING C K, CHIEN S L. Ontology mapping and merging through OntoDNA for learning object reusability [J]. Educational Technology and Society, 2006, 9(3): 27-42.
[16] JEAN Y R, KABUKA M R. ASMOV: results for OAEI 2008 [EB/OL]. [2018-08-25]. http://www.dit.unitn.it/~p2p/OM-2008/oaei08paper.pdf.
