Utilizing Locality-Sensitive Hash Learning for Cross-Media Retrieval
Abstract
Cross-media retrieval is an essential approach to handling the explosive growth of multimodal data on the web. However, existing approaches to cross-media retrieval are computationally expensive due to the curse of dimensionality. To retrieve efficiently from multimodal data, it is essential to reduce the proportion of irrelevant documents that must be examined. In this paper, we propose a cross-media retrieval approach (FCMR) based on locality-sensitive hashing (LSH) and neural networks. Multimodal data are projected by the LSH algorithm so that similar objects fall into the same hash bucket and dissimilar objects into different ones, using hash functions learned through neural networks. Given a textual or visual query, it can be efficiently mapped to a hash bucket whose stored objects are likely near neighbors of the query. Experimental results show that, in the set of near neighbors retrieved by the proposed method, the proportion of relevant documents is substantially increased, indicating that retrieval based on near neighbors can be conducted effectively. Further evaluations on two public datasets demonstrate the effectiveness of the proposed retrieval method compared to the baselines.
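To make the bucketing-and-lookup idea concrete, the following is a minimal Python sketch of sign-based LSH retrieval. It is not the paper's method: the paper learns its hash functions with neural networks, whereas this sketch substitutes random hyperplane projections, and all dimensions, data, and names below are illustrative assumptions.

import numpy as np
from collections import defaultdict

# Random hyperplanes stand in for the learned hash functions, purely to
# illustrate how items are grouped into buckets and looked up at query time.
rng = np.random.default_rng(0)
dim, n_bits = 128, 16                          # assumed feature dimension and code length
hyperplanes = rng.standard_normal((n_bits, dim))

def hash_code(x):
    """Map a feature vector to an n_bits binary code (the bucket key)."""
    return tuple((hyperplanes @ x > 0).astype(int))

# Index: text and image features are assumed to already lie in a common
# space (after cross-modal projection), so they share one bucket table.
buckets = defaultdict(list)
database = rng.standard_normal((1000, dim))    # toy multimodal feature matrix
for i, vec in enumerate(database):
    buckets[hash_code(vec)].append(i)

# Query: hash once, then rank only the candidates stored in that bucket,
# so most irrelevant documents are never examined.
query = rng.standard_normal(dim)
candidates = buckets[hash_code(query)]
ranked = sorted(
    candidates,
    key=lambda i: -(database[i] @ query)
    / (np.linalg.norm(database[i]) * np.linalg.norm(query) + 1e-9),
)
print(f"{len(candidates)} candidates in the query's bucket out of {len(database)} items")

Because only one bucket is scanned per query, the cost of ranking scales with the bucket size rather than the full collection, which is the efficiency gain the abstract attributes to the LSH-based approach.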
