Producing efficient retrievability ranks of documents using normalized retrievability scoring function
  • Authors: Shariq Bashir (1)
    Akmal Saeed Khattak (2)
  • Keywords: Information systems evaluation; Documents accessibility; Documents findability; Known-items search; Patent retrieval; Recall-oriented retrieval
  • Journal: Journal of Intelligent Information Systems
  • Year: 2014
  • Publication date: June 2014
  • Volume: 42
  • Issue: 3
  • Pages: 457-484
  • References: 1. Arampatzis, A., Kamps, J., Kooken, M., Nussbaum, N. (2007). Access to legal documents: exact match, best match, and combinations. In Proceedings of the 16th Text REtrieval Conference (TREC 2007).
    2. Azzopardi, L., & Bache, R. (2010). On the relationship between effectiveness and accessibility. In SIGIR '10: Proceedings of the 33rd annual international ACM SIGIR conference on research and development in information retrieval, Geneva, Switzerland (pp. 889-890).
    3. Azzopardi, L., de Rijke, M., Balog, K. (2007). Building simulated queries for known-item topics: an analysis using six European languages. In SIGIR '07: Proceedings of the 30th annual international ACM SIGIR conference on research and development in information retrieval, Amsterdam, The Netherlands (pp. 455-462).
    4. Azzopardi, L., & Vinay, V. (2008). Retrievability: an evaluation measure for higher order information access tasks. In CIKM '08: Proceedings of the 17th ACM conference on information and knowledge management, Napa Valley, CA, USA (pp. 561-570).
    5. Bache, R., & Azzopardi, L. (2010). Improving access to large patent corpora. In Transactions on large-scale data- and knowledge-centered systems II (Vol. 2, pp. 103-121). Springer.
    6. Baeza-Yates, R., & Ribeiro-Neto, B. (1999). Modern information retrieval. ACM Press.
    7. Bashir, S., & Rauber, A. (2009a). Analyzing document retrievability in patent retrieval settings. In DEXA '09: Proceedings of the 20th international conference on database and expert systems applications (pp. 753-760).
    8. Bashir, S., & Rauber, A. (2009b). Improving retrievability of patents with cluster-based pseudo-relevance feedback documents selection. In Proceedings of the 18th ACM conference on information and knowledge management, CIKM 2009 (pp. 1863-1866).
    9. Bashir, S., & Rauber, A. (2010a). Improving retrievability and recall by automatic corpus partitioning. In Transactions on large-scale data- and knowledge-centered systems II (Vol. 2, pp. 122-140). Springer.
    10. Bashir, S., & Rauber, A. (2010b). Improving retrievability of patents in prior-art search. In Advances in information retrieval, 32nd European Conference on IR Research, ECIR 2010 (pp. 457-470).
    11. Callan, J., & Connell, M. (2001). Query-based sampling of text databases. ACM Transactions on Information Systems (TOIS), 19(2), 97-130.
    12. Chowdhury, G.G. (2004). Introduction to modern information retrieval (2nd ed.). London: Facet Publishing.
    13. Gastwirth, J.L. (1972). The estimation of the Lorenz curve and Gini index. The Review of Economics and Statistics, 54(3), 306-316.
    14. Harter, S.P., & Hert, C.A. (1997). Evaluation of information retrieval systems: approaches, issues, and methods. Annual Review of Information Science and Technology (ARIST), 32, 3-94.
    15. Lauw, H.W., Lim, E.-P., Wang, K. (2006). Bias and controversy: beyond the statistical deviation. In Proceedings of the 12th ACM SIGKDD international conference on knowledge discovery and data mining, Philadelphia, PA, USA (pp. 625-630).
    16. Lawrence, S., & Giles, C.L. (1999). Accessibility of information on the web. Nature, 400, 107-109.
    17. Lupu, M., Huang, J., Zhu, J., Tait, J. (2009). TREC-CHEM: large scale chemical information retrieval evaluation at TREC. SIGIR Forum, 43(2), 63-70.
    18. Magdy, W., & Jones, G.J.F. (2010). PRES: a score metric for evaluating recall-oriented information retrieval applications. In SIGIR '10: ACM SIGIR conference on research and development in information retrieval (pp. 611-618). ACM.
    19. Manning, C.D., Raghavan, P., Schütze, H. (2008). Introduction to information retrieval. Cambridge: Cambridge University Press.
    20. Mowshowitz, A., & Kawaguchi, A. (2002). Bias on the web. Communications of the ACM, 45(9), 56-60.
    21. Ounis, I., De Rijke, M., Macdonald, C., Mishne, G., Soboroff, I. (2006). Overview of the TREC 2006 blog track. In Proceedings of the Text REtrieval Conference (TREC 2006).
    22. Owens, C. (2009). A study of the relative bias of web search engines toward news media providers. Master's thesis, University of Glasgow.
    23. Robertson, S.E., & Walker, S. (1994). Some simple effective approximations to the 2-Poisson model for probabilistic weighted retrieval. In SIGIR '94: Proceedings of the 17th annual international ACM SIGIR conference on research and development in information retrieval, Dublin, Ireland (pp. 232-241).
    24. Sanderson, M., & Zobel, J. (2005). Information retrieval system evaluation: effort, sensitivity, and reliability. In SIGIR '05: ACM SIGIR conference on research and development in information retrieval (pp. 162-169). ACM.
    25. Singhal, A. (1997). AT&T at TREC-6. In The 6th Text REtrieval Conference (TREC-6) (pp. 227-232).
    26. Singhal, A. (2001). Modern information retrieval: a brief overview. IEEE Data Engineering Bulletin, 24, 34-43.
    27. Vaughan, L., & Thelwall, M. (2004). Search engine coverage bias: evidence and possible causes. Information Processing and Management, 40(4), 693-707.
    28. Voorhees, E.M. (2001). Overview of the TREC 2001 question answering track. In Proceedings of the Text REtrieval Conference (TREC 2001) (pp. 42-51).
    29. Voorhees, E.M. (2002). The philosophy of information retrieval evaluation. In CLEF 2001 (pp. 355-370). Springer.
    30. Voorhees, E.M., & Harman, D.K. (2005). TREC: experiment and evaluation in information retrieval. Cambridge, MA: MIT Press.
    31. Zhai, C. (2002). Risk minimization and language modeling in text retrieval. PhD thesis, Carnegie Mellon University.
  • Author affiliations: Shariq Bashir (1)
    Akmal Saeed Khattak (2)

    1. Center for Science and Engineering, New York University Abu Dhabi, Musaffah, Abu Dhabi, United Arab Emirates
    2. Natural Language Processing Research Group, Department of Computer Science, University of Leipzig, Leipzig, Germany
  • ISSN: 1573-7675
Abstract
In this paper, we perform a number of experiments with large-scale query sets to analyze the retrieval bias of standard retrieval models. These experiments examine how far different retrieval models differ in the retrieval bias they impose on the collection. Alongside the retrieval bias analysis, we also expose a limitation of the standard retrievability scoring function and propose a normalized retrievability scoring function. The results of the retrieval bias experiments show that when a collection has a highly skewed vocabulary distribution, the standard retrievability scoring function does not account for differences in vocabulary richness across the documents of the collection. In such a case, documents with large vocabularies generate many more queries, and such documents therefore have a theoretically larger probability of being retrieved, via a much larger number of queries. We thus propose a normalized retrievability scoring function that mitigates this effect by normalizing each document's retrievability score by its total number of queries. This gives an unbiased picture of the retrieval bias that can arise from vocabulary differences between the documents of a collection, without automatically penalizing retrieval models that favor or disfavor long documents. Finally, to examine which retrievability scoring function is more effective at producing correct retrievability ranks of documents, we compare the two functions using the known-items search method. The known-items search experiments show that the normalized retrievability scoring function is more effective than the standard retrievability scoring function.
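The contrast between the two scoring functions described in the abstract can be sketched in a few lines of Python. This is a minimal illustration, not the authors' implementation: it assumes the cumulative access function (a document scores one point for every query that ranks it within the top c results), and it assumes normalization by a precomputed per-document query count; all function and variable names are hypothetical. The Gini coefficient (Gastwirth 1972, reference 13) is the usual summary of the resulting bias, with 0 meaning perfectly equal access and values near 1 meaning strong retrieval bias:

```python
from collections import defaultdict

def retrievability(run, c=100):
    """Standard retrievability with a cumulative access function:
    r(d) = number of queries that rank d within the top c.
    `run` maps each query id to its ranked list of document ids."""
    r = defaultdict(int)
    for ranking in run.values():
        for doc in ranking[:c]:
            r[doc] += 1
    return dict(r)

def normalized_retrievability(run, queries_per_doc, c=100):
    """Normalized variant sketched from the abstract: divide each
    document's score by the total number of queries that document
    can generate, so vocabulary-rich (long) documents are not
    favored merely because they spawn more queries."""
    r = retrievability(run, c)
    return {d: r.get(d, 0) / n for d, n in queries_per_doc.items()}

def gini(scores):
    """Gini coefficient of a retrievability score distribution."""
    xs = sorted(scores)
    n, total = len(xs), sum(xs)
    if total == 0:
        return 0.0
    cum = sum((2 * i - n - 1) * x for i, x in enumerate(xs, start=1))
    return cum / (n * total)
```

Comparing `gini(retrievability(run, c).values())` against `gini(normalized_retrievability(run, qpd, c).values())` then shows how much of the apparent bias is attributable to vocabulary-size differences rather than to the retrieval model itself.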