Evaluation effort, reliability and reusability in XML retrieval
Sukomal Pal, Mandar Mitra and Jaap Kamps
Journal of the American Society for Information Science and Technology, 2011, vol. 62, issue 2, 375-394
Abstract:
The Initiative for the Evaluation of XML retrieval (INEX) provides a TREC‐like platform for evaluating content‐oriented XML retrieval systems. Since 2007, INEX has been using a set of precision‐recall based metrics for its ad hoc tasks. The authors investigate the reliability and robustness of these focused retrieval measures, and of the INEX pooling method. They explore four specific questions: How reliable are the metrics when assessments are incomplete, or when query sets are small? What is the minimum pool/query‐set size that can be used to reliably evaluate systems? Can the INEX collections be used to fairly evaluate “new” systems that did not participate in the pooling process? And, for a fixed amount of assessment effort, would this effort be better spent in thoroughly judging a few queries, or in judging many queries relatively superficially? The authors' findings validate properties of precision‐recall‐based metrics observed in document retrieval settings. Early precision measures are found to be more error‐prone and less stable under incomplete judgments and small topic‐set sizes. They also find that system rankings remain largely unaffected even when assessment effort is substantially (but systematically) reduced, and confirm that the INEX collections remain usable when evaluating nonparticipating systems. Finally, they observe that for a fixed amount of effort, judging shallow pools for many queries is better than judging deep pools for a smaller set of queries. However, when judging only a random sample of a pool, it is better to completely judge fewer topics than to partially judge many topics. This result confirms the effectiveness of pooling methods.
Date: 2011
Downloads: https://doi.org/10.1002/asi.21403
Persistent link: https://EconPapers.repec.org/RePEc:bla:jamist:v:62:y:2011:i:2:p:375-394
Ordering information: https://doi.org/10.1002/(ISSN)1532-2890
More articles in Journal of the American Society for Information Science and Technology from Association for Information Science & Technology