Is relevance hard work? Evaluating the effort of making relevant assessments

Robert Villa, Martin Halvey

Research output: Chapter in Book/Report/Conference proceeding › Chapter (peer-reviewed)

21 Citations (Scopus)


The judging of relevance has been a subject of study in information retrieval for a long time, especially in the creation of relevance judgments for test collections. While the criteria by which assessors judge relevance have been intensively studied, little work has investigated the process individual assessors go through to judge the relevance of a document. In this paper, we focus on the process by which relevance is judged, and in particular, the degree of effort a user must expend to judge relevance. By better understanding this effort in isolation, we may provide data which can be used to create better models of search. We present the results of an empirical evaluation of the effort users must exert to judge the relevance of a document, investigating the effect of relevance level and document size. Results suggest that 'relevant' documents require more effort to judge when compared to highly relevant and not relevant documents, and that effort increases as document size increases.
Original language: English
Title of host publication: Proceedings of the 36th International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR '13)
Place of publication: New York
Number of pages: 4
Publication status: Published - 2013
Event: 36th International ACM SIGIR Conference on Research and Development in Information Retrieval - Dublin, Ireland
Duration: 28 Jul 2013 - 1 Aug 2013


Conference: 36th International ACM SIGIR Conference on Research and Development in Information Retrieval


Keywords:
  • information retrieval
  • relevance judgements
  • document relevance
  • information seeking behaviour
