DOI: 10.1145/3295750.3298978

Investigating the Effects of Popularity Data on Predictive Relevance Judgments in Academic Search Systems

Published: 08 March 2019

Abstract

The elements of a surrogate serve as clues to relevance. They may be seen as operationalized relevance criteria by which users judge the relevance of a search result against their information need. In addition to short textual summaries, today's academic search systems integrate additional data into their search results presentation, for example, the number of citations or the number of downloads. This kind of data can be described as popularity data, and it also serves as a factor in search engines' ranking algorithms. Past research shows that diverse criteria and factors are involved in relevance judgments from the user perspective. However, previous empirical studies on relevance criteria and clues examined surrogates that did not include popularity data. The goal of my doctoral research is to gain substantial knowledge of the criteria by which users in an academic search situation make relevance judgments based on surrogates that include popularity data. This paper describes the current state of the experimental research design and the method of data collection.

Cited By

View all
  • (2021) The Search Studies Group at Hamburg University of Applied Sciences. Datenbank-Spektrum. DOI: 10.1007/s13222-021-00375-x. Online publication date: 22-Jun-2021
  • (2019) Information Retrieval in Food Science Research II: Accounting for Relevance When Evaluating Database Performance. Journal of Food Science, 84(10), 2729-2735. DOI: 10.1111/1750-3841.14769. Online publication date: 24-Sep-2019

Published In

CHIIR '19: Proceedings of the 2019 Conference on Human Information Interaction and Retrieval
March 2019
463 pages
ISBN:9781450360258
DOI:10.1145/3295750
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. experimental design
  2. predictive judgments
  3. relevance behavior

Qualifiers

  • Abstract

Conference

CHIIR '19

Acceptance Rates

Overall Acceptance Rate 55 of 163 submissions, 34%

Article Metrics

  • Downloads (last 12 months): 6
  • Downloads (last 6 weeks): 0
Reflects downloads up to 14 Dec 2024
