Citation indexes integrated management for Institutional Repositories data enrichment
Abstract
Purpose – Researchers, as well as agencies responsible for evaluating the research activity of academic entities such as laboratories, departments, or entire institutions (e.g., Quality Assurance Units), face the problem of automatically locating and retrieving bibliographic records (e.g., scientific papers) and their citations from the various citation indexes.
Design/methodology/approach – Calculating uniform bibliometric indicators requires deduplicating the documents collected from the different citation indexes. In addition, such a tool could help academic libraries upgrade their Research Repositories with auto-enrichment capabilities, saving their staff valuable labour time.
Findings – In this context, the initial results of implementing such a tool for data extraction from four popular citation indexes (Scopus, Google Scholar, Web of Science and PubMed) and the ORCID service are presented. The tool provides integrated management of multiple citation indexes: it collects data per researcher and applies deduplication algorithms to obtain a list of unique publications for each researcher. The processed data are combined with the data of the Institutional Repository and converted into a format suitable for ingestion.
Originality/value – The Institutional Repository of the Cyprus University of Technology has been selected as a testbed. The obtained results can readily be applied by other universities.
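The deduplication step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' actual algorithm: it assumes each record carries a title and an optional DOI (hypothetical field names), matches on DOI when both records have one, and otherwise falls back to approximate title matching via Python's standard library.

```python
from difflib import SequenceMatcher

def normalize(title: str) -> str:
    # Lowercase and keep only alphanumerics so punctuation and
    # capitalization differences between indexes are ignored.
    return "".join(ch for ch in title.lower() if ch.isalnum())

def is_duplicate(a: dict, b: dict, threshold: float = 0.9) -> bool:
    # Records sharing a DOI refer to the same publication.
    if a.get("doi") and b.get("doi"):
        return a["doi"].lower() == b["doi"].lower()
    # Otherwise fall back to approximate string matching on titles.
    ratio = SequenceMatcher(None, normalize(a["title"]),
                            normalize(b["title"])).ratio()
    return ratio >= threshold

def deduplicate(records: list[dict]) -> list[dict]:
    # Keep the first occurrence of each publication across indexes.
    unique: list[dict] = []
    for rec in records:
        if not any(is_duplicate(rec, kept) for kept in unique):
            unique.append(rec)
    return unique
```

In practice a production tool would likely compare additional fields (authors, year, venue) and tune the similarity threshold per field; the sketch above only shows the core title/DOI matching idea.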
DOI: 10.26265/jiim.v6i1.4490