Evaluating institutional open access performance: Sensitivity analysis

Main Authors: Huang, Chun-Kai, Neylon, Cameron, Hosking, Richard, Montgomery, Lucy, Wilson, Katie, Ozaygen, Alkim, Brookes-Kenworthy, Chloe
Format: Journal Article
Published: 2020
Online Access: https://zenodo.org/record/3716067
Table of Contents:
  • In the article “Evaluating institutional open access performance: Methodology, challenges and assessment” we develop the first comprehensive and reproducible workflow that integrates multiple bibliographic data sources for evaluating institutional open access (OA) performance. The major data sources include Web of Science, Scopus, Microsoft Academic, and Unpaywall. However, each of these databases continues to update, both actively and retrospectively. This implies that the results produced by the proposed process are potentially sensitive to both the choice of data source and the versions used. In addition, there remain issues relating to the use of different definitions of the various open access categories, and to selection bias arising from sample size and margin of error. The current work shows that the levels of sensitivity relating to the above issues can be significant at the institutional level. Hence, transparency and clear documentation of the choices made regarding data sources (and their versions), definitions, and cut-off boundaries are vital for reproducibility and verifiability. (A sketch of how the definitional sensitivity alone can shift measured OA levels follows this list.)
  • A companion white paper to the article "Evaluating institutional open access performance: Methodology, challenges and assessment"
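As a minimal illustration of the definitional sensitivity described in the abstract, the Python sketch below looks up each article's `oa_status` via the Unpaywall v2 REST API (a real endpoint that requires a contact email) and computes an institution's OA share under two definitions: counting and not counting bronze OA. This is not the authors' actual workflow; the DOI list and email address are placeholders, and a real analysis would draw DOIs from the integrated bibliographic sources named above.

```python
import requests

# Placeholder inputs: a small DOI sample standing in for one
# institution's output, and a contact email required by Unpaywall.
DOIS = ["10.7717/peerj.4375"]  # example DOI; replace with real samples
EMAIL = "you@example.org"      # placeholder contact email

def oa_status(doi: str, email: str) -> str:
    """Return Unpaywall's oa_status for a DOI:
    'gold', 'green', 'hybrid', 'bronze', or 'closed'."""
    url = f"https://api.unpaywall.org/v2/{doi}"
    resp = requests.get(url, params={"email": email}, timeout=30)
    resp.raise_for_status()
    return resp.json().get("oa_status", "closed")

def oa_percentage(statuses, count_bronze: bool) -> float:
    """OA share under one definition: bronze articles (free to read
    but without an open licence) may or may not count as OA."""
    open_statuses = {"gold", "green", "hybrid"}
    if count_bronze:
        open_statuses.add("bronze")
    n_open = sum(s in open_statuses for s in statuses)
    return 100.0 * n_open / len(statuses)

statuses = [oa_status(d, EMAIL) for d in DOIS]
print("OA% incl. bronze:", oa_percentage(statuses, count_bronze=True))
print("OA% excl. bronze:", oa_percentage(statuses, count_bronze=False))
```

Running the same DOI sample through both definitions makes the paper's point concrete: the two printed percentages can differ substantially for institutions with many bronze articles, which is why the cut-off and category choices must be documented for results to be reproducible.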