Inter‐rater reliability of h‐index scores calculated by Web of Science and Scopus for clinical epidemiology scientists
Health Information & Libraries Journal
Published online on May 11, 2016
Abstract
Objective
We investigated the inter‐rater reliability of Web of Science (WoS) and Scopus when calculating the h‐index of 25 senior scientists in the Clinical Epidemiology Program of the Ottawa Hospital Research Institute.
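Both databases compute the h-index from an author's citation counts using the standard definition (the largest h such that the author has h papers with at least h citations each). As a minimal illustration of that definition, not of either database's internal implementation, a Python sketch:

```python
def h_index(citations):
    """Return the largest h such that at least h papers
    have h or more citations each."""
    h = 0
    for i, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= i:
            h = i  # the i-th most-cited paper still has >= i citations
        else:
            break
    return h

# Example: six papers cited [10, 8, 5, 4, 3, 0] times
print(h_index([10, 8, 5, 4, 3, 0]))  # → 4
```

Here the fourth most-cited paper has 4 citations, but the fifth has only 3, so h = 4.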
Materials and methods
Bibliometric information and the h‐indices for the subjects were computed by four raters using the automatic calculators in WoS and Scopus. Correlation and agreement between ratings were assessed using Spearman's correlation coefficient and a Bland–Altman plot, respectively.
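The two statistics used here are standard: Spearman's coefficient correlates the rank order of two raters' scores, and a Bland–Altman analysis summarises the mean difference between raters and its 95% limits of agreement. A minimal Python sketch of both computations (the variable names and example data are illustrative, not the study's data):

```python
def rank(values):
    """Rank values from 1..n, averaging ranks over ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1  # extend over a run of tied values
        avg = (i + j) / 2 + 1  # average rank for the tied run
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman's rho: Pearson correlation of the ranks."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

def bland_altman_limits(x, y):
    """Mean rater difference and 95% limits of agreement."""
    diffs = [a - b for a, b in zip(x, y)]
    n = len(diffs)
    mean = sum(diffs) / n
    sd = (sum((d - mean) ** 2 for d in diffs) / (n - 1)) ** 0.5
    return mean, mean - 1.96 * sd, mean + 1.96 * sd

# Hypothetical h-index scores from two raters for five scientists
rater1 = [12, 20, 8, 15, 30]
rater2 = [11, 22, 8, 14, 28]
print(spearman(rater1, rater2))
print(bland_altman_limits(rater1, rater2))
```

A mean difference near zero indicates no systematic rater bias, while narrower limits of agreement indicate closer agreement between raters, which is how the study compares Scopus and WoS.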
Results
Data could not be gathered from Google Scholar due to feasibility constraints. Spearman's rank correlation between raters' h‐index scores was 0.81 (95% CI 0.72–0.92) for WoS and 0.95 (95% CI 0.92–0.99) for Scopus. The Bland–Altman plot showed no significant rater bias in either WoS or Scopus; however, agreement between ratings was higher in Scopus than in WoS.
Conclusion
Our results showed a stronger correlation and greater agreement between raters when calculating the h‐index of a scientist using Scopus compared to WoS. The higher inter‐rater reliability, together with Scopus's simpler user interface, may make it the more effective database for calculating the h‐index of senior scientists in epidemiology.