Abstract
This paper analyzes the positions of private sector institutions in bibliometric rankings covering as many as 27,000 research institutions and highlights factors that are crucial for a proper interpretation of those positions. Among the institutions with the largest output of published research articles, private firms are underrepresented, whereas in the top quartile of institutions with the largest citation impact they are overrepresented. A firm's publication output is not a good indicator of its R&D investment: large firms in Pharmaceuticals are both heavy investors in R&D and frequent publishers of scientific articles, whereas firms in Automobiles tend to invest heavily in R&D while publishing little. This is ascribed to the fact that the former need validation of their results by the scientific community, whereas the latter need such validation to a lesser extent. Private institutions generating the largest citation impact tend to collaborate with the best public research institutions, which reflects the crucial importance of publicly funded research for the private sector.
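To make the notion of over- and underrepresentation in a quartile concrete, the following minimal sketch (in Python, with hypothetical institutions, column names, and values that are not taken from the study) compares the share of private firms in the top quartile of an indicator with their share among all institutions. It illustrates the kind of comparison described above, not the authors' actual method or data.

import pandas as pd

# Hypothetical institution-level data: sector label, publication output,
# and a citation-impact score. None of these values come from the paper.
df = pd.DataFrame({
    "institution": ["A", "B", "C", "D", "E", "F", "G", "H"],
    "sector": ["firm", "university", "university", "firm",
               "university", "firm", "university", "university"],
    "output": [120, 5400, 3200, 80, 2100, 450, 1700, 900],
    "citation_impact": [2.4, 1.1, 1.3, 2.9, 0.9, 1.8, 1.0, 1.2],
})

overall_firm_share = (df["sector"] == "firm").mean()

for indicator in ["output", "citation_impact"]:
    cutoff = df[indicator].quantile(0.75)          # top-quartile threshold
    top = df[df[indicator] >= cutoff]
    firm_share = (top["sector"] == "firm").mean()
    print(f"{indicator}: firm share in top quartile = {firm_share:.2f} "
          f"(overall firm share = {overall_firm_share:.2f})")

A firm share in the top quartile that exceeds the overall firm share would indicate overrepresentation on that indicator; a lower share would indicate underrepresentation.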
Acknowledgments
The research carried out by the second author was funded by the CSIC JAE-Doc Program, the European Social Fund, and the JDC-MICINN Program of the Spanish Ministry of Science and Innovation. We would like to thank Fernando Galindo-Rueda from the EAS division of the OECD for providing valuable R&D economic and financial information.
Cite this article
de Moya-Anegón, F., López-Illescas, C. & Moed, H.F. How to interpret the position of private sector institutions in bibliometric rankings of research institutions. Scientometrics 98, 283–298 (2014). https://doi.org/10.1007/s11192-013-1087-4