Abstract
This paper introduces a citation-based "systems approach" for analyzing the institutional and cognitive dimensions of scientific excellence within national research systems. The methodology, which covers several levels of aggregation, focuses on the most highly cited research papers in the international journal literature. The distribution of these papers across institutions and disciplines enables objective comparisons of their (possible) international-level scientific excellence. By way of example, we present key results from a recent series of analyses of the research system in the Netherlands in the mid-1990s, focussing on the performance of the universities across the major scientific disciplines within the context of the entire system's scientific performance. Special attention is paid to contributions to the world's top 1% and top 10% most highly cited research papers. The findings indicate that these high-performance papers provide a useful analytical framework, both in terms of transparency and of cognitive and institutional differentiation, as well as in their scope for domestic and international comparisons, yielding new indicators for identifying "world class" scientific excellence at the aggregate level. The average citation scores of these academic "Centres of Scientific Excellence" appear to be an inadequate predictor of their production of highly cited papers. However, further critical reflection and in-depth validation studies are needed to establish the true potential of this approach for science policy analyses and evaluation of research performance.
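To make the approach concrete, the sketch below illustrates in Python how a top-percentile indicator of this kind might be computed from publication records. The data, function names, and field-normalised percentile thresholding are illustrative assumptions for exposition only, not the authors' actual CWTS methodology.

```python
# Minimal sketch of a top-percentile excellence indicator, assuming a flat
# list of (institution, field, citations) records. Everything here (data,
# names, thresholding) is an illustrative assumption, not the paper's
# actual implementation.
import numpy as np

papers = [
    ("Univ A", "Physics", 120), ("Univ A", "Physics", 3),
    ("Univ B", "Physics", 45), ("Univ B", "Physics", 7),
    ("Univ A", "Chemistry", 8), ("Univ B", "Chemistry", 300),
]

def top_share(papers, field, pct, institution):
    """Fraction of an institution's papers in `field` at or above the
    world top-`pct`% citation threshold for that field."""
    world = np.array([c for _, f, c in papers if f == field])
    threshold = np.percentile(world, 100 - pct)  # pct=1 -> top 1% cut-off
    inst = [c for i, f, c in papers if f == field and i == institution]
    return sum(c >= threshold for c in inst) / len(inst) if inst else 0.0

# Example: Univ A's share of world top-10% physics papers.
print(top_share(papers, "Physics", 10, "Univ A"))
```

In a real analysis one would also compare each institution's average citation score with its top-percentile share; as the abstract notes, the two need not agree.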
Cite this article
Tijssen, R.J.W., Visser, M.S. & van Leeuwen, T.N. Benchmarking international scientific excellence: Are highly cited research papers an appropriate frame of reference? Scientometrics 54, 381–397 (2002). https://doi.org/10.1023/A:1016082432660