Abstract
The way scientific activity in the social sciences and humanities is produced and disseminated makes this work less visible in international databases than that of other fields. The same circumstances limit the evaluation of journals, researchers and institutions and complicate quality-based comparisons of scientific-scholarly journals. This study presents a methodology developed to classify Spanish journals in the social sciences and humanities. With the aim of including national journals in research evaluation processes, the Spanish Foundation for Science and Technology (FECYT) has sponsored and coordinated the development of a classification that ranks the journals holding the FECYT Quality Seal. For this purpose, a model with two dimensions, impact and visibility, based on quantitative criteria was developed. To obtain indicators for each dimension, the Science Citation Index, the Social Science Citation Index, the Arts & Humanities Citation Index and the Emerging Sources Citation Index of the Web of Science Core Collection, as well as Scopus, SciELO, Google Metrics and the Information Matrix for the Analysis of Journals (MIAR), are used as sources of information. Spanish evaluation agencies have recently announced the implementation of this evaluation methodology in various disciplines.
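The general logic of a two-dimension journal classification of the kind described above can be sketched in a few lines of code. This is a minimal illustrative sketch, not the FECYT methodology itself: the indicator values, journal names, min-max normalization and equal dimension weights are all assumptions made for the example.

```python
# Hypothetical two-dimension classification: each journal receives an
# "impact" score (citation-based indicators) and a "visibility" score
# (database-coverage indicators). Both are min-max normalized to [0, 1],
# combined with equal weights, and journals are ranked by the result.
# All names and numbers are invented for illustration only.

def minmax(values):
    """Min-max normalize a list of numbers to the [0, 1] range."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

def rank_journals(journals):
    """journals: list of (name, impact_raw, visibility_raw) tuples.
    Returns journal names sorted from highest to lowest combined score."""
    impact = minmax([j[1] for j in journals])
    visibility = minmax([j[2] for j in journals])
    combined = {
        j[0]: 0.5 * i + 0.5 * v  # equal weights: an assumption
        for j, i, v in zip(journals, impact, visibility)
    }
    return sorted(combined, key=combined.get, reverse=True)

sample = [
    # (name, citations per paper, number of databases indexing it)
    ("Revista A", 3.0, 40),
    ("Revista B", 1.0, 60),
    ("Revista C", 2.0, 20),
]
print(rank_journals(sample))  # → ['Revista A', 'Revista B', 'Revista C']
```

In practice each dimension would aggregate several indicators drawn from the sources named in the abstract (Web of Science, Scopus, SciELO, Google Metrics, MIAR), and the weighting scheme would be set by the evaluation agency rather than fixed at 0.5/0.5.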
Acknowledgements
This study was sponsored by the Spanish Foundation for Science and Technology (FECYT), which carries out an ongoing evaluation of journals and awards the FECYT Quality Seal. The authors would like to thank FECYT officers Cristina González Copeiro, Pilar Rico and María Ángeles Coslado, as well as Dr. Evaristo Jiménez Contreras, Lluis Codina and especially Remedios Melero, for their review of and contributions to the methodology.
Cite this article
De Filippo, D., Aleixandre-Benavent, R. & Sanz-Casado, E. Toward a classification of Spanish scholarly journals in social sciences and humanities considering their impact and visibility. Scientometrics 125, 1709–1732 (2020). https://doi.org/10.1007/s11192-020-03665-5