Interviews

"We must increase the entire university community's contribution to knowledge production and visibility"

Photo: FUOC

25/10/2019
Rubén Permuy Iglesias
Isidro Aguillo, expert in scientific research assessment

How should we be assessing scientific output so it doesn't interfere with research itself? What role should universities and research centres and institutions be playing? And what about scientific publishers? We talk to Isidro F. Aguillo, head of the Cybermetrics Lab at the Institute of Public Goods and Policies (IPP), part of the Spanish National Research Council (CSIC). With a bachelor's degree in Biology, a master's degree in Information Science, a diploma in Advanced Studies, and an honorary doctorate from both the University of Indonesia and the National Research Nuclear University in Moscow, Aguillo is a renowned expert in cybermetrics, the measurement, study and analysis of all manner of data held on the internet. As a researcher, he is part of the team behind the scientific journal Cybermetrics, the first CSIC peer-reviewed e-journal – in other words, one applying one of the most internationally recognized validation methods for scientific content – as well as the Ranking Web of Universities, which measures the online presence and visibility of over 28,000 academic institutions around the world.

What does your work in cybermetrics involve and what is the Lab's objective?

The Lab has two objectives. The first is purely scientific: to promote a quantitative approach, which we believe to be more objective and less biased, to the study of academic content and online scientific communication processes. The second is to have a much greater social impact, focusing on promoting so-called open science and how it can help achieve the UN's Sustainable Development Goals.

Some of the articles you've published have criticized how much some universities and research centres spend on publishing scientific papers in certain journals. Can you tell us a bit more about this standpoint?

We fully support open access initiatives, which have already made a huge number of scientific papers available to readers all over the world. However, the pay-to-publish model has become excessively expensive, a situation brought about by some of the biggest names in publishing. They are able to generate exorbitant profits by exploiting the individuals who write and review the papers they publish, charging fees that may well be dictated by market norms linked to the journals' prominence.

As a member of several expert committees for the European Commission's Directorate-General for Research and Innovation, you've shed light on the differences and shortfalls in classic research assessment models. Which would you say are most significant and what alternatives would you propose?

Our work was originally geared towards promoting greater diversity in the types of indicators used in these assessments, as well as incorporating web metrics. Now that open science requires new sources and indicators, our concern is with their rational, informed and responsible use. Regarding these indicators, we would always opt for relative rather than absolute indicators and insist upon contextualization, both of production and of visibility, impact or use, taking into account, for example, personal, institutional, disciplinary, temporal or economic determinants. These ideas are also reflected in the so-called Leiden Manifesto.

How should scientific output be assessed in the social sciences and humanities?

Citation analysis should continue to hold a central position in assessment processes, but it should also be complemented with more comprehensive and less biased sources. We have to be more rigorous in our efforts to meet publication standards and support this assessment with expert opinion. A mixed approach such as this, which is both quantitative and qualitative, is valid for all types of assessment, so long as we prioritize transparency when carrying out these processes and when justifying the results.

University teaching staff should be teaching as well as conducting research. There are national and international accreditation agencies – such as the National Agency for Quality Assessment and Accreditation (ANECA) in Spain – which assess the quality and development of careers in science. Would a researcher's shining CV be enough to get them accredited, in other words, to be officially recognized as a science professional?

In general, accreditation has been a good move, but unfortunately it has not done away with either nepotism or endogamy within our universities. However, excessive automation of processes, undeserved trust in poorly contextualized quantitative criteria and a lack of transparency have limited its value in cases that call for more flexibility. We could start to take steps towards solving this issue simply by conducting one-on-one interviews that are then placed in the public domain.

Do you think ANECA could improve the criteria and procedures they apply in Spain? Should there be accreditation agencies that evaluate research activity?

Assessment agencies are necessary, but we also need national scientific data systems that are exhaustive and accurate and provide assessment committees with trustworthy data. Both these systems should be created and overseen by professionals and they should have extensive and independent resources, but above all they need to be public; they need to prioritize the transparency and topicality of the data they hold. Then, once this quality data has been gathered, the assessment criteria and procedures can be designed in line with current demand and be redeveloped in an open context where users can interact freely with its content.

A debate is emerging within the scientific community about a journal ranking published by the Spanish Foundation for Science and Technology (FECYT), which aims to serve ANECA and other assessment agencies as a tool for improving accreditation systems that look at the quality of papers published by researchers from the social sciences and humanities. What's your view on the matter?

The use of journal metrics to assess individual output has been formally rejected by the international community as part of the San Francisco Declaration on Research Assessment (DORA). Moreover, the formulation of this ranking was kept completely under wraps, and its methodology was applied without proper peer review.

The UOC is one of the many universities to have signed the San Francisco Declaration on Research Assessment (DORA) in order to develop a scientific assessment model that takes quality into consideration. What do you think about DORA and what measures can universities adopt in order to adhere to its principles?

The declaration obliges all signatories to stop using the impact factor and all journal metrics in their assessment of individuals, groups or institutions. Unfortunately, the entire Spanish research and academic network is still using these indicators.

You also attended the UOC's Scientific Publishing seminar. In your opinion, what is the current state of the Spanish scientific journal panorama?

Spain needs a select number of important multidisciplinary journals, not hundreds of journals with a certain level of formal and procedural quality but minimal appeal.

You've been involved in launching the Ranking Web of Universities from around the world: what's it all about and what criteria does it use?

The point of these lists is neither to rank universities nor their websites, but rather to promote the open access publishing of the knowledge generated by these institutions. The objective is to increase the amount of quality data that is so often lacking online. Our criteria value the quantity and quality of academic information published online, as well as the commitment, not just of teaching staff but of the whole academic community, to making this information, and the impact of its content, accessible to society at large.

The UOC is ranked 38th in Spain and 895th in the world. What do you make of our positioning? What should we be thinking about changing if we want to improve it?

The UOC's ranking position represents a huge recognition for open universities, which are often very much undervalued elsewhere. The key to improvement cannot be found in initiatives handed down from the higher echelons; rather, it is essential to increase the contribution of all members of the community, boosting the online presence not just of teaching and research staff, but also of library, technical and administrative staff and, of course, students. The institution's wealth and diversity have to be reflected in a public, online space.

General elections are once again around the corner. What advice would you give to the person taking up the position of Minister for Science, Innovation and Universities?

That question is a bit easier to answer: I would ask them to substantially increase funding so we can meet European standards!