Case Study: Library Research Metrics Service

This is the first of a series of qualitative case studies exploring the work and impact of Library Research Support activities and services. This case study focuses on the Library Research Metrics Service.

What we do

The Library Research Metrics Service supports individuals with research metrics queries, provides training on a range of research metrics platforms, and delivers education and outreach to ensure the university’s commitments to the responsible use of research metrics are upheld. This is designed to complement support offered by the Department of Research, Enterprise and Innovation’s Research Information and Evaluation team, which has a wider remit covering strategic research intelligence and support for large grant bids.

As well as an email enquiry service and web guidance, the Library Research Metrics Service provides training via online workshops, open to all academics and postgraduate researchers. These serve as an introduction to citation metrics and alternative metrics, what they can and cannot be used for, the principles of responsible metrics, and the importance of data accuracy – including how this may be improved through the use of ORCID researcher identifiers. Sessions also include live demonstrations of the tool, platform, or process of attendees’ choice: for example, how to create bespoke reports in SciVal, how to find alternative metrics in Scopus or Altmetric Explorer, or how to clean up author profiles in Scopus and other bibliographic databases.

Outreach activities are a key part of the support service; currently the main focus is the ORCID promotion campaign. This campaign seeks to increase ORCID sign-up rates among research staff and PGRs and, with support from Faculty Research Directors, will be pursued in a variety of ways:

  1. Direct communication with the small subset of researchers who have an ORCID but have not fully synchronised it with their Pure profile
  2. Talks at School assemblies and other relevant gatherings
  3. PGR-led promotion activities
  4. Passive communication via posters and banners in key locations
  5. Active encouragement via a prize draw for new ORCID signups

Enquiry types

The email enquiry service receives a range of enquiry types: primarily these relate to 1) the use of specific metrics platforms, 2) requests for metrics support for grant or promotion bids, and 3) queries about the use of metrics to support decisions on journal choice. Often a large part of the response to these enquiries is educational rather than direct provision of the resources requested. For example, both DORA and the University’s own statement on Responsible Research Evaluation state that research outputs must be considered on their own merits rather than on the reputation or ranking of the journal or publisher. A significant part of enquiry work is therefore responding sensitively to researchers with these types of queries, explaining why metrics may not be helpful in making these decisions and signposting alternative tools and methods for journal selection. There are some instances where specific metrics can be useful: for example, establishing the proportions of article types published in a given journal to identify titles most likely to be receptive to similar manuscripts. In these instances, the Library Research Metrics Service will demonstrate how these metrics can be obtained or provide bespoke reports.

Another common query category comes from researchers who find unexpected results when seeking metrics data on their own publications: typically, missing publications or missing citations. Support in these instances usually takes two forms: 1) an investigation into and explanation of any data inaccuracies, with suggestions for how these may be addressed, and 2) education on the limitations of metrics platforms – which is particularly relevant for researchers working in disciplines that are not well covered by the main bibliometric platforms, such as the arts and humanities and those publishing in languages other than English.

Outcomes and next steps

Responses to these education and outreach activities have largely been positive, with researchers praising the service for providing “really helpful” information. Certain departments or units – ALSPAC, for example – are frequent users of the service, but most users tend to have a single query only. It remains to be seen whether the raised profile the ORCID promotion campaign gives the Library Research Metrics Service will result in a larger volume of enquiries. In future, workshops will also be run in person, and online workshops will be provided asynchronously to enable wider uptake.

Library launches new researcher metrics support service

Why metrics support?

Research metrics or indicators are quantitative measures designed to evaluate research outputs. The term encompasses citation metrics, also known as bibliometrics, which are based on traditional scholarly citations, and ‘alternative’ metrics based on attention in social media, news, policy documents, code repositories and other online sources. These metrics are increasingly used to benchmark research performance and to indicate research impact in funding applications and to promotion and progression boards, and they feed into university league table rankings and REF2021 assessments.
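As a concrete illustration of how a simple citation metric is computed, the sketch below calculates the h-index (a widely used bibliometric, though not one named above) from a list of per-paper citation counts. The citation counts here are made-up example data, not drawn from any platform mentioned in this article:

```python
def h_index(citations):
    """Return the h-index: the largest h such that at least
    h papers have h or more citations each."""
    ranked = sorted(citations, reverse=True)  # most-cited first
    h = 0
    for rank, count in enumerate(ranked, start=1):
        if count >= rank:
            h = rank  # this paper still supports an h of `rank`
        else:
            break
    return h

# Hypothetical citation counts for one researcher's papers
print(h_index([10, 8, 5, 4, 3]))  # 4: four papers have >= 4 citations each
```

Even this tiny example shows why such indicators need careful interpretation: the same h-index can arise from very different citation distributions.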

It’s attractive to think that the complexities of evaluating one piece of research against another could be simplified by using metrics, but these indicators have serious limitations that must be acknowledged if they are to be used effectively.  Metrics are significantly affected by differences in citation patterns across disciplines, sub-specialities, and researcher career stage, and can be subject to ‘gaming’ – deliberate inflation of citation counts.  As a result, qualitative review must always be used alongside a range of indicators to give a true picture of research impact.

“Carefully selected indicators can complement decision-making, but a ‘variable geometry’ of expert judgement, quantitative indicators and qualitative measures that respect research diversity will be required.”

Wilsdon, J., et al. (2015). The Metric Tide: Report of the Independent Review of the Role of Metrics in Research Assessment and Management. DOI: 10.13140/RG.2.1.4929.1363

With this in mind, the library’s Research Support team has launched a researcher metrics support service to help researchers access accurate metrics data, and select and interpret appropriate indicators.  The scope of the service was determined in consultation with Research and Enterprise Development (RED); library support will focus on individual researchers, whereas RED will retain support for strategic bids and projects requiring metrics information.

Service priorities

Useful research metrics depend on the quality of the source data: accurately attributed publications. A key task for the metrics service will be to help researchers correct attribution information for their publications. The University subscribes to SciVal for access to research management information based on Scopus citation data, so initially this support will focus on Scopus author profiles, although other systems will be added later if there is demand. Additionally, we promote the use of ORCID researcher identifiers to easily link author profiles in different systems, including Scopus and the University’s current research information system, Pure.
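One practical detail worth knowing is that an ORCID iD carries a built-in check digit (ISO 7064 MOD 11-2 computed over the first 15 digits), so mistyped identifiers can often be caught before any profile lookup. The validator below is an illustrative sketch of that published checksum scheme, not part of any system described here:

```python
def valid_orcid(orcid):
    """Check an ORCID iD of the form XXXX-XXXX-XXXX-XXXX using the
    ISO 7064 MOD 11-2 check digit (last character; 'X' encodes 10)."""
    digits = orcid.replace("-", "")
    if len(digits) != 16:
        return False
    base, check = digits[:15], digits[15]
    if not base.isdigit() or check not in "0123456789X":
        return False
    total = 0
    for d in base:  # MOD 11-2 running total over the first 15 digits
        total = (total + int(d)) * 2
    expected = (12 - total % 11) % 11
    return check == ("X" if expected == 10 else str(expected))

# ORCID's well-known example iD (Josiah Carberry) passes the check
print(valid_orcid("0000-0002-1825-0097"))  # True
```

A single-digit typo changes the expected check character, so the same call on a corrupted iD returns False.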

Other library support offered to researchers includes:

      • web guidance
      • workshops (in development)
      • SciVal deskside training
      • enquiry service

The online guidance covers a range of topics, including an overview of important indicators and where they can be accessed, suggested use cases for metrics, an introduction to different tools available to access and analyse indicators, and signposts to the support available from RED and other departments.

Access our guidance at bris.ac.uk/staff/researchers/metrics/ or email lib-metrics@bristol.ac.uk for support.