Research Intelligence Services

Responsibly answering questions about research impact

The Research Intelligence team provides support in responsibly answering questions about the (potential) academic and societal impact of research conducted at our university.

Impact

When we talk about impact, we make a distinction between academic impact and societal impact.

The first refers to the impact on fellow scientists and the research field in a broader sense. The second refers to the impact on a specific target group outside academia. This impact can be direct, but it is usually indirect, via influence on policy, professional practice or industry.

Strictly speaking, you can only speak of impact when research has brought about a change. Much of what we classify as scientific or societal impact is actually potential impact: a step towards bringing about change.

The Coordination Point Research Impact, in which the UM Research Intelligence team is also represented, identifies four perspectives on impact:

  • Publish
    This is the usual step in which the scientist publishes research output. Through the publication process, the publication usually appears in important databases such as WoS or Scopus. This also includes all forms of Open Access publishing and the alternative channels scientists use to reach their audience.
  • Showcase
    These are the means that the researcher has at their disposal to showcase the publications and the research. This could be a personal website, the university website with the personal profile page, or listings in other places. By registering in a CRIS, the metadata of publications will also be visible in other places.
  • Promote
    This includes any additional activity to promote and draw attention to the scientific result.
  • Monitor/analyse
    Monitoring research output in various media and sources. These analyses are based on classic indicators from databases such as WoS and InCites, as well as alternative metrics. By taking measurements at different points in time, the effect of certain promotional activities can also be made visible.

The activities of the Research Intelligence team mainly relate to the fourth perspective. You can obtain insight into (the effect of) activities in the first three perspectives through monitoring and analysis. Through our workshops and information on the website, we also support researchers in applying the first three perspectives, and as such, contribute to creating the prerequisites for performing good analyses.

Responsible metrics

At Maastricht University Library, we are committed to the responsible use of metrics.

We believe that research intelligence should support, not replace, qualitative evaluations. Analyses should be based on a careful selection of indicators and methods tailored to the particular goals of the evaluation.

Research intelligence should be driven by questions, not indicators; in other words, measure what you want to know rather than what you can measure. Excellence and performance are multidimensional concepts, and metrics can capture only some of their aspects.

Various developments are ongoing regarding research evaluation and assessment: the signing of the San Francisco Declaration on Research Assessment (DORA), the Leiden Manifesto, the Strategy Evaluation Protocol 2021-2027 (SEP) and Recognition & Rewards. What all these initiatives have in common is:

  1. Greater flexibility in the aspects on which both scientists and research units can be assessed;
  2. The use of a narrative supported by responsible metrics, instead of a focus on excellence (and thus competition) and on metrics such as the JIF or the h-index.

DORA

On October 25, 2019, Rianne Letschert signed the DORA declaration on behalf of UM.

Following DORA, the Research Intelligence team does not use the Journal Impact Factor (JIF) or other journal-based metrics to assess individuals or individual publications.

Furthermore, DORA states that consideration should be given to the value and impact of all research outputs (including datasets and software) in addition to research publications and that a broad range of impact measures should be considered, including qualitative indicators of research impact, such as influence on policy and practice.

For organizations that supply metrics, such as the Research Intelligence team, DORA states that they should:

  • Be open and transparent by providing data and methods used to calculate all metrics.
  • Provide the data under a license that allows unrestricted reuse, and provide computational access to data, where possible.
  • Be clear that inappropriate manipulation of metrics will not be tolerated; be explicit about what constitutes inappropriate manipulation and what measures will be taken to combat this.
  • Account for the variation in article types (e.g., reviews versus research articles) and in different subject areas when metrics are used, aggregated or compared.

The Research Intelligence team guarantees these principles as follows:

  • The data and methods used are described in the justification/methods section of our reports;
  • Pure and InCites data are available, and use of the Dashboard guarantees reproducibility;
  • We check how the analyses or figures we deliver are included in the final report and for what purpose journal-based metrics are requested. Journal-based metrics are not provided for the assessment of individual scientists or individual articles;
  • Normalization takes the variation in article types into account. Furthermore, the Dashboard can automatically include only reviews and articles in the InCites numbers, preventing outliers caused by other document types. There is also a filter for peer-reviewed publications on the Open Access tab in the Dashboard.
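Field normalization of the kind mentioned above compares a publication's citation count to the expected count for comparable papers, i.e. papers from the same field, publication year and document type. A minimal sketch of the idea, using purely hypothetical baseline values (real baselines come from databases such as InCites):

```python
# Hypothetical baselines: expected citations per paper for a given
# field, publication year and document type (illustrative values only).
baselines = {
    ("immunology", 2020, "article"): 18.4,
    ("mathematics", 2020, "article"): 3.1,
}

def normalized_impact(citations, field, year, doc_type):
    """Citations divided by the expected citations for comparable papers.

    A value of 1.0 means 'cited exactly as often as the average paper'
    in that field, year and document type.
    """
    return citations / baselines[(field, year, doc_type)]

# The same raw count means very different things in different fields:
print(round(normalized_impact(12, "immunology", 2020, "article"), 2))   # 0.65
print(round(normalized_impact(12, "mathematics", 2020, "article"), 2))  # 3.87
```

This is why raw citation counts should never be compared across fields without such a correction.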

Leiden Manifesto

The Leiden Manifesto contains 10 principles that are intended as a guideline for research evaluation.

The following principles from this manifesto are relevant to our Research Intelligence services:

  • Quantitative evaluation supports qualitative assessment by experts;
  • Measure performance against the research missions of an institute, group or researcher;
  • Give those who are being evaluated the opportunity to verify data and analyses;
  • Take into account variations between disciplines in publication and citation practices;
  • Base the assessment of individual scientists on a qualitative assessment of their portfolio (and not on their H-index);
  • Prevent misplaced concreteness and false precision;
  • Recognise the effects assessments and indicators have on the system;
  • Regularly review the indicators used and update them where necessary.

Why not the Journal Impact Factor?

The Journal Impact Factor (JIF) is frequently used as the primary parameter to compare the scientific output of individuals and institutions.

The Journal Impact Factor was created as a tool to help librarians identify journals to purchase, not as a measure of the scientific quality of research in an article. With that in mind, it is critical to understand that the Journal Impact Factor has several well-documented deficiencies as a tool for research assessment.

These limitations include:

  • Citation distributions within journals are highly skewed;
  • The properties of the Journal Impact Factor are field-specific and therefore not comparable between fields: it is a composite of multiple, highly diverse article types, including primary research papers and reviews;
  • Journal Impact Factors can be manipulated (or “gamed”) by editorial policy;
  • Data used to calculate the Journal Impact Factors are neither transparent nor openly available to the public.

Source: https://sfdora.org/read
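The skewness problem can be illustrated with a few lines of code. The JIF is essentially a mean: citations received in one year to items a journal published in the two preceding years, divided by the number of citable items. With a skewed distribution, that mean says little about a typical article (the citation counts below are invented for illustration):

```python
import statistics

# Hypothetical citation counts for the 20 items a journal published
# in the two preceding years (illustrative numbers, not real data).
citations = [0, 0, 0, 1, 1, 1, 2, 2, 2, 3, 3, 4, 4, 5, 6, 8, 10, 15, 40, 120]

# JIF-style calculation: total citations divided by citable items.
jif_like = sum(citations) / len(citations)
median = statistics.median(citations)

print(f"mean (JIF-like): {jif_like:.2f}, median: {median}")
# mean (JIF-like): 11.35, median: 3.0
```

Two highly cited papers push the journal-level mean to 11.35, while the typical article in this journal received only 3 citations; the JIF therefore tells you almost nothing about any individual article published in it.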

For more information on the limitations of the Journal Impact Factor as a tool for research assessment:

 

Why not the H-index?

CWTS infographic: Halt the h-index. The original PDF is available for download.
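The h-index is defined as the largest number h such that a researcher has h papers with at least h citations each. A short sketch shows why it is a poor basis for assessing individuals: very different publication profiles can collapse to the same number (the citation lists are invented for illustration):

```python
def h_index(citations):
    """Largest h such that h of the papers have at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(counts, start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h

# Two very different profiles yield the same h-index:
steady = [6, 6, 6, 6, 6, 6]      # six solidly cited papers
one_hit = [500, 6, 6, 6, 6, 6]   # one landmark paper plus five others
print(h_index(steady), h_index(one_hit))  # 6 6
```

The landmark paper with 500 citations is invisible in the h-index, just as field differences and career length are; this one-dimensionality is a core argument in the CWTS infographic above.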

Support for UM faculties/research groups and support staff

We provide our support in the context of a standard research evaluation (SEP) or on a question-driven basis.

While standard research evaluations are part of the university library’s general service for all UM faculties, charges may apply for other analyses, depending on the scope of the question. To find out more, please get in touch with us via research-i@maastrichtuniversity.nl.

If you are affiliated with FHML, please contact Cecile Nijland of the FHML office first.

Research evaluations (SEP)

The Research Intelligence team supports UM faculties or research units in preparing periodic research evaluations based on the Strategy Evaluation Protocol 2021-2027 [PDF].

Together with the unit under assessment, we decide on suitable indicators that fit the unit’s aims and strategy.

These indicators can support both accountability for past performance and strategic decision-making for the future. Please get in touch with us well in advance of your unit’s SEP (midterm) evaluation.

UM Research Intelligence Dashboard

The Dashboard provides insights by linking the research output registered in Pure to data from sources such as InCites, Web of Science, Unpaywall and Altmetric.

The Dashboard offers the possibility to apply various filters directly to the available data so that you can easily adapt the arrangement of the data and the level of detail to your needs.

Access to and use of the Dashboard is on request. A dataset is prepared, and access is provided to the particular dataset in the Dashboard via authentication.

Other analyses and/or visualisations

The data and plots in the Dashboard can be regarded as our standard service, especially aimed at research evaluation for the Strategy Evaluation Protocol (SEP).

Your evaluation may require additional analyses and/or visualisations. Depending on the nature and scope of additional services, costs may be charged for this.

Alternative metrics

Altmetric is one of the platforms that gather alternative metrics, or ‘altmetrics’ for short: metrics that aim to capture the societal impact of research output, in contrast to traditional metrics such as citation counts.

As altmetrics are volatile and never fully captured by any single database, we recommend not focusing on the numbers themselves. Altmetric data can, however, be a valuable source for identifying who is building on certain research work commercially and politically, and for exploring the general public’s sentiment.

Altmetric provides free data under certain restrictions. It also offers a paid platform, the Explorer for Institutions (EFI), which allows the user to browse data by author, group, or department. Since July 2020, UM has had a license for the Altmetric Explorer, allowing us to make Altmetric data available on the ‘Social attention’ tab in the Dashboard. UM employees can access the Explorer using their institutional e-mail address and password.

When using Altmetric data for evaluation purposes, be cautious: quantitative indicators should support qualitative, expert assessment and should not be used in isolation (see: Social media metrics for new research evaluation [PDF] and The Leiden Manifesto for research metrics).

For any guidance on using Altmetric responsibly, please get in touch with the Research Intelligence Team via research-i@maastrichtuniversity.nl.

Examples of faculty and research group questions

  • How can non-bibliometric sources, such as news and policy documents be studied?
  • Which publications address topics relevant to one or more of the UN’s Sustainable Development Goals (SDGs)? 
  • What is the position of the research work relative to other work in the subject area? 
  • Which scholars are potential collaborators (inside/outside UM)? 
  • Which journals are suitable to publish certain research in? 
  • How strong/productive is the collaboration with other research units? 
  • What academic and/or societal impact did the research have? 
  • How can potential reviewers for a paper/proposal be recommended? 
  • What are the frequently investigated topics by a certain research unit? 
  • What were the main accomplishments in terms of research impact over a certain period of time? 
  • Which opportunities to increase impact are potentially neglected? 
  • Who are the co-authors of a certain researcher? 
  • What is the position of a certain research unit in relation to comparable research units? 
  • What is the position of our work in relation to our strategy? 
  • What could be the most strategic choice in terms of funding application, based on the previous impact of our research? 
  • Which journals do the researchers of a certain research unit usually publish in? 
  • What could be a suitable publication strategy based on the vision and mission of the research unit?   
  • Which research areas are potentially neglected? 
  • What are inspiring best practices that have gained frequent attention?   
  • Which researcher fits a certain profile based on their research outputs?   
  • How does a certain candidate compare to other candidates in terms of network and/or impact? 

Support for individual UM researchers

The Research Intelligence team provides support in responsibly answering questions about the (potential) academic and societal impact of research conducted at our university.

Support services

There can be various reasons why as a researcher you want to gain insight into the (potential) impact of your work. For example when applying for a grant, when preparing for an evaluation meeting with your supervisor/manager, or to support a narrative about the impact of your work on your profile page or personal website.

The services of the Research Intelligence team can help you to gain this insight. To find out more, have a look at the items on this webpage or get in touch with us via research-i@maastrichtuniversity.nl.

If you are affiliated with FHML, please contact Cecile Nijland of the FHML office first.

 

Subject guides & workshops

We provide more detailed information on impact-related issues through subject guides and workshops.

Have a look at these guides or workshops if you want to learn more about research impact and what you can do to improve the (potential) impact of your research work.

Subject guides

Workshops

Toolkit Societal impact 

  • As one of the results of a strategic partnership between Springer Nature and The Association of Universities in the Netherlands (VSNU), a toolkit has been created to help you understand how other researchers view societal impact and how they have been successful in creating it.
    It is filled with plenty of advice and insights from researcher interviews, as well as further reading resources to help you find out more about societal impact and how to create it for your own research.

    Follow this link to the toolkit.

Alternative metrics

Altmetric is one of the platforms that gather alternative metrics, or ‘altmetrics’ for short: metrics that aim to capture the societal impact of research output, in contrast to traditional metrics such as citation counts.

As altmetrics are volatile and never fully captured by any single database, we recommend not focusing on the numbers themselves. Altmetric data can, however, be a valuable source for identifying who is building on certain research work commercially and politically, and for exploring the general public’s sentiment.

Altmetric provides free data under certain restrictions. It also offers a paid platform, the Explorer for Institutions (EFI), which allows the user to browse data by author, group, or department. Since July 2020, UM has had a license for the Altmetric Explorer, which UM employees can access using their institutional e-mail address and password.

When using Altmetric data for evaluation purposes, be cautious: quantitative indicators should support qualitative, expert assessment and should not be used in isolation (see: Social media metrics for new research evaluation [PDF] and The Leiden Manifesto for research metrics).

For any guidance on using Altmetric responsibly, please get in touch with the Research Intelligence Team via research-i@maastrichtuniversity.nl.

Examples of researcher questions

  • How can non-bibliometric sources, such as news and policy documents be studied? 
  • Which publications address topics relevant to one or more of the UN’s Sustainable Development Goals (SDGs)? 
  • What is the position of the research work relative to other work in the subject area? 
  • Which scholars are potential collaborators (inside/outside UM)? 
  • Which journals are suitable to publish certain research in? 
  • How strong/productive is the collaboration with other research units? 
  • What academic and/or societal impact did the research have?
  • How can potential reviewers for a paper/proposal be recommended?

More information

Research intelligence and policy advice

For research intelligence-related topics not covered on this page, please contact the library’s Research Intelligence team via research-i@maastrichtuniversity.nl or use the contact form below.

Subscribe to our updates

Do you want to know more about research support by the library? If so, please sign up for one or more of our periodic updates.

Contact an expert

Please use the contact form below to contact our experts or to comment on this page.

Questions and comments
