Research impact

Libraries experts help researchers measure, maximize, and understand the impact of their work.

About research impact

Subject librarians help researchers

  • find and calculate individual, group, journal, and external impact measures,
  • get guidance on the strengths and weaknesses of the data,
  • gain context for using the tools, and
  • build an understanding of what the numbers actually report.

The most common measures are based on citations, because citations are relatively easy to gather, provide objective data, and may indicate a publication's contribution to further research. Deeper investigation, however, reveals complexities in how these measures are calculated and the difficulty of comparing across disciplines that have different research and publication practices.

The ramifications of research may be diverse, wide-ranging, and long-term, and therefore intrinsically hard to measure. The current, competing measures of research impact highlighted in these pages are understandably imperfect. Care should be taken in understanding their merits and limitations.


Individual impact measures

How does a researcher gauge the impact of their research? How do administrators objectively evaluate the performance of a researcher? Both questions are difficult to answer, but there are numeric measures that offer a rough snapshot of impact.

Citation count

The number of citations to articles and books provides a measurable indicator of research impact. It suggests how much a work is being used to advance the research of others.

Note: raw citation counts vary depending on the data source, and it is hard to compare researchers in different fields and at different career stages. Examples of databases with citation counts are:

  • Web of Science, a large interdisciplinary citation database,
  • Scopus, which covers a different (though overlapping) set of publications than Web of Science and so provides another view of impact, and
  • Google Scholar, a free database, which counts citations from both scholarly and non-scholarly sources.

Some subject-specific databases now include citation counts, including SciFinder Scholar for chemistry and MathSciNet for mathematics. Check with your subject librarian to find similar databases in your field. 

Article download count

Some platforms provide download counts for individual publications.

Examples of platforms that provide download counts include

Because platforms apply non-uniform criteria, there is no standard way to aggregate counts from different systems.

h-index

An h-index measures productivity as well as impact: an author’s h-index is the largest number h such that h of their publications have each been cited at least h times. For example, an h-index of 5 means that five of a scholar’s publications have each been cited by others at least five times.

While more sophisticated than plain citation counts, the h-index shares the limitations of incomparability across fields and across career stages.
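
As a concrete illustration, the cutoff rule can be sketched in a few lines of Python; the citation counts below are hypothetical:

    def h_index(citations):
        """Return the largest h such that h papers have at least h citations each."""
        counts = sorted(citations, reverse=True)
        h = 0
        for rank, cites in enumerate(counts, start=1):
            if cites >= rank:
                h = rank
            else:
                break
        return h

    # Hypothetical citation counts for one author's papers
    print(h_index([25, 8, 5, 3, 3, 1]))  # 3: three papers each have at least 3 citations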

An individual’s h-index may be found in:

Groups and departments

Groups, whether a research group, department, or institution, may wish to gauge the impact of their research or learn how it compares to that of their peers. As with individual impact measures, these numbers tell only part of the story.

Publication activity and citation count

Citation counts can serve as a simple measure of activity and impact for a group or department. 

Search the names of all individuals in the group by combining their names with the OR search operator (see the example query after this list). Raw citation counts will vary depending on the data source, since each reports the impact of only those items indexed in its database. Numbers of articles or citation counts can be found in:

  • Scopus: Scopus indexes more than 50,000 publications. Individual researchers can find their profiles using the "Author Search" feature. Scopus also allows you to remove self-citations.
  • Web of Knowledge: Web of Science indexes more than 20,000 journals, plus books and conference proceedings. Despite its name, it indexes arts and humanities materials from 1975 to the present. Web of Science allows you to limit by institution to reduce the number of false matches, and allows you to remove self-citations from citation counts.
    • The Libraries subscribe to Thomson Reuters’ InCites Essential Science Indicators, which can be used to identify emerging trends in scientific research. It is based on Web of Science citation counts and provides the ability to identify highly cited researchers, journals, papers, and countries in 22 fields of research.
  • Google Scholar is not recommended to calculate these impact measures because the search cannot be limited by institution. It is, therefore, likely that the count would include false matches.
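
For example, a combined author search in Web of Science's advanced search might look like the query below. The names are hypothetical, and field tags can change over time, so check the database's help pages for the syntax it currently supports:

    AU=("Smith, J*" OR "Jones, A*" OR "Lee, K*") AND OG=(University of Minnesota)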

Group h-index

This measure applies the same logic as the individual h-index but pools the publications of every member of the group: the group h-index is the largest number h such that h of the pooled publications have each been cited at least h times (see the sketch after this list).

Like the individual h-index, it cannot be meaningfully compared across fields or career stages. A group h-index can be calculated using:

  • Scopus: Data can be retrieved by searching for all individuals in a group separated by the OR search operator. Total publications and citations will be available, but h-index for the group will not automatically be calculated. 
  • Web of Knowledge: Calculates the h-index using only citations indexed in Web of Knowledge. A similar process to calculating an h-index for an individual, but instead of searching just one name, include all the individuals in your group separated by the OR search operator.
  • Google Scholar is not recommended to calculate these impact measures because the search cannot be limited by institution. It is, therefore, likely that the count would include false matches.
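
Reusing the h_index function sketched earlier, a group's score pools the members' citation lists before applying the same cutoff; the counts below are hypothetical:

    # Hypothetical citation counts for three group members' papers
    member_a = [25, 8, 5, 3]
    member_b = [12, 7, 2]
    member_c = [9, 4, 4, 1]
    print(h_index(member_a + member_b + member_c))  # 5: at least five pooled papers have 5+ citations each
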
Institutional rankings

A number of organizations publish rankings of institutions and research programs. Because each organization builds its own ranking model for its own purposes, using different metrics and comparisons, some ambiguity is unavoidable.

As each ranking model incorporates different variables and weights them differently, no single ranking methodology is appropriate for all situations.

Examples of published rankings include:

US News & World Report publishes a well-known ranking of universities by discipline, but actual research measures are not part of its ranking criteria.

Berlin Principles on Ranking of Higher Education Institutions

The International Ranking Expert Group (IREG) was founded in 2004 by the UNESCO European Centre for Higher Education (UNESCO-CEPES) in Bucharest and the Institute for Higher Education Policy in Washington, DC. In 2006, IREG proposed a set of 16 principles for ranking higher education institutions (the ‘Berlin Principles’). This was followed by a manual for auditing rankings.

To date, QS World University Rankings is the only international ranking that has undergone and passed an audit based on these principles.

University of Minnesota Provost office reports

At the University of Minnesota, the Academic Affairs and Provost's office reports on a number of rankings, presenting a variety of data to help users determine the University’s standing in a number of areas.

Journal impact measures

Some journals are read and cited more than others. Measuring the impact of a journal may help authors identify where to publish.

Journal impact measures vary across disciplines and cannot be directly compared.

Journal Impact Factor

A journal’s Impact Factor divides the number of citations received in a given year to the items the journal published in the previous two years by the number of citable articles it published in those two years.
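
A worked example with hypothetical numbers makes the arithmetic concrete:

    # Hypothetical figures for a journal's 2023 Impact Factor
    citations_in_2023_to_2021_22_items = 600
    citable_items_published_2021_22 = 200
    print(citations_in_2023_to_2021_22_items / citable_items_published_2021_22)  # 3.0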

Journal Impact Factor considers only a brief period of time and can change from year to year. Impact Factor is based on Web of Knowledge data and can be viewed in either of ISI’s two data sources.

  • Web of Knowledge: Search by journal title. In an article record, click Journal Citation Reports towards the bottom of the record, under "Journal Information."
  • Journal Citation Reports: Compare journals in a subject discipline: view a journal’s Impact Factor with others in its discipline.

SCImago Journal Rank (SJR)

The SJR is a free website based on the citation data tracked in Elsevier’s Scopus database. Its ranking works like the Google PageRank algorithm: it uses not only citation counts but also the relationships among journals, so citations from highly ranked journals count for more.
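
SJR's actual formula is more involved, but the PageRank idea can be illustrated with a toy power iteration over a hypothetical journal-to-journal citation matrix:

    # Toy PageRank-style prestige scores for three hypothetical journals.
    # links[i][j] = share of journal i's outgoing citations that go to journal j.
    links = [
        [0.0, 0.5, 0.5],  # Journal A cites B and C equally
        [1.0, 0.0, 0.0],  # Journal B cites only A
        [0.5, 0.5, 0.0],  # Journal C cites A and B equally
    ]
    damping = 0.85
    scores = [1 / 3] * 3
    for _ in range(50):  # iterate until the scores stabilize
        scores = [(1 - damping) / 3
                  + damping * sum(scores[i] * links[i][j] for i in range(3))
                  for j in range(3)]
    print([round(s, 3) for s in scores])  # citations from prestigious journals weigh more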

Eigenfactor Score

A journal’s Eigenfactor Score counts the citations made to a journal over time, but gives more weight to the citations from highly ranked journals than lower ranked ones.

Eigenfactor ranks the overall impact of a journal, and not the impact of articles within that journal.

CiteScore

A journal's CiteScore is the total number of citations in a year to articles published in the three previous years, divided by the total number of articles published in those three years.

CiteScore counts only citations to scholarly articles, conference papers, and review articles; it does not consider citations from trade publications, newspapers, or books.

CiteScore is similar to the Impact Factor but uses Scopus rather than Web of Science to gather its data and three years rather than two as the publication period.
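
Following the definition above, a hypothetical example parallels the Impact Factor sketch, but with a three-year window and Scopus data:

    # Hypothetical figures for a journal's 2023 CiteScore
    citations_in_2023_to_2020_22_items = 1200
    items_published_2020_22 = 400
    print(citations_in_2023_to_2020_22_items / items_published_2020_22)  # 3.0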

Look up a journal's CiteScore and compare CiteScores between journals using Scopus's Journal Analyzer Tool.

Journal h-index

SCImago Journal Rank also calculates a journal’s h-index: the largest number h of articles in the journal that have each received at least h citations.

Altmetrics

While metrics like the h-index and journal impact factor demonstrate impact through citations, altmetrics can provide additional data points.

Altmetrics can include measures of social media activity, media coverage, and citation in policy documents or commentary. Departments and disciplines vary in whether and how they incorporate altmetrics into tenure and promotion decisions.

Check with your department for guidance on best practices.

Benefits

  • Gauges impact of research before it enters the citation cycle
  • Demonstrates impact in both the social and scholarly realm
  • Measures the impact of different types of scholarly or creative outputs (e.g., datasets or visual arts)

Drawbacks

  • Lack of agreement on what metrics, data sources, or tools are most valuable
  • Lack of consistent adoption across disciplines and institutions

Popular tracking tools

Tools are fee-based unless otherwise noted.

  • Impact Story
    • Aggregates data from a number of resources, including Mendeley, PLoS, and Scopus, into a holistic profile that allows authors to consider all of their publications in one place.
  • Plum X
    • Analyzes metrics for “any research output that is available online.” The metrics are pulled from various sources, including EBSCO, PLoS, Facebook, Figshare, Dryad, Github, institutional repositories, WorldCat, Amazon, YouTube, and more.
  • Altmetric
    • Free bookmarklet or subscription portal that presents the “quantitative measure of the attention that a scholarly article has received… derived from 3 main factors”:
      • volume of mentions (each person who mentions it only counts once)
      • source of the mention (different types of sources--e.g., tweets vs. policy documents--are weighted differently)
      • author of the mention (evidence of bias and expertise in the field are taken into account; unbiased authors are weighted differently than biased authors)
  • Scholarometer
    • Freely available browser extension for Chrome and Firefox, developed at Indiana University-Bloomington, which uses Google Scholar to provide citation analysis data, including number of articles, number of citations, and h-index. Currently in beta.