
Research Impact & Researcher Identity

Article Impact

Article Level Metrics (ALMs) quantify how individual articles are being cited, used, shared, and discussed. ALMs fall into two broad categories:

  • Traditional metrics, which measure scholarly visibility
  • Altmetrics, which measure social visibility

Traditional metrics measure the number of times an article has been cited. 

Altmetrics measure how an article is being discussed in the public or social realm. Types of measurements include:

  • Views
  • Downloads
  • Mentions

However, ALMs have their own share of limitations. Like journal-based metrics, they may be prone to manipulation and are unable to distinguish between positive and negative attention. 

More information on common ALMs and their limitations is available in the SPARC Primer and the Metrics Toolkit.

Information and definitions of metrics in this section are adapted from the Metrics Toolkit, which is licensed under CC-BY 4.0.

Traditional Article Level Metrics

Citation Count

What is it?

The simplest and most common ALM, Citation Count indicates the number of times that an article has been cited by other research outputs since it was published.

Limitations

  • Citation count may be different depending on the coverage of the database or tool you use to calculate it. 
  • This metric is influenced by many factors that don't necessarily reflect the impact of an article, including publication date and differences in publishing practices across disciplines.
  • Citation count may be inflated (either intentionally or unintentionally) by the common practice of self-citation in scholarly research.
    • Some tools, such as Scopus, allow you to exclude self-citations when calculating citation count.

Tools Available for Citation Count

  • Web of Science allows you to track and analyze citation data in the sciences, social sciences, and humanities. It is best used to measure the impact of articles in the sciences. This is a licensed product offered by the University of Toronto Libraries.
  • Scopus provides citation tracking, visualizations, and analysis tools for articles. This is a licensed product offered by the University of Toronto Libraries.
  • Google Scholar provides citation data for a wide range of research outputs, including scholarly articles, conference proceedings, and more. Create a profile on Google Scholar Citations to track the number of citations your published articles have received.
  • Publisher Websites: Many publishers now provide citation counts for articles in their journals.

Field Normalized Citation Impact

What is it?

Field Normalized Citation Impact (FNCI) shows how well an article is cited compared to similar documents in the same field of research. It is calculated as the ratio between the citations received by an article and the average number of citations received by all other similar publications within a particular database. A value greater than 1.0 means the article has been cited more than average compared to similar publications, and a value less than 1.0 means the article has been cited less than average.  
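The ratio described above can be sketched in a few lines. This is an illustrative simplification only: the function name and inputs are hypothetical, and real tools such as Scopus apply their own citation windows and definitions of "similar" publications.

```python
# Illustrative sketch of a field-normalized citation impact calculation.
# Inputs are assumed to be known: the article's citation count and the
# citation counts of comparable publications (same publication year,
# document type, and field).

def field_normalized_citation_impact(article_citations, similar_citations):
    """Ratio of an article's citations to the average citations of
    comparable publications. A value above 1.0 means the article is
    cited more than average; below 1.0, less than average."""
    average = sum(similar_citations) / len(similar_citations)
    return article_citations / average

# An article with 12 citations, where comparable publications average
# 8 citations, scores above the 1.0 baseline:
fnci = field_normalized_citation_impact(12, [4, 8, 12])
print(fnci)  # 1.5
```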

Limitations

  • This metric is calculated based on citation count, and is therefore subject to many of the same limitations.
  • Different tools use different methods for calculating FNCI and therefore articles can only be compared based on values provided by the same tool.
  • Classifications of subject area or discipline are not always accurate, and this can therefore affect the accuracy of the ranking or percentile. 

Tools Available

  • Scopus calculates a Field Weighted Citation Impact (FWCI) for each of its articles, which is "the ratio of the document's citations to the average number of citations received by all similar documents over a three-year window." Similar documents are those with the same year of publication, document type, and discipline. This is a licensed product offered by the University of Toronto Libraries.
  • NIH iCite Database is a research impact tool from the U.S. National Institutes of Health that calculates an article's Relative Citation Ratio (RCR), “a field-normalized metric that shows the scientific influence of one or more articles relative to the average NIH-funded articles.” 

Citation Percentiles and Rankings

What is it?

The position of an article relative to other articles in the same discipline, country, and/or time period, based on the number of citations they have received. It is often expressed as a percentile or a “Highly Cited” label based on percentile rankings.
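The percentile idea can be illustrated with a minimal sketch. The function name and the tie-handling rule here are assumptions for illustration; real tools such as Scopus and Essential Science Indicators use their own binning and tie-breaking rules.

```python
# Hedged sketch: a citation percentile for one article among articles
# from the same field and publication year. This simply counts the
# share of comparison articles with strictly fewer citations.

def citation_percentile(article_citations, field_citations):
    """Percentage of articles in the comparison set that have fewer
    citations than this article."""
    below = sum(1 for c in field_citations if c < article_citations)
    return 100.0 * below / len(field_citations)

field_counts = [0, 1, 2, 3, 5, 8, 13, 21, 34, 55]
print(citation_percentile(21, field_counts))  # 70.0 — more cited than 7 of 10
```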

Limitations

  • This metric is calculated based on citation count, and is therefore subject to many of the same limitations.
  • Different tools use different methods for calculating citation percentile or rankings. Articles can only be compared based on values provided by the same tool.
  • Like with Field Normalized Citation Impact, classifications of subject area or discipline are not always accurate and this can affect the accuracy of the ranking or percentile. 

Tools Available

  • Scopus includes citation percentile as part of their Article Metrics
    • Under Citation Benchmarking, view the article's citation percentile within each of the fields of research that it has been classified under.
  • SciVal is a benchmarking tool that allows you to measure the research impact of a pre-defined publication set, such as articles published by a particular author, research group, department or institution. Outputs in Top Percentiles demonstrates the number of articles from the publication set that are the among the most-cited publications in Scopus.
  • InCites Essential Science Indicators allows you to view articles in the top citation percentiles of a research field. Papers may receive two designations, which also appear as icons next to the article in Web of Science search results:
    • Highly Cited Papers are in the top 1% compared to other papers in the same field and publication year
    • Hot Papers are articles published in the last two years that have been cited the most in the most recent two-month period (top 0.1% compared to articles in the same field and publication year).

Altmetrics

Many scholars believe traditional metrics do not give the whole picture of research impact, especially in fields outside the sciences. Altmetrics use a range of measurements to show research impact. They measure both impact on a field or discipline and impact on society. They "expand our view of what impact looks like, but also of what’s making the impact. This matters because expressions of scholarship are becoming more diverse" (from http://altmetrics.org/manifesto/). 

Altmetrics can be useful for early-career researchers or new publications that need time to gain citation counts. They also account for other types of publications, such as datasets, code, or blogging. 

This list is not exhaustive, and many more altmetrics resources can be found on the Metrics Toolkit website. Two of the most widely used services are described below.

Altmetric Attention Score and Donut

What is it?

Altmetric tracks mentions of articles and other research outputs such as datasets on social media, news outlets, and bookmarking services. These metrics are visualized using an Altmetric Attention Score and colourful 'donut'. 

  • The Altmetric Attention Score is a weighted score of the attention a research output has received. It is based on three main factors: volume, sources, and authors.
  • The donut visualizes where online attention is coming from. Each colour in the donut represents a different source of attention.

Limitations

  • The Altmetric Attention Score does not measure the quality of the research; it indicates the level of online activity.
  • Online attention can be both positive and negative.
  • Altmetrics can be gamed, and bots and web crawlers may impact online activity data. 

Tools available

These metrics can be found in all products offered by Altmetric, including the free researcher bookmarklet, and on many journal publisher websites and repositories (such as Figshare).

PlumX Metrics

What is it?

PlumX Metrics from Plum Analytics "provide insights into the ways people interact with individual pieces of research output... in the online environment." These metrics are divided into five categories:

  • Citations
  • Usage 
  • Captures
  • Mentions
  • Social Media

Limitations

  • Usage and Captures don't always reflect the actual number of readers and users of an article. For example, someone may download, save or favourite a research output but never actually read or meaningfully engage with it. 
  • Altmetrics can be gamed, and bots and web crawlers may impact Usage and Social Media data. 
  • As with other ALMs, PlumX Metrics are unable to make distinctions of quality, and online attention can be both positive and negative.

Tools Available

PlumX Metrics and the Plum Print visualization are available on a wide range of platforms and publishers' websites, including Engineering Village, Mendeley, Scopus, and ScienceDirect.