Overview of (Biblio)metric Research Indicators

Bibliometrics provide methods of extracting measurable data around publication and citation activity as one possible indicator of research quality, productivity or reach.

Citation data is often used by authors, employers, funders and publishers, and in university league tables, to evaluate the 'impact' of publications within the academic community.

It is useful for Durham authors to be aware of what metrics are available, where and how they might be used, their limitations, and the tools available to work with them. This page summarises some of the key data sources and commonly used bibliometric indicators.


Citations

A citation is a reference provided by an author to direct a reader to a published or unpublished source, or an underpinning set of data, usually to acknowledge its relevance to the topic of discussion.

Citation Counting

The number of citations an article receives is one indicator of the "academic impact" of the article, providing an indication of its popularity (or reach) in terms of how many people have read and then applied or referred to that research. A high citation count is not a direct indication of high quality, however. Read about the Limitations of metrics for further information.

Citation Searching and Citation Alerts

It is possible to track when newly published research cites a published research output you are already interested in.

This could be useful to:

  • Track developing discussions or applications of research or methodologies within your field of interest.
  • Track where and how your own published work is being cited within the academic community.

For further information, see our pages on Citation Searching.


Citation Indices

In order to monitor citations, you need as comprehensive a citation dataset as possible to make the collection, counting and analysis meaningful.

Below are four key sources of citation data:

Open Citations (via Library Discover)

The Initiative for Open Citations (I4OC) is a collaboration between scholarly publishers, researchers, and other interested parties to promote the unrestricted availability of scholarly citation data.

It recognises that in order to best enable researchers, and the wider public, to keep up with new and significant developments in any field, it is "essential to have unrestricted access to bibliographic and citation data in machine-readable form" and that citation data are "not usually freely available to access, they are often subject to inconsistent, hard-to-parse licenses, and they are usually not machine-readable".

Further information about I4OC.

Open Citation data is provided by many academic publishers and may be accessed within a few days through the Crossref REST API (which is fed into our Library Discover service).
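
Open citation counts can also be retrieved programmatically. Below is a minimal sketch in Python (assuming the requests library and using a hypothetical DOI) which queries the Crossref REST API for the number of times a work has been cited:

```python
import requests

def crossref_citation_count(doi: str) -> int:
    """Return the citation count Crossref holds for the given DOI.

    The public Crossref REST API exposes this in the
    'is-referenced-by-count' field of a work record.
    """
    response = requests.get(f"https://api.crossref.org/works/{doi}", timeout=10)
    response.raise_for_status()
    return response.json()["message"]["is-referenced-by-count"]

# Hypothetical DOI for illustration:
# print(crossref_citation_count("10.1000/xyz123"))
```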

Web of Science

Previously provided by Thomson Reuters, and now by Clarivate Analytics, the Web of Science is the original 'citation index' for published academic research. It originated with the Science Citation Index (SCI) in 1964, later followed by the 'Arts and Humanities Citation Index' and the 'Social Sciences Citation Index'.

Further information about Web of Science content coverage.

Citation data from the Web of Science is used to calculate the:

  • Journal Impact Factor (JIF)
  • Eigenfactor
  • other metrics published in the Journal Citation Reports (JCR).

Scopus

Provided by Elsevier, Scopus was launched in 2004 as a competitor to Web of Science.


Further information about Scopus content coverage.

Citation data from Scopus is used to calculate the:

  • CiteScore
  • SCImago Journal Rank (SJR)
  • Source-Normalised Impact per Paper (SNIP)

Citation data from Scopus was used in REF2014, and forms part of the calculations used by the:

  • Times Higher Education (THE) World University Rankings
  • QS World University Rankings.

Google Scholar

Unlike Web of Science and Scopus (which require subscription access), Google Scholar is a free-to-access service which provides citation data.

See our blog post on citation data in Google Scholar, and why citation counts are often higher in Google Scholar than anywhere else.

Many academics create a Google Scholar Citations profile to track citations to their own publications, or use Publish or Perish (free software) to download citation data and calculate various metrics from it.

Journal Impact Factor (JIF)

Calculated from the previous two years' worth of citation data found in the Web of Science (Clarivate Analytics) database. It gives an approximate measure of the average number of citations received in a given year by articles published in that journal over the previous two years (so a 2015 JIF is the average number of citations received in 2015 by articles published in 2013-14). Citations are not weighted, and you cannot draw conclusions from comparing journals across subject boundaries, as the JIF does not take into account differences in publication or citation culture.
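
As an arithmetical sketch (simplified: the official calculation restricts the denominator to 'citable items' such as articles and reviews):

```python
def journal_impact_factor(citations_in_year: int, items_prev_two_years: int) -> float:
    """Simplified JIF: citations received this year to articles published in
    the previous two years, divided by the number of those articles."""
    return citations_in_year / items_prev_two_years

# A journal which published 200 citable items in 2013-14, receiving
# 700 citations to them during 2015, has a 2015 JIF of 3.5:
print(journal_impact_factor(700, 200))  # 3.5
```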

Journal Impact Factor methodology

Further information: http://wokinfo.com/essays/impact-factor/

JIF Scores: Available via Web of Science (Journal Citation Reports) - Library Subscription

How do I?: View the JIF (or Eigenfactor) of a journal or journals in my research field?

Limitations on use: https://www.dur.ac.uk/library/research/evaluate/limitations/


Eigenfactor

Calculated from the previous five years of citation data as curated in the Journal Citation Reports (Web of Science, Clarivate Analytics). Citations are weighted based upon where they come from. Eigenfactor scores are scaled so that the scores of all journals listed in the JCR sum to 100, meaning a journal with an Eigenfactor score of 1.0 has 1% of the total "influence" of all indexed publications. Over 11,000 journals are ranked, with PLoS One having the highest Eigenfactor score as of 2019 (1.70677, compared to Nature's 1.28501).
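
The weighting works along the same lines as PageRank: citations from influential journals count for more. The sketch below is illustrative only (the real algorithm also excludes journal self-citations and adjusts for journal sizes); it derives eigenvector-style scores from a small, made-up journal-to-journal citation matrix and scales them to sum to 100:

```python
import numpy as np

# citations[i][j] = citations from journal i to journal j (made-up data).
citations = np.array([
    [0.0, 5.0, 2.0],
    [3.0, 0.0, 4.0],
    [1.0, 2.0, 0.0],
])
np.fill_diagonal(citations, 0.0)  # ignore journal self-citations

# Each citing journal distributes its influence across the journals it cites.
weights = citations / citations.sum(axis=1, keepdims=True)

scores = np.ones(3) / 3           # start from a uniform influence vector
for _ in range(100):              # power iteration: influence flows along citations
    scores = scores @ weights

eigenfactor_like = 100 * scores / scores.sum()  # scale so all scores sum to 100
print(eigenfactor_like)
```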

Further information: http://www.eigenfactor.org/index.php

Eigenfactor Scores: Available via Web of Science (Journal Citation Reports) - Library Subscription

How do I?: View the JIF (or Eigenfactor) of a journal or journals in my research field?

Limitations on use: https://www.dur.ac.uk/library/research/evaluate/limitations/


CiteScore

Calculated from the previous three years' worth of citation data found in the Scopus (Elsevier) database. Launched in December 2016, CiteScore is similar to the JIF, but is updated monthly as well as annually. It gives an approximate measure of the average number of citations received in a given year by articles published in that journal over the previous three years (so a 2016 CiteScore is the average number of citations received in 2016 by articles published in 2013-15). Citations are not weighted, and you cannot draw conclusions from comparing journals across subject boundaries, as CiteScore does not take into account differences in publication or citation culture.
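
The arithmetic mirrors the simplified JIF sketch above, but over a three-year window:

```python
def citescore(citations_in_year: int, documents_prev_three_years: int) -> float:
    """Simplified CiteScore: citations received this year to documents
    published in the previous three years, divided by their number."""
    return citations_in_year / documents_prev_three_years

# 900 citations in 2016 to the 400 documents published in 2013-15 gives 2.25:
print(citescore(900, 400))  # 2.25
```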

CiteScore methodology

Further information: Elsevier press release

CiteScore Rankings: Available via Scopus Journal Metrics

How do I?: View the CiteScore, SNIP or SJR of a journal or journals in my research field?

Limitations on use: https://www.dur.ac.uk/library/research/evaluate/limitations/


SCImago Journal Rank (SJR)

Calculated from the previous three years' worth of citation data found in the Scopus (Elsevier) database. Citations are weighted based upon where they come from (a journal with a higher or lower SJR), and normalised based upon the set of documents which cite the journal's papers, thus providing a 'classification free' measure for comparison.

Further information: http://www.scimagojr.com/

SJR Scores: Available via Scopus Journal Metrics

How do I?: View the CiteScore, SNIP or SJR of a journal or journals in my research field?

Limitations on use: https://www.dur.ac.uk/library/research/evaluate/limitations/


Source-Normalised Impact per Paper (SNIP)

Calculated from the previous three years of citation data found in the Scopus (Elsevier) database. A journal's 'subject field' is taken into account, normalising for subject-specific citation cultures (average number of citations, amount of indexed literature, speed of publication) to allow easier comparison of scores for journals in different subject areas.
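
In outline (a simplified sketch of the idea; CWTS publishes the full methodology), SNIP divides a journal's raw citations per paper by the 'citation potential' of its subject field, so journals in sparsely citing fields are not penalised:

```python
def snip(citations_per_paper: float, field_citation_potential: float) -> float:
    """Simplified SNIP: raw citations per paper divided by the typical
    citation density ('citation potential') of the journal's field."""
    return citations_per_paper / field_citation_potential

# Two journals each averaging 4 citations per paper, in fields whose
# typical citation potentials are 8 and 2 respectively:
print(snip(4, 8))  # 0.5 - a densely citing field
print(snip(4, 2))  # 2.0 - a sparsely citing field
```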

Further information: https://www.elsevier.com/solutions/scopus/features/metrics

SNIP Scores: Available via Scopus Journal Metrics

How do I?: View the CiteScore, SNIP or SJR of a journal or journals in my research field?

Limitations on use: https://www.dur.ac.uk/library/research/evaluate/limitations/


h-index

The Hirsch index (or h-index) was first proposed by J.E. Hirsch in 2005 as a measure of the academic productivity and impact of a researcher's publications over their career. An author's h-index will increase over time, as they publish more papers and their published papers attract more citations.

The h-index is defined as follows:

"An author has an h-index of h, if a number h of their papers have h or more citations"

Example: An author has published 22 publications. Of these publications, at least 8 have received at least 8 citations each. The author does not have 9 publications which have received at least 9 citations. Therefore, that author has an h-index of 8.
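
A minimal sketch of the calculation in Python, given one citation count per publication:

```python
def h_index(citation_counts: list[int]) -> int:
    """Largest h such that h papers each have at least h citations."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    while h < len(counts) and counts[h] >= h + 1:
        h += 1
    return h

# Eight papers have at least 8 citations each, but there are not nine with 9:
print(h_index([45, 30, 22, 15, 12, 10, 9, 8, 6, 3, 1, 0]))  # 8
```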

[Figure: h-index graph]

How do I?: Calculate my h-index using Scopus, Web of Science or Google Scholar data?

Limitations on use: https://www.dur.ac.uk/library/research/evaluate/limitations/



Alternative author metrics to the h-index

The h-index is not a useful metric for early career researchers, and there are other criticisms of its usefulness too. Some alternative metrics you might want to consider include:

  • Publication and/or citation count
  • Citation impact (Mean Citations per publication)
  • % Outputs in Top percentiles
  • % Outputs in Top Journals
  • % Outputs cited

Alternatively, there are several proposed variations on the h-index which are sometimes used or referred to.


g-index

The g-index, proposed by Leo Egghe in 2006, is similar to the h-index but aims to give more weight to highly cited papers.

The g-index is defined as follows:

"[Where a given set of articles are] ranked in decreasing order of the number of citations that they received, the g-index is the (unique) largest number such that the top g articles received (together) at least g2 citations."

Example: An author has published 22 publications. Of these, the sum of the citations of the top 12 articles (by number of citations) is at least 144 (12 squared). However, the sum of the citations of their top 13 articles is less than 169 (13 squared). Therefore their g-index is 12.
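
A minimal sketch in Python, using the same input format as the h-index example above:

```python
def g_index(citation_counts: list[int]) -> int:
    """Largest g such that the top g papers together have at least g**2 citations."""
    counts = sorted(citation_counts, reverse=True)
    total, g = 0, 0
    for count in counts:
        total += count
        if total >= (g + 1) ** 2:
            g += 1
        else:
            break
    return g

# The 12 papers below hold 161 citations in total (>= 144 = 12 squared),
# so the g-index is 12, compared with an h-index of 8 for the same list:
print(g_index([45, 30, 22, 15, 12, 10, 9, 8, 6, 3, 1, 0]))  # 12
```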


m-index

The m-index, or m-quotient, was also proposed by Hirsch in 2005. It aims to allow a fairer comparison between academics of differing career lengths.

An author's m-value is found by dividing their h-index by the number of years the author has been actively publishing (measured as the number of years since their first published paper).

Example: An author with an h-index of 18 who has been actively publishing for 6 years will have an m-index of 3. An author with an h-index of 30 who has been actively publishing for 15 years will have an m-index of 2. If the two authors are publishing in the same field of study, this may give a fairer way of comparing the impact of each author's publication output over the length of their publishing career.
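
The calculation itself is a simple division, sketched here with the figures from the example above:

```python
def m_index(h: int, years_publishing: int) -> float:
    """m-quotient: h-index divided by years since first publication."""
    return h / years_publishing

print(m_index(18, 6))   # 3.0
print(m_index(30, 15))  # 2.0
```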


Most of the metrics below can be calculated for an individual article, or for any collection of articles (e.g. all publications by a single author, research group, journal, academic department or institution). Most can be derived from data provided by Scopus or Web of Science, or from a citation analysis service such as SciVal (which uses Scopus data). See also the common uses (and limitations) of these metrics as summarised here.

Publication count

The most basic metric which can be used as a measure of productivity is the number of publications produced by an individual, or group of individuals.

A metric for Publication Count (defined as Scholarly Output) is included in the Snowball Metrics Recipe Handbook


Citation count

The total sum of citations received by an author's research outputs, or by a group of researchers' outputs.

A metric for Citation Count is included in the Snowball Metrics Recipe Handbook


Citation Impact (Mean Citations per publication)

The mean citation rate of a group of research outputs.

A metric for Citation Impact (defined as Citations per Output) is included in the Snowball Metrics Recipe Handbook


Cited publications

Either a total number of publications which have received at least 1 citation, or a percentage of total publications which have received 1 or more citations.


Field-Weighted Citation Impact (FWCI) - calculated from Scopus citation data

A comparison of the actual number of citations received by a single output, or a large group of outputs, with the number of citations they would be expected to receive, based upon the mean number of citations received by all similar publications (i.e. normalised by output type, output age and field of study). For example (the underlying ratio is sketched after this list):

  • A FWCI of 1.00 indicates that a group of outputs have been cited exactly in line with the global average for similar outputs.
  • A FWCI of 1.82 indicates that a group of outputs have been cited 82% more than the global average for similar outputs.
  • A FWCI of 0.77 indicates that a group of outputs have been cited 23% less than the global average for similar outputs.
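
The ratio itself is straightforward, sketched below with the figures from the examples above (in practice the expected value comes from Scopus's normalisation by output type, age and field):

```python
def fwci(actual_citations: float, expected_citations: float) -> float:
    """Field-Weighted Citation Impact: actual citations divided by the
    expected (type-, age- and field-normalised) citation count."""
    return actual_citations / expected_citations

print(fwci(10.0, 10.0))  # 1.00 - cited exactly in line with the global average
print(fwci(18.2, 10.0))  # 1.82 - cited 82% more than the average
print(fwci(7.7, 10.0))   # 0.77 - cited 23% less than the average
```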

The FWCI is included in the Snowball Metrics Recipe Handbook


% Outputs in Top percentiles

The % of a group of outputs which are in the global top 1/10/25% most cited outputs.

A metric for Outputs in Top Percentiles is included in the Snowball Metrics Recipe Handbook


% Outputs in Top Journals

The % of a group of outputs which are in the global top 1/5/10/25% of journals, when ranked by an identified journal metric (e.g. by JIF, CiteScore, SJR or SNIP).

A metric for Outputs in Top Journal Percentiles is included in the Snowball Metrics Recipe Handbook


Collaboration Impact metrics (based on co-authorship of outputs)

Some metrics also look at the Citation Impact of outputs within a group which have a co-author whose affiliation lies outside the parent group.

For example, this might compare the Citation Impact of the subset of articles with international co-authors (where a co-author is affiliated with an institution outside the author's institution's country) or corporate co-authors, against the Citation Impact of the whole group of articles.

Metrics looking at collaboration and academic-corporate collaboration are included in the Snowball Metrics Recipe Handbook


Altmetrics

Traditional bibliometrics focus on scholarly activity in the form of citations, which have a number of limitations in providing a full picture of the impact of a scholarly output. Altmetrics track broader, non-academic impact, including:

  • Mentions and interactions on social media (Twitter, Facebook etc.)
  • Mentions in traditional online media (newspapers, news outlets and other media)
  • Mentions on blogs, wikis and other services (including citations in Wikipedia articles, which may be curated by academic and non-academic specialists)
  • Readers on services such as Mendeley, CiteULike and Reddit.
  • Citations in policy papers, reports and other grey literature.

See our Altmetrics page for further information.


Your Academic Liaison Librarian

James Bisset

Academic Liaison Librarian
Research Support

james.bisset@durham.ac.uk

0191 334 1589


Metrics Top Tips

  1. Always use quantitative metrics together with qualitative inputs, such as expert opinion or peer review.
  2. Always use more than one quantitative metric to get the richest perspective. 
  3. If comparing entities, normalise the data to account for differences in subject area, year of publication and document type.

See our pages on Responsible Metrics for further information.