The major citation databases, among others, have devised metrics for journals based on the citations they receive. These metrics are used to rank the journals within a particular subject area.
Journal rankings can be an aid to distinguishing between potential publication venues, but they should never be used to judge the quality of an individual article published within a journal. The most popular metrics are calculated as averages and so can be skewed by a few highly cited articles. Many scholarly articles are never cited, even in high-impact journals, as this graph from the Scimago Journal Rankings for Nature Reviews Cancer shows.
These rankings have contributed to the notion of "prestigious journals" and led to misuse of the metrics when they are applied to decisions around job applications and promotion. As the San Francisco Declaration on Research Assessment says:
"Do not use journal-based metrics, such as Journal Impact Factors, as a surrogate measure of the quality of individual research articles, to assess an individual scientist’s contributions, or in hiring, promotion, or funding decisions”.
The Journal Impact Factor (JIF) is the longest established and best known of the journal rankings. It was originally devised to help librarians make decisions on serials purchasing, but it is now used to assess the importance of a journal in a particular field. Journals indexed by the Web of Science Group in their Science Citation Index Expanded (SCIE) and Social Sciences Citation Index (SSCI) are evaluated annually in the Journal Citation Reports.
"The Journal Impact Factor is defined as all citations to the journal in the current JCR year to items published in the previous two years, divided by the total number of scholarly items (these comprise articles, reviews, and proceedings papers) published in the journal in the previous two years. Though not a strict mathematical average, the Journal Impact Factor provides a functional approximation of the mean citation rate per citable item. A Journal Impact Factor of 1.0 means that, on average, the articles published one or two years ago have been cited one time.”
In the screenshot here, we see the calculation of the 2018 Journal Impact Factor for Annual Reviews of Plant Science. (Journal Citation Reports Science Edition, Web of Science Group, 2019)
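The calculation quoted above can be sketched with hypothetical figures (all numbers below are invented for illustration, not taken from the Journal Citation Reports):

```python
# Sketch of the Journal Impact Factor calculation, using invented numbers.
# JIF = citations in the JCR year to items from the previous two years,
#       divided by the citable items published in those two years.

citations_to_prev_two_years = 1200  # citations in the current JCR year
citable_items_year_1 = 150          # articles, reviews, proceedings papers
citable_items_year_2 = 160

jif = citations_to_prev_two_years / (citable_items_year_1 + citable_items_year_2)
print(round(jif, 3))  # 3.871
```

A JIF of roughly 3.9 here would mean that, on average, items published in the previous two years were each cited almost four times in the JCR year.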
Comparisons between journals should always be made within categories, as citation and publication behaviour can vary hugely across disciplines. In the following screenshot, you can see the ranking for journals in the Web of Science "Plant Sciences" category.
CiteScore is a journal impact metric based on Scopus citation data.
CiteScore is the number of citations received by a journal in one year to documents published in the three previous years, divided by the number of documents indexed in Scopus published in those same three years.
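That definition can be sketched with hypothetical figures (the counts below are invented for illustration):

```python
# Sketch of the CiteScore calculation, using invented numbers.
# CiteScore = citations received in one year to documents from the three
#             previous years, divided by the documents indexed in Scopus
#             over those same three years.

citations_in_year = 900
docs_prev_three_years = 100 + 120 + 140  # documents per year, summed

citescore = citations_in_year / docs_prev_three_years
print(round(citescore, 2))  # 2.5
```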
In the following screenshot, we see the ranking by 2018 CiteScore for the Forestry subject area in Scopus. (Scopus Sources, Elsevier, 2019). As with JIF, comparisons between journals should always be made within categories, as citation and publication behaviour can vary hugely across disciplines. The CiteScore Percentile helps with interpretation as it indicates the relative standing of a serial title in its subject field. The Percentile and Ranking are relative to a specific subject area.
The Source Normalized Impact per Paper (SNIP) is the ratio of a journal's average citation count per paper to the 'citation potential' of its subject field.
It measures the average citation impact of the publications of a journal and corrects for differences in citation practices between scientific fields, thereby allowing for more accurate between-field comparisons of citation impact than can be done with JIF and CiteScore. The indicator uses Scopus citation data and is devised by CWTS.
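The normalisation can be sketched as a simple ratio (both values below are invented for illustration; the actual CWTS method for estimating citation potential is more involved):

```python
# Sketch of the SNIP idea, using invented numbers.
# SNIP = (journal's average citations per paper) / (field's citation potential),
# so a SNIP above 1 means the journal is cited more than is typical for its field.

avg_citations_per_paper = 2.4   # the journal's raw impact per paper
field_citation_potential = 1.6  # expected citation rate in the subject field

snip = avg_citations_per_paper / field_citation_potential
print(round(snip, 2))  # 1.5
```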
SCImago Journal Rank (SJR) expresses the average number of weighted citations received in the selected year by the documents published in the selected journal in the three previous years.
SJR looks at the prestige of a journal, by considering the sources of citations to it, rather than its popularity as measured simply by counting all citations equally. Each citation received by a journal is assigned a weight based on the SJR of the citing journal. A citation from a journal with a high SJR value is worth more than a citation from a journal with a low SJR value. SJR rankings are based on Scopus citation data.
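The weighting idea can be sketched as follows (the journal names, weights, and counts are invented, and this is only the intuition, not the full iterative algorithm SCImago uses):

```python
# Sketch of the SJR weighting idea, using invented numbers.
# Each citation is weighted by the SJR of the citing journal, so a citation
# from a high-SJR journal counts for more than one from a low-SJR journal.

citing_journals = {
    "Journal A": {"sjr_weight": 3.0, "citations": 10},  # prestigious source
    "Journal B": {"sjr_weight": 0.5, "citations": 40},  # less prestigious source
}
docs_prev_three_years = 30  # documents the cited journal published

weighted_citations = sum(
    j["sjr_weight"] * j["citations"] for j in citing_journals.values()
)
sjr_like_score = weighted_citations / docs_prev_three_years
print(round(sjr_like_score, 2))  # 1.67
```

Note that Journal A's 10 citations contribute more weight (30.0) than Journal B's 40 citations (20.0), which is the sense in which SJR measures prestige rather than popularity.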
Google Scholar Metrics rank journals based on citations from all articles indexed in Google Scholar. The journals are categorised by subject area and ranked by their h5-index. This is the h-index for articles published in the last 5 complete years: the largest number h such that h articles published in those 5 years have received at least h citations each. For example, a journal with an h5-index of 53 for 2019 has had 53 articles published in 2014-2018 that have each received at least 53 citations.
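The h5-index can be computed directly from a list of per-article citation counts (the counts below are invented for illustration):

```python
# Sketch of the h5-index calculation, using invented citation counts.
# h5 = the largest h such that h articles from the last 5 complete years
#      have at least h citations each.

def h5_index(citation_counts):
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # the top `rank` articles all have >= `rank` citations
        else:
            break
    return h

# Five articles with 10, 8, 5, 4, 3 citations: the top 4 each have >= 4.
print(h5_index([10, 8, 5, 4, 3]))  # 4
```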
Google Scholar Metrics can be useful for subject categories not covered well by other citation databases, e.g. Humanities, Literature & Arts.
The Chartered Association of Business Schools has developed a ranking of business and management journals, the Academic Journals Guide, which is not fully derived from citations. It "is based upon peer review, editorial and expert judgements following from the evaluation of publications, and is informed by statistical information relating to citation".
Journals are assigned a star rating from 1 to 4, with an additional "Journals of Distinction" rating for the top-ranked journals. The guide is published every 3 years.