Academic Journal Rankings Explained
Why are academic journal rankings important? Well, consulting them is one of the first steps an author takes when deciding which journal to publish their research in.
Academia has a reputational aspect. This means that authors want to publish their work in the most well-known or highly regarded journals. However, many authors find it difficult to know which journals publish quality research, and which do not.
As a response to this, various independent platforms and indexing databases have created academic journal rankings that compare journal statistics. These statistics include:
- Citation metrics
- Relative standing of a journal in subject areas
- General community opinion on the quality of a journal’s publications
Many journal rankings have become highly influential within publishing. They are respected by independent researchers, funding bodies, academic institutions, and the public.
Benefits and Criticisms of Journal Ranking Lists
Journal ranking lists often provide a substantial benefit for authors. They’re trusted platforms that researchers can use to find quality journals.
They allow authors to easily assess information about the quality, impact, and reputation of journals. Authors then have more criteria to use to decide whether or not to submit their research to a specific journal.
Rankings also encourage publishers to improve their editorial practices and their impact in the scientific community. If publishers want to increase a journal’s ranking, they need to focus much more on improving the quality of its publications, and to be proactive in promoting the journal and its overall reputation.
Therefore, ranking journals can often provide a win-win situation for both authors and publishers, producing mutual benefits through this system of “rewarding” journals for their high impact.
However, these ranking lists can also sometimes be contentious. One of the main criticisms of these lists is that exceptional research can be found in many journals, regardless of overall ranking.
Additionally, some lists have been found to be subjective, heavily influenced by the personal or institutional opinions of a small group of evaluators. This can lead to biases not only towards specific journals, but also towards certain publishers or entire publishing models.
Furthermore, these ranking lists often create cycles of exclusion. They strongly favour older and more established journals. When a journal is well-ranked, more authors will choose to publish with that journal. Consequently, more researchers will read and cite their publications. This then leads to an increase in the overall ranking of the journal. This cycle can prevent younger, less-established journals from becoming well-known and well-cited, even though the quality of publications may be comparable.
The Different Types of Journal Rankings
There are a range of different academic ranking lists, but they can be broadly divided into two main categories:
Governmental or institutional ranking lists
These are created by governmental bodies within specific countries. They are often produced with the goal of steering authors from that country, and their respective funders, towards journals of a certain rank, whether by strong recommendation or by explicit restriction. Some examples of governmental ranking lists are:
- Italian National Agency for Evaluation of the University and Research System (ANVUR)
- Norwegian Register for Scientific Journals, Series and Publishers
- Finnish Publication Forum (JUFO)
Indexing ranking lists
Created by indexing companies or related platforms, these lists are often produced using objective calculations to give each journal a score, primarily based on the journal’s average citation metrics.
These calculated metrics can often be highly detailed, with a wide range of data used to compare journals.
Some examples of indexing ranking lists and their scores are:
- Journal Citation Reports (Web of Science) which produces the Impact Factor
- SCImago Journal & Country Rank which produces the SJR
- CiteScore metrics (Scopus)
Most Important Journal Ranking Lists
Each author or institution values ranking lists differently. However, the two biggest and most well-known are the indexing ranking lists by Web of Science and Scopus. They produce the famous citation scores known as Impact Factors and CiteScores, respectively.
Web of Science
Clarivate, an American analytics company, runs the Web of Science indexing database, the second-largest in the world. Each year, it produces an academic journal ranking list called the Journal Citation Reports (JCR). All journals indexed in Web of Science are ranked according to numerous metrics, including the following:
| Metric | Definition |
| --- | --- |
| Impact Factor | Journal-level citation metric indicating the average number of citations per paper in a journal over the past two years (given to all journals indexed in the Web of Science Core Collection, as of June 2023). |
| 5-Year Impact Factor | Average number of times that research published in a specific journal over the past five years has been cited in the JCR year. |
| Journal Citation Indicator | Average Category Normalised Citation Impact (CNCI) of citable items (articles and reviews) published by a journal over the past three years. |
| Total Citations | Total number of times that a journal has been cited by all journals included in the database that year. |
| Cited Half-life | Median age of the items in a journal that were cited that year. |
| Total Citable Items | Total number of articles and reviews published by a journal in the past two years. |
| Eigenfactor Score | Density of the citation network around the journal, based on five years of cited content as cited in the JCR year. |
| Normalised Eigenfactor | The Eigenfactor Score rescaled by the total number of journals in the JCR that year, so that the average journal has a score of 1. |
| Article Influence Score | The Eigenfactor Score normalised according to the cumulative size of the journal’s published items over the past five years. |
| Immediacy Index | Number of citations received in a year to content the journal published in that same year. |
* Definitions from Journal Citation Reports (Clarivate).
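To make the arithmetic behind the headline metric concrete, here is a minimal Python sketch of the two-year Impact Factor calculation. All numbers are invented for a hypothetical journal; the real figures come from Clarivate’s curated citation data, so this illustrates only the formula, not the data collection.

```python
# Two-year Impact Factor: citations received in the JCR year to items
# published in the two preceding years, divided by the number of
# citable items (articles and reviews) published in those two years.
# All numbers below are invented for a hypothetical journal.

citations_to_previous_two_years = 1200  # citations in 2023 to 2021-2022 items
citable_items_previous_two_years = 400  # articles + reviews published 2021-2022

impact_factor = citations_to_previous_two_years / citable_items_previous_two_years
print(f"Impact Factor: {impact_factor:.1f}")  # -> Impact Factor: 3.0
```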
Scopus
Scopus is the largest indexing database in the world and is run by the publisher Elsevier. All journals indexed in Scopus are ranked according to a few metrics, including the following:
| Metric | Definition |
| --- | --- |
| CiteScore | Journal-level citation metric indicating the average number of citations per paper in a journal, calculated over a four-year window (the methodology Scopus has used since 2020). |
| Source Normalised Impact per Paper (SNIP) | Actual number of citations received relative to the number of citations expected for the journal’s subject field. |
| SCImago Journal Rank (SJR) | Measure of the weighted citations received by the journal, where weights depend on the subject field and prestige of the citing journal (this metric is created by SCImago, but it is also displayed in Scopus’ journal ranking lists). |
* Definitions from Scopus (Elsevier).
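For comparison, here is the same kind of sketch for CiteScore, again with invented numbers. The key difference from the Impact Factor is that both the citations and the publications are counted over the same four-year window.

```python
# CiteScore: citations received in years Y-3..Y to documents published
# in years Y-3..Y, divided by the number of documents published in that
# same window. All numbers below are invented for a hypothetical journal.

citations_in_four_year_window = 2480  # citations in Y-3..Y to Y-3..Y documents
documents_in_four_year_window = 800   # documents published in Y-3..Y

cite_score = citations_in_four_year_window / documents_in_four_year_window
print(f"CiteScore: {cite_score:.1f}")  # -> CiteScore: 3.1
```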
Comments

Very concise and clear. Good job.
In my humble opinion, these rankings are volatile (especially for journals in the 80th percentile and below) and, if thoroughly explored, they can produce counterintuitive journal recommendations. I noticed that the Scopus and WoS rankings sometimes do not agree with each other, and it is not easy to know what to do…
These rankings tend to oversimplify the appraisal process, as they focus substantially on citation counts (while more accurate assessments of how to correctly classify journals based on their actual expertise are rarely forthcoming). Moreover, citation numbers can be subject to fads, even within the same field of study.
This is why many researchers are gradually paying more attention to what national authorities recommend rather than to what citation-counting websites say. Sadly, this is sometimes the only remedy against “bibliometric” biases and the only way to foster innovative research, as we cannot tolerate peer-reviewed articles being evaluated without any consideration of their scientific soundness (or without an understanding of their relevance to the topic). We may well find high-quality papers in less-established journals and rubbish studies in well-reputed journals. I am glad that all publishers are now scrutinized more carefully than before, and MDPI is currently becoming one of the largest publishers globally (and I think it will lead the way).
Hello,
Thank you for taking the time to read our article and leaving a comment; it is much appreciated.
All the best.
This has helped me a lot. Thank you.
Please share the list of countries that use governmental or institutional ranking lists, other than those you have mentioned here:
- Italian National Agency for Evaluation of the University and Research System (ANVUR)
- Norwegian Register for Scientific Journals, Series and Publishers
- Finnish Publication Forum (JUFO)
Also, please provide a list of countries that follow indexing ranking lists…
Thanks
Hi Abdul, thank you for the comment.
It’s not possible to list all of the ranking lists from around the world. There are simply too many. You can contact your institution for details of lists specific to your country.
Chemistry journals??
Hi Shabir, thank you for getting in touch.
The latest Impact Factors can be found here: https://www.mdpi.com/about/announcements/4095
You can see what category they are ranked in by looking in the right-hand column. We have many journals that are ranked in the Chemistry category.
Worth sharing 👍🏻👍🏻
Thank you for reading and sharing, Muhammad!
Thanks a lot. Keep up the good work. I have been publishing with MDPI for the past two years, and I will continue as long as their impact continues to increase.
Hi Flavien. Thank you for being an MDPI author!
Can you share the latest list (2022)?
Hi aakashsaxena24. Thank you for the comment.
Here’s a list of this year’s Impact Factors: https://www.mdpi.com/about/announcements/4095
Please share the list of new JCR
Hi Shabir. Thank you for the comment.
The 2021 Impact Factors were released yesterday. Here’s a full list: https://www.mdpi.com/about/announcements/4095
Thanks for the briefing
Thanks for reading!
This is very informative. Concise and clear. Thank you.
Hi Rob. Thank you for reading the MDPI Blog. Glad you found this one useful.