Journal Citation Reports 2021: Identify World’s Leading Journals With Clarivate’s Journal Citation Indicator (JCI)

How difficult is it for you to select a journal? What if you were told that identifying the world’s leading journals just got easier? Sounds surreal? Not anymore!

With the purpose of enabling the research community to evaluate the world’s high-quality academic journals, Clarivate released its annual Journal Citation Reports (JCR) 2021. The JCR uses a range of indicators, descriptive data, and visualizations to analyze and present journal data. For over 50 years, academic publishers, institutions, funders, and researchers have relied on these reports to evaluate the impact of journals relative to their field and to promote them to the research community. Living up to that standard, this year’s edition includes more than 20,000 journals from 113 countries across five continents and 254 research categories in the sciences, social sciences, and arts and humanities.

Unveiling of a New Metric

For several years, Clarivate Analytics’ Journal Impact Factor (JIF) has been criticized as a metric. The JIF reports average citations per article, and its methodological flaws invite misleading comparisons of journals and researchers. To introduce a more reliable system, the company unveiled the Journal Citation Indicator (JCI), a single journal-level metric that can be easily interpreted and compared across disciplines. The JCI improves on some of these flaws by allowing more accurate comparisons of journals in different fields.

According to Clarivate, the JCI accounts for the substantially different rates of publication and citation in different fields. However, it has not been warmly received by critics, who argue that the new metric remains vulnerable to misunderstanding and misuse.

The JCI averages citations gathered by a journal over three years of publications, compared with just two years for the impact factor. Furthermore, it includes journals not covered by the impact factor, such as some in the arts and humanities, regional journals, and journals from “emerging” scientific fields.
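To make the mechanics concrete, here is a minimal Python sketch of how a category-normalized, three-year metric of this kind can be computed. All data, category names, and baseline values below are hypothetical, and Clarivate’s actual normalization also accounts for document type and publication year, so treat this as an illustration of the idea rather than the exact JCI formula.

```python
# Minimal sketch of a category-normalized, three-year journal metric
# in the spirit of the JCI. All numbers below are hypothetical.

# Hypothetical baseline: mean citations per paper in each subject
# category over the three-year publication window.
category_baseline = {"Oncology": 12.0, "Mathematics": 3.0}

# Hypothetical papers published by one journal in the window:
# (citations received, subject category)
papers = [
    (24, "Oncology"),    # cited twice the category average
    (6, "Oncology"),     # cited half the category average
    (3, "Mathematics"),  # cited exactly the category average
]

# Each paper's normalized impact = its citations divided by the
# expected citations for its category; the journal-level score is
# the mean across all of the journal's papers.
normalized = [cites / category_baseline[cat] for cites, cat in papers]
jci_like_score = sum(normalized) / len(normalized)

print(f"JCI-like score: {jci_like_score:.2f}")  # 1.00 would be world average
```

Because every paper is compared against its own category baseline, an oncology journal and a mathematics journal can be placed on the same scale even though their raw citation rates differ enormously.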

Journal Citation Indicator (JCI) vs Journal Impact Factor (JIF)

While the JIF has been around for several years, the JCI is said to be “one step ahead” of it in evaluating journal impact. Unlike the JIF, which provides a snapshot of a journal’s performance over the previous two years, the JCI is based on a journal’s citation performance across three full years of citation data.

Additionally, Clarivate will provide a JCI score for all journals in its Web of Science Core Collection, including those that do not currently receive a JIF score.

The JCI also avoids the persistent numerator-denominator flaw of the JIF, in which all citations to a journal are counted in the numerator but only “citable items” are counted in the denominator. The JCI focuses entirely on articles and reviews on both sides of the calculation.
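A small worked example shows why this asymmetry inflates the JIF. The figures below are invented purely for illustration.

```python
# Hypothetical journal output over the JIF's two-year window.
# All numbers are invented for illustration.
citations_to_articles = 400      # citations to "citable items" (articles, reviews)
citations_to_front_matter = 100  # citations to editorials, letters, news, etc.
citable_items = 100              # only these enter the denominator

# JIF-style calculation: ALL citations go in the numerator,
# but only citable items are counted in the denominator.
jif_style = (citations_to_articles + citations_to_front_matter) / citable_items

# A symmetric calculation that counts the same item types on both
# sides, as the JCI does by restricting itself to articles and reviews.
symmetric = citations_to_articles / citable_items

print(jif_style)   # 5.0 -- inflated by front-matter citations
print(symmetric)   # 4.0
```

In this toy case the mismatch adds a full point to the score, which is exactly the kind of distortion the JCI’s restriction to articles and reviews is meant to remove.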

The JCI is easy to use and interpret: a score of 1.0 represents average performance. This means that a journal with a JCI of 3.5 has performed three and a half times better than average, whereas a journal with a JCI of 0.5 has performed only half as well as the average.
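Since the score is normalized so that 1.0 is the world average, translating a JCI value into plain English is a one-line calculation. The helper below is purely illustrative and not part of any Clarivate tooling.

```python
def describe_jci(score: float) -> str:
    """Describe a JCI score relative to the normalized average of 1.0."""
    return f"{score:.1f}x the average citation impact for the category"

print(describe_jci(3.5))  # 3.5x the average citation impact for the category
print(describe_jci(0.5))  # 0.5x the average citation impact for the category
```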

Drawbacks of Journal Citation Indicator (JCI)

The JCI system relies on the subject classification assigned to a journal on its first appearance in the Web of Science database; Clarivate currently uses 235 subject categories. While multi-field classification makes sense for journals that span disciplines, it will create confusion when JCI scores are first released. The only logical way to avoid this confusion for multi-disciplinary journals would be to create multiple JCI scores for them. However, doing so would contradict Clarivate’s goal of creating a single JCI score for each journal.

Another pitfall of the JCI is that a journal editor or publisher who wants to validate a JCI score would need a complete three-year citation record for every paper in every journal within a discipline, as well as for every paper from fields in which the journal is cross-listed. Although the raw citation records are transparent, the full dataset and methodology needed to recreate the metric are essentially off limits for most users. This problem of transparency and replicability affects not only the JCI but also metrics such as SNIP, the Eigenfactor, the Normalized Eigenfactor, and the Article Influence Score.

Clarivate Analytics Suppresses 10 Journals This Year

To support objectivity in journal selection and the integrity of the reports, Clarivate has suppressed 10 journals from the JCR this year, representing 0.05% of the journals listed. These journals were suppressed on the basis of Clarivate’s monitoring for atypical citation behavior, including evidence of excessive journal self-citation and citation stacking. The methodology and parameters governing the effect of journal self-citation on JCR metrics were updated in 2020 to better account for disciplinary norms. Note, however, that suppression from the JCR does not equate to de-listing from the Web of Science Core Collection.

Furthermore, an Editorial Expression of Concern has been issued for 11 journals. These journals have one or more published items with an atypically high contribution to the JIF numerator, or show a pattern of journal citations disproportionately concentrated in the JIF numerator. Clarivate intends to continue reviewing content of this type and to develop additional screening for distortions of the JIF.

What Do the Suppressed Journals Have to Say?

Unsurprisingly, the publishers and editors of the suppressed journals have expressed their views on Clarivate’s decision.

“In partnership with our editorial community we are committed to publishing the highest quality research. At this point we are looking into the questions raised about the mentioned journals in more detail, so cannot comment further, but will ensure that all questions are addressed appropriately going forwards”, said Alison Mitchell, chief journals officer at Springer Nature, which publishes one of the suppressed journals and four of those that received an expression of concern.

Furthermore, Alan Daugherty, the editor of Arteriosclerosis, Thrombosis and Vascular Biology (ATVB), a title that received an expression of concern for the second consecutive year, said:

“As the publisher for Arteriosclerosis, Thrombosis and Vascular Biology (ATVB), the American Heart Association has continued to work with Clarivate during the past year to gain additional insight into their data regarding citations. Started in 1981 as Arteriosclerosis, ATVB is a specialized journal that gradually expanded to encompass the ever-growing fields of thrombosis and vascular biology. Given the detailed work in these specific yet related research areas, it is unsurprising that ATVB and this article have a higher than usual rate of self-citation. It’s important to note the highly technical basic science research required for this paper, while the library of peer-reviewed research is extremely limited. We appreciate Clarivate’s efforts to monitor self-citation. We are confident in our continually evolving peer-review process that includes careful evaluation of citations.”

What Does Clarivate Have to Say?

“The carefully selected and structured data within the JCR allows the research community to make better informed, more confident decisions with transparent, publisher-neutral journal intelligence. In 2021, we are proud to introduce new content, a new UI and the Journal Citation Indicator, which is designed to complement the JIF—the original and longstanding metric for journal evaluation—and other metrics currently used in the research community. These latest refinements and additions to the journal intelligence platform’s existing store of resources will further support the research community with trusted insights that can inform decisions and accelerate the pace of innovation,” said Keith Collier, Senior Vice President of Product, Science at Clarivate.

Should Journals Be Suppressed Like This?

The debate over whether journals should be suppressed has been going on for a while. In my view, however, certain journals do need to be suppressed in the interest of fairness and accuracy.

With the release of the JCI, Clarivate now evaluates journals based on the citations they gather over three years of publications. However, a challenge that persists is the creation of JCI scores for multi-disciplinary journals. The tedious process of compiling a three-year citation record for every paper in every journal is also a concern.

In my opinion, Clarivate’s new method for identifying atypical citation practices may help ensure ethical practices in the publication process. With the pandemic raising a seemingly never-ending stream of questions, it is important for every scientist and the wider community to be able to place their trust in published science. However, Clarivate must have a transparent appeal process for suppressed journals. Journals must be given a fair chance to explain the reasons behind high levels of self-citation, and a decision based on informed judgment should then be made instead of outright suppression.

What do you think Clarivate’s approach to appeals from suppressed journals should be? How should suppressed journals respond? What do you think of Clarivate’s newly released Journal Citation Indicator? Will it be a paradigm-shifting addition to the process? Let us know what you think in the comments section below!
