Is Research Advancement Based on Quality or Quantity?

When new undergraduate students contemplate the academic career path ahead of them, the journey can appear quite steep. Successful completion of a bachelor’s degree opens the door to a graduate or master’s program, which in turn leads to a doctoral program and a terminal degree. The academic performance expectations rise at each level, and the doctoral degree is called “terminal” because there is no higher degree available, only the opportunity for postdoctoral research. The path may be straightforward, but the amount of work involved is not, and for researchers hoping to build a lifelong career in academia, the question arises of how your performance is measured once you reach the postdoctoral level and begin applying for full-time research positions.

The Currency of Citations

The endless pursuit of better institutions with better facilities and better research opportunities is a characteristic of the career path for younger researchers. At each interview, your resume is reviewed, your past studies discussed, your references examined, and your goals and aspirations shared. The deciding factor, however, is often a single metric: how many times your work has been accepted for publication. If the hiring decision comes down to two finalists, that question is refined by where your work has been published, on the assumption that the more prestigious the journal, the more impressed the hiring manager will be. Citations then become your constant companion as you learn to monitor not only where your work has been published, but also how many times that work has been cited by your colleagues around the world.

Trading Beads and Shells

For an industry that places so much importance on research publication, surprisingly little attention is paid to the validity of that information. Researchers now monitor their ORCID accounts regularly to ensure that all relevant citations of their published works are being captured, and many can quote their h-index with the same accuracy as their credit score. However, all of these calculations and rankings rest on a single metric that is becoming frustratingly easy to manipulate.
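
For readers less familiar with the metric, the h-index is simply the largest number h for which a researcher has h papers that have each been cited at least h times. The short Python sketch below illustrates the calculation; the citation counts in the example are invented purely for illustration.

    def h_index(citation_counts):
        """Return the largest h such that the author has h papers
        with at least h citations each."""
        counts = sorted(citation_counts, reverse=True)
        h = 0
        for rank, citations in enumerate(counts, start=1):
            if citations >= rank:
                h = rank
            else:
                break
        return h

    # Five hypothetical papers: three of them have at least 3 citations each,
    # but there are not four with at least 4, so the h-index is 3.
    print(h_index([10, 8, 5, 3, 1]))  # -> 3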

Bootleg Citations

Just as recording artists and movie studios worry about the black market for bootleg copies of their work, academic publishing is being forced to pay attention to the veracity of the citations that both journals and researchers present as evidence of their academic prowess. Getting an article accepted for publication in a prestigious journal still carries weight on a professional resume, but embellishing that achievement with a high volume of subsequent citations is proving to be too much of a temptation for journal editors and researchers alike. Citation stacking among a family of journals under one publishing house can artificially inflate the volume of citations and the subsequent ranking of that family of journals. Citation rings enable a group of researchers to collude by agreeing to cite each other’s work with abnormal regularity. Occasionally, a small group of specialists in a particularly narrow field may legitimately cite one another with unusual frequency, but deliberate attempts to inflate the number of citations are clearly scientific misconduct.

As long as the number of citations remains a checklist item for research applicants, the temptation to inflate that number will persist. Verifying citations in the same way as qualifications may not be an option, but putting research quantity over research quality exposes journals and research institutions to the risk of scandal and loss of reputation if any of that published work is later retracted as a result of citation fraud.
