Most European campuses ‘use journal impact factor to judge staff’

Preliminary results of EUA survey suggest three-quarters of responding institutions draw on much-criticised metric

Published on October 3, 2019
Last updated October 25, 2019

POSTSCRIPT:

Print headline: Citation metrics reign supreme


Reader's comments (4)

One needs to examine the publications more carefully instead of accepting them at face value. At Curtin University in Western Australia, the Head of School in Management published his own journal to ensure that selected students would rack up enough publications to be promoted. None of it was properly "peer reviewed", fewer than 10 copies of each issue were printed, and none were ever sold; they were given to the Head of School's mates. There was a serious plagiarism problem during this Head of School's tenure, and bribery between students and staff was investigated. Be careful what you wish for!
Even for papers published in highly ranked journals, the modal number of citations per paper is 0 or 1. So if each paper is scrutinised for its own impact, the overall measured impact will drop. This indicates that judging by journal impact factor inflates apparent impact, which partly explains why researchers are resistant to other, more holistic indices.
It is truly pathetic when only just over 60% of universities say that supervision and teaching are important measures of academic success. Have they completely forgotten what "the academy" is all about? If they only care about research, they should be research institutions, not universities.
It is true that publication rate, grant income and citation rate are very imperfect measures of academic productivity. So the solution is a simple one: suggest alternative and better (more objective and more quantitative) measures to use instead. Or could it be that quantification of performance in academia is a flawed idea in itself?
