One of the prerequisites for being a scientist seems to be an aptitude for lists — lists of chemical formulae, reaction pathways, mathematical relations, and so on. This organizing process is also applied to the scientific enterprise itself.

US News produces an annual league table of universities for prospective students, staff, and other interested parties. The Scientist publishes its own list of top institutions worldwide based on a different set of criteria — as voted for by postdocs. Not surprisingly, the best places to work do not always equate to the most academically highly rated.

The UK takes the ranking process to the extreme. The infamous Research Assessment Exercise rates the research output of every department in an institution with a single number. A similar exercise has also been undertaken for teaching. The results are pored over, by academics and the general public alike, with a mixture of schadenfreude and indignation. National newspapers produce league tables of the ‘best’ and ‘worst’ performers, with much made of rises up (and falls down) the table. Much depends upon these rankings, most notably funding, and many have voiced concerns that the system is unfair and unable to reflect the complexity of the research enterprise. A recent international committee reporting on the state of materials research in the UK was “appalled” that a single number could be used to describe a university department's efforts. And yet, despite such protests, the exercise seems likely to continue.

And it's not just universities that are ranked. A similar auditing process is applied to academic journals — the validity and importance of which is the subject of an ongoing open debate in Nature [see Lawrence, P. A., Nature (2003) 422, 259 and subsequent correspondence]. The annual publication of impact factors, calculated by the Institute for Scientific Information (ISI), is awaited with trepidation and excitement by academics and publishers alike. So it should come as no surprise that there is a new list to add to the list of lists.

The ISI's list of most highly cited authors now has a category dedicated to materials science. What better accolade than being one of the most cited of your peers? And, of course, it's good news for the profile of a university to have a highly cited faculty member. Pennsylvania State University is particularly delighted with the latest list because it has the greatest number of highly cited researchers — twice as many as its nearest rival. The list is also a startling demonstration of the dominance of the US, which boasts 142 of the most highly cited materials scientists, compared with its closest rivals, the UK (15), Japan (13), and Germany (12).

It is human nature to be competitive and to compare our efforts with those of others. The whole academic venture is highly competitive, but it does seem particularly prone to quantification. Has the obsession with lists gone too far? Answers rating the importance of rankings on a scale of one to ten on a postcard please…


DOI: 10.1016/S1369-7021(03)00765-X