University league tables: flawed but not to be ignored

Domestic success is not enough for modern universities

In the 2014 UK university league tables, Durham University ranks a respectable 5th in the Complete University Guide, 6th in the Guardian University rankings and 6th in the Times Good University Guide. Yet in the QS World University Rankings, Durham ranks only 90th, beaten by sixteen other British universities, most of which rank below it in the UK tables. Why the gross inconsistency between results at the national and international levels?

Clearly, the criteria for each ranking differ. The Guardian, for example, ranks universities on nine criteria: average teaching score, NSS teaching, NSS overall, spending per student, student-to-staff ratio, career prospects, value added score, average entry tariff and NSS feedback. The QS World University Rankings are based on five criteria: academic peer review, faculty-to-student ratio, citations per faculty, recruiter review and international orientation.

This does not mean that one ranking is necessarily more credible than another. None is without fault. One could easily criticise the QS rankings, arguing that the academic peer review leaves room for self-interest among peers, that faculty-to-student ratio is not a fair substitute for teaching quality, and that by ignoring average entry requirements the QS rankings take no account of the quality of students entering the university. By contrast, one could assert that the emphasis on the results of the National Student Survey (NSS) in The Guardian's table could distort results, as it is in the interest of students to rate their university as highly as possible.

In addition, no league table has yet assessed student ability upon graduation, perhaps one of the most important indicators of the quality of a university. Although such ability is notoriously difficult to establish, the Assessment of Higher Education Learning Outcomes (AHELO) conducted by the OECD aims to do just this. The third and final volume of the report, due to be published in September 2013, aims to determine what students know and can do upon graduation at 248 institutions across 17 different countries. The OECD has emphasised, however, that the data collected will be provided only to the universities and will not allow for evaluation at a national level. A pity, as a ranking based purely upon the quality of graduates might be a more effective and useful measure of a university's excellence than any league table published thus far.

Despite their obvious flaws and inconsistencies, there is no excuse for Durham to ignore the results of the world rankings. If Durham is to remain a successful educational institution, it should be achieving high scores in all criteria, not just those relevant to the UK league tables. Students, employers and academics alike attach a huge amount of importance to world university league tables, seemingly regardless of the tables' actual worth. Students attaining top grades aspire to attend world-class universities, top employers aim to hire world-class graduates, and academics want to conduct their research at world-class institutions.

So where exactly does Durham fall down in the QS rankings, and what can it do to improve? The University actually ranked extremely highly for employer reputation, achieving a score of 98.7, which places it 23rd in the world on this criterion. It also achieved good scores in the academic reputation, international faculty and international student criteria. It fell down in two areas: it received a score of 60.4 for citations per faculty and an extremely low 48.9 for faculty-to-student ratio.

These are clearly the two areas in which the University must improve in order to climb the QS table. Yet its impressive statistics for employability and academic reputation contrast strongly with its low score for faculty-to-student ratio, which is intended as a measure of teaching quality in the QS ranking. How can an institution with such a poor rating for teaching quality possibly produce such a high standard of academic work, as well as such sought-after graduates? Inconsistencies such as these further emphasise that statistics such as faculty-to-student ratio are an extremely poor surrogate for teaching quality. However, as long as the QS rankings continue to use such a measure, the University must take heed of these statistics and improve them if it is to climb the rankings.

It may also be pertinent to mention that Durham could do more to market itself effectively. The website would be a good place to start. ‘A world top 100 university,’ reads the tagline, as if this statistic were supposed to impress. The outdated site, with its blocks of dull colour and static graphics, does little to promote what is truly a colourful and dynamic place to study. The University has a beautiful campus, and yet there are few pictures of it online. Despite the exciting range of research being carried out, little is done to publicise it. Durham rarely features in the news and lacks the advertising exploited by other universities. Some facilities are still crammed into old buildings a long way from the main university campus. Peeling paint, small rooms and crumbling décor are hardly positive indicators of a flourishing and dynamic university for prospective students. The University is simply failing to nurture its image.

Durham is clearly failing to perform on the world stage. The University currently has no problem attracting this country’s top students; it is hugely oversubscribed and has the fifth-highest average entrance tariff in the country. Nevertheless, if it continues to rank poorly in the world ratings, it will slowly but surely begin to lose the interest of British and international students, academics and investors alike. Durham needs to market itself on a more global scale, or it risks being left behind.