

The University League Table methodology 2011


Introduction and background to the league tables


The first university league tables were published nearly 20 years ago and have been the subject of vigorous debate among academics ever since. Given that analyses of this type within higher education and elsewhere - in schools, health and so on - have come to be seen as legitimate aids, it is perhaps surprising that many universities remain opposed to the very notion of comparing one with another, and yet that is what applicants have to do all the time. They claim in their defence that each university is unique, has a distinct mission and serves a different student community. Be that as it may, universities have been known to quote favourable league table rankings when these assist their cause. We remain convinced that comparisons are valid and helpful to students and their mentors when it comes to choosing a university.

The raw data for the League Table all come from sources in the public domain. The Higher Education Statistics Agency (HESA) provided data for entry standards, student-staff ratios, spending on academic services, facilities spending, good honours degrees, graduate prospects, completion and overseas student enrolments. HESA is the official agency for the collection, analysis and dissemination of quantitative information about the universities.


The Higher Education Funding Council for England (HEFCE), the Scottish Higher Education Funding Council (SHEFC) and the Higher Education Funding Council for Wales (HEFCW) are the funding councils whose remit is to develop policy and allocate public funds to the universities. The 2008 Research Assessment Exercise, conducted by the funding councils, provides the data for the research assessment measure used in the Table. The funding councils also have a statutory responsibility to assess the quality of learning and teaching in the UK universities they fund. In England, Wales and Northern Ireland, the funding councils oversaw the National Student Survey, a major survey of final-year students' views on the quality of the courses they were studying. We use the outcomes of this survey as a measure of student satisfaction.


In a few cases the data were not available from these sources and were obtained directly from the individual universities.


All universities were provided with complete sets of their own HESA data well in advance of publication. In addition, where anomalous figures were identified in the HESA data, institutions were given a further opportunity to check for and notify any errors. Similarly, we consulted the universities on methodology. Once a year an Advisory Group with university representatives meets to discuss the methodology and how it can be improved. Thus, every effort has been made to ensure accuracy, but no responsibility can be taken for errors or omissions. The data providers do not necessarily agree with data aggregations or manipulations we use and are also not responsible for any inferences or conclusions thereby derived.

This analysis of the results of the Research Assessment Exercise 2008 makes use of contextual data supplied under contract by the Higher Education Statistics Agency (HESA). It is a contractual condition that this statement should be published in conjunction with the analysis. HESA holds no data specifying which or how many staff have been regarded by each institution as eligible for inclusion in RAE 2008, and no data on the assignment to Units of Assessment of those eligible staff not included. Further, the data that HESA does hold is not an adequate basis on which to estimate eligible staff numbers, whether for an institution as a whole, or disaggregated by Units of Assessment, or by some broader subject-based grouping.

A particular feature of this Table is the way the various measures are combined to create a total score. All the scores have undergone a Z-transformation, a statistical method that ensures each measure contributes the same amount to the overall score and so avoids the need for scaling. (For the statistically minded, it involves subtracting the mean score from each individual score and then dividing by the standard deviation of the scores.)
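
By way of illustration, the sketch below (in Python, using invented figures rather than the real Table data) shows how such a standardisation brings measures on very different scales onto a common footing:

    # Z-transformation sketch: the measure names and numbers are illustrative only.
    from statistics import mean, stdev

    def z_scores(values):
        # Subtract the mean from each score, then divide by the standard deviation.
        m, s = mean(values), stdev(values)
        return [(v - m) / s for v in values]

    # Entry standards (tariff points) and graduate prospects (%) use very different
    # scales, but after the transformation each contributes comparably to a total.
    entry_standards = [520, 430, 390, 470, 410]
    graduate_prospects = [82.0, 74.5, 68.0, 79.0, 71.0]

    totals = [e + g for e, g in zip(z_scores(entry_standards),
                                    z_scores(graduate_prospects))]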


Another feature of the Table is that four of the measures have been adjusted to take account of the subject mix at a university. A university with a medical school, for example, will tend to admit students with a higher tariff score than one without, simply because it has a medical school. The adjustment removes this subject effect. A side-effect is that it is impossible to recalculate the total score in the Table from the published data alone, as doing so would require full access to all the raw data.
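
The precise adjustment is not reproduced here, but as an illustration of the general idea, the sketch below assumes it compares a university's raw score with the score that would be expected from its subject mix, using hypothetical national averages:

    # Subject-mix adjustment sketch: assumes the adjusted figure is the difference
    # between a raw score and the score expected from the subject mix alone.
    # All names and numbers are hypothetical.

    national_average = {"Medicine": 520, "History": 380, "Engineering": 420}

    def expected_score(subject_shares):
        # Expected entry score given the fraction of students in each subject.
        return sum(share * national_average[subject]
                   for subject, share in subject_shares.items())

    with_medical_school = {"Medicine": 0.3, "History": 0.3, "Engineering": 0.4}
    without_medical_school = {"History": 0.5, "Engineering": 0.5}

    # A high raw score driven purely by subject mix no longer inflates the ranking.
    adjusted_with = 470 - expected_score(with_medical_school)        # 470 - 438 = 32
    adjusted_without = 405 - expected_score(without_medical_school)  # 405 - 400 = 5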


Apart from noting the overall position of any one university of interest, you can home in on a particular measure that matters to you, such as entry standards or graduate prospects. But bear in mind that this composite Table says nothing about specific subjects at a university and so should be read in conjunction with the Subject Tables and University Profiles.

Full details of how to use the league tables, including:

  1. How the League Table works
  2. Student Satisfaction
  3. Research Assessment
  4. Entry Standards
  5. Student-Staff Ratio
  6. Academic Services Spending
  7. Facilities Spending
  8. Good Honours
  9. Graduate Prospects
  10. Completion
  11. Conclusions
In association with:
AGCAS      