Update on the degree algorithm project

Two blog posts for the price of one Academic Registrars’ Council (ARC) Quality group event (see also Developments in the training of external examiners): this update concerns Universities UK (UUK)’s project with GuildHE researching the algorithms used by the higher education sector to calculate degree classifications.

The project was initiated by HEFCE via the Revised operating model for quality assessment (HEFCE, 2016) on the basis of the “student interest issue in the cliff-edge effect of the current classification system for undergraduate degrees” (HEFCE, 2016, 38). The HEFCE publication suggests that it is “necessary to acknowledge and address the impact of the wide variety of classification algorithms used across the higher education system” (HEFCE, 2016, 38-39) and goes on to note the Higher Education Academy (HEA)’s findings that:

47 per cent of institutions surveyed had made changes to their degree classification algorithms over the past five years, to ‘ensure that their students were not disadvantaged compared to those in other institutions’. (HEFCE, 2016, 39)

UUK/GuildHE won the tender to deliver this project and the aims and objectives were as follows:

Aims:
  • Explain existing practice and trends in relation to the design of degree algorithms
  • Support institutional decision-making on algorithms
Objectives:
  • Look at the range of models employed by the sector
  • Assess whether there are trends that may undermine wider confidence in degree standards
  • Consider whether concerns about threshold effects at degree boundaries are influencing the types of algorithm being employed

The project is currently in the write-up phase, with analysis being undertaken of the results of a survey sent to all institutions. The final report is due to be completed next month and published in September 2017.

Outline findings

The results were still being verified at the time, but some provisional outcomes, and graphs without numerical values, were shared with the group. Some of these were as follows:

  • Around 10-15% of institutions did not model the impact on their student body of making changes to their algorithm
  • The algorithms of around 30% of institutions were influenced in some way by professional, statutory and regulatory bodies (PSRBs)
  • Most institutions do not have plans to introduce Grade Point Average (GPA), although some are considering it (this is unsurprising given the lack of an emerging consensus around one version of GPA)
  • Most institutions calculated degree outcomes using average marks, weighted by year. An illustrative list of the weightings used by institutions in England and Wales was shown and provides interesting reading (weightings are shown as 1st:2nd:3rd year of a three-year undergraduate degree programme (e.g. 0:20:80)):
[Image: UUK Degree Algorithms initial report – Q17 relative weightings]

The rows highlighted in yellow are institutions that include the first year (of a three-year degree) in the calculation (which, incidentally, is what most GPA formulas also do). These institutions appear to be in the minority in taking this approach, which stands in stark contrast to the institutions grouped together in the top row, where the calculation is based solely on the final year. The most common weightings are those that use the 2nd and 3rd years in the calculation, with a higher weighting given to the final year, e.g. 40:60 (so-called ‘exit velocity’ – Ed.).
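To make the arithmetic behind these weightings concrete, here is a minimal sketch in Python of how a year-weighted average might be computed and mapped onto classification bands. The marks, the 0:40:60 weighting and the boundary values are invented for illustration; real institutional algorithms add rounding conventions, borderline zones and module discounting that this deliberately ignores.

    # Minimal illustration of a year-weighted degree average.
    # The marks, the 0:40:60 weighting and the classification
    # boundaries are hypothetical; institutional rules differ.

    YEAR_WEIGHTS = {1: 0.0, 2: 0.4, 3: 0.6}   # e.g. 0:20:80 and 20:40:40 also appear

    def weighted_average(year_means):
        """Combine per-year mean marks using the year weightings."""
        return sum(YEAR_WEIGHTS[y] * m for y, m in year_means.items())

    def classify(avg):
        """Map an overall average onto illustrative classification bands."""
        if avg >= 70:
            return "First"
        if avg >= 60:
            return "Upper second (2:1)"
        if avg >= 50:
            return "Lower second (2:2)"
        if avg >= 40:
            return "Third"
        return "Fail"

    marks = {1: 55.0, 2: 50.0, 3: 75.0}
    avg = weighted_average(marks)
    print(f"Weighted average {avg:.1f} -> {classify(avg)}")
    # With 0:40:60 this student averages 65.0 (a 2:1); under a 0:20:80
    # weighting the same marks give 70.0 (a First) - which is exactly how
    # the choice of weighting interacts with the cliff-edge at boundaries.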

Initial ‘key findings’

These were the key findings identified in the UUK presentation:
  • The impact of changes to award algorithms might have been overstated
  • Changes to award algorithms are less frequent than thought
  • Award regulations are reviewed frequently
  • More than a fifth of institutions made no changes to their award algorithm at the last review
  • A number of institutions (14) made changes in response to comparator or competitor practice

My take

The report will be newsworthy when it is published in September. However, the sector may view it as advisory, and I doubt whether it will have any profound effect on institutions’ decision-making when it comes to award regulations and algorithms. This is partly because the output will be framed as guidance, but also because, based on my impression of the initial findings presented, the qualitative research may have produced some imprecise results due to the way the questions were interpreted. For example, the terms ‘award regulations’ and ‘award algorithm’ are used almost interchangeably, and the findings showed that although algorithms (i.e. the actual calculation) may not have been changed, other factors that feed into the algorithm were changed more frequently. (Q9, for instance, asks: ‘Why did your institution make these changes to the award algorithm? If your institution made no changes to the award regulations, please set out the reasons why.’)

The survey can be found here: https://www.surveymonkey.co.uk/r/KPNNPNN.

How does this fit with the HEA’s finding that 47% of institutions had changed their degree classification algorithms?

Although it is couched as guidance, what is not yet clear is whether one of the ‘obligations’ in the Revised operating model for quality assessment as to how institutions should consider standards issues will be followed up, namely: “Confirmation of the use of guidance on acceptable algorithms for calculating degree or grade classification boundaries where these are available, or else to confirm why they are not being followed.” (HEFCE, 2016, 40) Will, for example, HEFCE’s five-yearly assurance visits specifically challenge institutions on their use of the guidance?

And finally, looking at the algorithm calculation alone is insufficient to gain an understanding of what contributes to degree outcomes. This is because students’ results are affected by a wide range of factors, including: the number of resit and reassessment attempts available; approaches to mitigating/extenuating circumstances; the use of compensation and/or condonement; and marking schemes. It will be important for the report to recognise these factors to avoid narrowing the debate.
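To illustrate that point, here is a purely hypothetical sketch (continuing the Python example above) of how regulations can adjust module marks before the classification algorithm is ever applied. The resit cap, compensation range and pass mark below are invented for this sketch, and real rules vary considerably between institutions.

    # Hypothetical illustration of regulations applied *before* the
    # classification algorithm runs. The cap, the compensation range
    # and the pass mark are invented for this sketch.

    PASS_MARK = 40
    RESIT_CAP = 40           # many institutions cap resit marks; exact rules vary
    COMPENSATION_FLOOR = 35  # marks of 35-39 treated as a compensated pass here

    def effective_mark(mark, is_resit):
        """Return the mark that feeds into the degree algorithm."""
        if is_resit:
            return min(mark, RESIT_CAP)   # resit success capped at the pass mark
        return mark

    def module_passes(mark):
        """A module counts as passed outright or by compensation."""
        return mark >= PASS_MARK or mark >= COMPENSATION_FLOOR

    # A resit mark of 62 contributes only 40 to the average under this rule,
    # so two students with identical final marks, at institutions with
    # identical weightings, can still end up with different outcomes.
    print(effective_mark(62, is_resit=True))   # -> 40
    print(module_passes(37))                   # -> True (compensated pass)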


References:

Grove, J. (2015), ‘Half of universities have “made changes to degree algorithms”’, Times Higher Education [online]. 11 June 2015. Available at: https://www.timeshighereducation.com/news/half-universities-have-made-changes-degree-algorithms [Accessed 18 June 2017]

Bourke, T. (2016), ‘Degree standards: doing what it says on the tin’, HEFCE blog [online]. 13 April 2016. Available at: http://blog.hefce.ac.uk/2016/04/13/degree-standards-doing-what-it-says-on-the-tin/ [Accessed 18 June 2017]

Higher Education Funding Council for England (HEFCE) (2016), Revised operating model for quality assessment. Available at: http://www.hefce.ac.uk/media/HEFCE,2014/Content/Pubs/2016/201603/HEFCE2016_03.pdf [Accessed 16 June 2017]


 
