This post follows my attendance at a HEFCE briefing on the Annual Provider Review process and related activities, which took place on 18th January 2017 in London.
The briefing provided the latest update on HEFCE’s work and the pilot activities currently being operated or designed. The value of the event was in the opportunity to speak to the various sector bodies who were given the contracts to deliver different parts of the quality assessment arrangements (i.e. the HEA, QAA, TSEP – The Student Engagement Partnership – and the Leadership Foundation), and in the Q&A session when HEFCE staff were quizzed on aspects of the process(es).
Annual Provider Review (APR) process
The information currently available about the process can be found here. I won’t reiterate the details of the process, other than to copy the following diagram from page 24 for reference:
At the briefing some clarity was provided about the ‘analysis of data patterns and trends’ input and the assessment/judgement stage.
HEFCE produces an APR data dashboard for each institution. The data set is similar to the TEF metrics; however, there will be some ‘supplementary’ data on postgraduate taught masters students and ‘differences in degree outcomes’. HEFCE says this will add context and that it will be careful about placing any weight on it in the assessment process. The PGT data will be split by characteristics (as in the TEF) and the data set will flag at 3 rather than 2 standard deviations, a more generous threshold than the TEF’s. Flags will only be applied to negative data; this is because the process is about testing baseline standards rather than identifying excellence.
Concern was raised at the event about the transparency of HEFCE’s approach, as institutions won’t have sight of the data unless HEFCE raises concerns that prompt discussions with the institution. HEFCE says it is experiencing some technical difficulties, and in response to questioning it seemed that it would consider sharing the data set in future iterations of APR.
APR assessment and judgements
The assessment is based on analysis of the data set, other intelligence and the assurances provided by governing bodies. Institutions have an opportunity to respond to any concerns raised by HEFCE and visits to institutions will be more structured with conversations recorded.
Judgements are made by the HEFCE Quality, Accountability and Regulation Strategic Advisory Committee (QARSAC), which includes peer and student reviewers; there is no formula for how judgements are reached. HEFCE has omitted peer review from visits to institutions to avoid replicating QAA review.
Confirmation of the APR judgements will be sent to Heads of Institutions and Chairs of Governing Bodies in the spring of 2017.
Five-yearly HEFCE Assurance Review (HAR)
The five-yearly review visit is to ‘test the basis on which a governing body is able to provide assurances about the provider’s activities in this area’ (HEFCE, 2016, p. 21). The Revised Operating Model states that developments to the existing HAR process will be tested in the pilot period in 2016/17.
The update from the briefing event was that HEFCE is still working on reformulating the HAR process and the original timeline (January-July 2017) is slipping. For institutions selected for a visit between February and May the process will be the same as the existing version of HAR with the addition of conversations with governing bodies about the process for giving assurances on quality and standards.
Around 6-10 institutions are expected to pilot the new HAR method from July 2017 into the start of the next academic year. QARSAC is to determine the precise timeline for the introduction of the new HAR method in February, and a handbook will be produced in due course, perhaps by Easter.
HEFCE says that 2-3 months’ notice will be given of a visit, and institutions will have to produce an ‘assurance map’ to show what steps were taken to secure the assurances from the governing body. The HAR visit team will be augmented by HEFCE Quality staff, with peer review used at the committee stage where judgements are reached.
Overview of pilot activities during 2016/17:
The HEA was awarded the contract to lead a ‘sector-owned development process focusing on the professional development for external examiners’. This is a five-year project and the training will be voluntary (although the quality assessment document states that consideration of standards ‘should include… Confirmation of the appointment of a suitable range of external examiners, increasingly to be appointed in future from those who have undertaken training.’).
In 2016/17 pilot activities are being undertaken to test new approaches to training external examiners. All pilot activities are using a model whereby institutions train their own staff to act as external examiners. The first activities, starting today, are on blended learning/online formats and the second phase will look at face-to-face training.
In 2017/18 the training programme will be opened up to ‘early adopters’, and institutions can sign up to this. The HEA is considering the scalability of its approaches and hopes for a tipping point at which a sufficient number of institutions are using its training programme.
Further information on this and the work on the calibration of degree standards is here: https://www.heacademy.ac.uk/hefce-degree-standards
Higher Education Funding Council for England (HEFCE) (2016) Revised operating model for quality assessment [online]. Available at: http://www.hefce.ac.uk/media/HEFCE,2014/Content/Pubs/2016/201603/HEFCE2016_03.pdf [Accessed 20 January 2017]