- This important blog responding to the Office for Students’ first reports on quality assessment visits has been written for HEPI by the Vice-Chancellor of London South Bank University and Chief Executive of LSBU Group, Professor David Phoenix.
The Higher Education and Research Act (2017) is intended to be transformative. It introduces a risk-based approach to quality oversight that is data-informed and balanced with reviews where appropriate. Its ambition is to protect student interests and the reputation of the sector, whilst also allowing the number of providers to grow. It was also intended to reduce the regulatory burden on those institutions that meet threshold criteria.
At LSBU, we have been supportive of a risk-based approach, so when, in May last year, the Office for Students (OfS) said we would be one of eight providers whose business and management subject area would be reviewed, we were not too perturbed. We were confident of our offering and today’s announcement from the OfS shows this confidence was not misplaced.
We are delighted that the report says there were no concerns about the quality of our provision. It highlighted many areas of good practice, and how we meet the needs of our diverse student body.
The academic experts who made up the review team saw, first-hand, the ‘good rapport that our academics have with our students’ and the way that our teaching staff have ‘created a supportive environment for our learners’. Furthermore, the OfS review team recognised the long-term impact of our teaching, with up-to-date teaching and learning alongside a ‘conscious and coordinated approach to integrating employability into the curriculum’.
It was especially gratifying to hear that our real-world approach to teaching and learning means that ‘many assessments on modules are now explicitly employability-focused, offering students the opportunity to replicate tasks they might find themselves undertaking in the workplace’.
We appreciated how much time the assessors took to listen to and understand the needs of our students. The process also helped staff reflect further on their pedagogic and disciplinary approach and had some similarities with the Quality Assurance Agency subject reviews for those who have been in the sector long enough to be familiar with them.
But we do have some concerns about the experience. These are linked to a lack of transparency and the additional burden this creates, as well as the media attention that was brought to bear on what should have been a routine event. With additional sector reviews now also undertaken in computer science and in relation to grade profile, an increasing number of providers have experienced this assessment approach and it seems timely to provide some reflections on the process.
When we were first informed that the OfS intended to assess our courses, we were keen to understand the basis for this investigation – especially as it was launched before the B3 (baselines for student outcomes) consultation concluded and indeed the OfS had indicated it was not therefore based on B3 indicators.
Throughout the process, we found the regulator resistant to providing a clear and transparent answer to this question. Its response was always that the investigation was ‘in line with the OfS’s powers under the Higher Education and Research Act 2017, which include the ability to proactively investigate whether or not a provider is complying with conditions of registration and/or whether there are any wider concerns that may warrant regulatory intervention’.
It is evident in our final report that there is still no clear explanation for why we were selected – with the report simply summarising the broad basis upon which the OfS has powers to initiate an assessment of any provider. We therefore found ourselves entering (and, as it has transpired, exiting) this process without any clear statement from the OfS on the reasons for our selection.
Consequently, we had to prepare for the assessors potentially wanting to review every aspect of our provision. This necessitated high levels of preparatory work from colleagues across the institution, given there was a request for multiple materials as well as access to the virtual learning environment, and an expectation that, alongside student and staff meetings, there would be teaching observations. We felt it was unreasonable not to be given a clear idea of the initial lines of enquiry at an earlier stage, as it left everyone feeling this was more of a ‘fishing trip’ than a review conducted against a clearly defined risk-based framework. This uncertainty was exacerbated by the fact that the OfS had not recruited and trained expert assessors at the time we were notified. This meant it was several months before we were given any indication as to what the team might be interested in seeing and how they would proceed.
The process itself did, however, have several positives. Once the team were in place, we found them receptive to our initial feedback on the organisation of their visits.
The assessors took a rigorous and constructive approach, encouraging us to provide evidence for the things we said, and giving us fair opportunity to do so. It was also good to see that the team included assessors from a range of institutions, reflecting the wider set of approaches to pedagogy in the sector.
But at the conclusion of our visits, we were once more subjected to a lack of clarity. Unlike an Ofsted inspection, where the lead inspector will offer school leaders a summary of their observations and provisional outcome, we were told we would hear from the OfS ‘in a few months’.
We were unclear not only when we would hear, but what any assessor report might look like, how long we would have to review it, when the OfS response would be made and what range of outcomes it might contain.
During this time, we had to reassure staff who felt that they were under an uncertain and unclear scrutiny, and also support our student leaders and those students who were part of the assessment process. That we were able to do this without too great a fracture of staff and student morale is testament to the strong foundations that our business provision has. But it took its toll and there were colleagues who found this highly stressful.
Even now, as the report is published, press releases from the OfS make it clear they are yet to come to a judgement, saying ‘[w]e will now carefully consider their findings as we decide whether any further regulatory action is appropriate in individual cases’.
A final reflection on this process – and the news attention that has surrounded these assessments – is that providers are already very used to such scrutiny. Professional, statutory and regulatory bodies (PSRBs) have their own approaches to course review and validation, and in many instances the level of scrutiny can greatly exceed that of the OfS.
During such reviews, providers will routinely be given recommendations for areas to develop or enhance, and these recommendations are announced without any media fanfare. For example, at LSBU, the majority of our students study on accredited courses and we work with over 40 PSRBs on our undergraduate provision alone, with many more for our diverse postgraduate provision. When we are assessed, the reports generate useful insight and recommendations that lead to further course innovation, to the benefit of our students.
Issues around the burden of regulation have been discussed extensively in recent days but, from our experience, one of the most significant implications of these visits has been the uncertainty and the related impact on the wellbeing of staff, compounded by stories in the press about how reviews will find low-quality provision. We have of course seen significant national discourse regarding the impact of Ofsted visits on the wellbeing of teachers and school leaders, and a commitment from Ofsted to do more to minimise the stress and uncertainty associated with some aspects of their inspection framework.
We know the OfS investigation has been difficult for everyone involved in this process, but it did not have to be this way, and we worry that there is a risk that the stakes have been raised too high. The new framework has much potential and, if the OfS can increase clarity and transparency (as the B3 work is beginning to do), it will be of great benefit to the sector, students and colleagues, who could then approach the process without the anxiety it currently generates. It has to be recognised that an inspection team can review all aspects of provision, but it is not unreasonable to expect information on the triggers. We also need to counter the current media hype around such reviews, as they are a core part of the new framework. I hope that writing this helps others feel more able to engage in the kind of constructive and open review that we are used to as a sector, based on a model of learning enhancement.
It may be that many of the challenges we faced relate to teething problems with a new framework, but we hope the OfS takes our feedback on board in the spirit of collegiality and that our experience helps to refine the process for other institutions.