What phase one of the national Pre/Post-arrival Academic Questionnaire pilot has told us so far
Over the weekend HEPI published a blog on international tuition fee setting and a blog calling for a new academic integrity charter.
This blog was kindly authored by Michelle Morgan, Dean of Students, University of East London; Jonathan Neves, Head of Research & Surveys, Advance HE; and Thandi Gilder, Researcher, Advance HE.
In February 2025, Advance HE, Jisc and the University of East London were awarded an Office for Students Innovation grant to run the first national pilot of a Pre/Post-arrival Academic Questionnaire (PAQ) across undergraduate and postgraduate taught levels of study. The bid was informed by the work of Dr Michelle Morgan, who has been running PAQs for many years.
The PAQ, originally called the Entry to Study Survey, was formalised via the Higher Education Funding Council for England (HEFCE) funded Phase 1 Postgraduate Support scheme (2013-2016). The survey was designed to explore how to re-energise the UK-domiciled postgraduate taught market, which had been in decline since 2010/11, by listening to and acting on the concerns and expectations of incoming students to help them navigate this level of study.
The purpose of the PAQ
The PAQ is not a national measuring tool like the NSS (which is used to evaluate UK higher education quality), and neither is it a marketing activity to provide information to prospective students. The PAQ is designed to be a pre- or post-arrival piece of coursework. Students are informed of the purpose of the coursework and are aware that by completing the PAQ they are consenting to the use of their information to enhance their experience and that of others.
There are numerous benefits for both the institution and ultimately the student, including:
- Taking entrants through a reflective learning journey that gets them thinking about their upcoming studies, while ensuring parity across courses through a meaningful pre-arrival or arrival activity.
- Using the findings to inform real-time, targeted support for incoming students based on what they have told us. This helps us to understand how prepared students genuinely are, supporting the work of Outreach and Access and Participation teams.
- Highlighting the similarities and differences between undergraduate and postgraduate taught students, enabling focused support to be provided and avoiding a ‘lift and shift’ approach that may not be suitable from one level of study to the next.
For the sector, some of the overarching objectives for this OfS EOO Innovation funded project include:
- To establish consistency in how the sector collects and acts upon information from students upon arrival around their learning styles, expectations, challenges and requirements.
- To provide robust data-led evidence to enable institutions to address inconsistencies in how different groups of students (for example, by social background, qualification type, geography and demographics) begin their learning and develop a platform to progress to good outcomes.
You can read more about the aims and objectives by going to: Equality in Higher Education Innovation Fund – Office for Students
Institutional and sector benchmarking reports
As with the PTES and PRES surveys, institutions can access their own results in real time via the Jisc platform and receive sector benchmark comparisons promptly after the survey closes, so they can see the differences and similarities in responses. As well as enabling them to tailor their support for their student body, it also creates awareness about the differences between students, institutions and mission groups. This can help inform not only institutional strategies, policy and initiatives but also national approaches.
More detailed benchmarking by type of institution and by demographics will be published soon, aligned with access and participation, continuation and possible risks to retention, and NSS/PTES learning and teaching satisfaction.
Phase one of the pilot and the delivery of the Pre/Post-arrival Academic Questionnaire
Phase one has provided the opportunity for participating institutions to explore the best method for them to deliver the PAQ, and how they can use the data to best provide support to students and guidance and advice to staff. Some institutions decided to run it as a post-arrival activity in class rather than a pre-arrival one, which is one of several ways in which it can be delivered. Others targeted large courses instead of all students. The conference for participating institutions on 23 April at the University of East London will collate case studies and good practice, which will be published.
Some brief headline findings
Phase one of the national PAQ pilot has produced results that confirm much of what we know about our incoming students at undergraduate and postgraduate taught levels, but it has also challenged many assumptions, especially about their prior learning experience, their experience of Generative AI and their expectations of higher education study.
Example 1
Information for applicants and new students is commonly structured around the typical UCAS application cycle. However, if you knew that 20% of your undergraduate students were coming through Clearing because they had not applied through the main cycle, whereas at other institutions Clearing accounted for only 10%, how would you change your pre-arrival, arrival and orientation information and activities for these students?
Example 2
Students are commonly advised that engaging in sports, clubs and societies will help them settle in and develop a sense of ‘belonging’ at their university. However, if you knew that only 16% of your incoming undergraduates and 30% of your postgraduate taught students said this was important in their previous place of study, and that connecting with staff and other students mattered more, how would you provide information and create activities within the course to help them settle in and develop a sense of belonging?
Example 3
After the Covid-19 pandemic, it was assumed that online ‘teaching’ had increased students’ digital learning skills in schools and colleges. However, if you knew that the main learning materials undergraduates accessed in schools and colleges were still handwritten or typed notes taken in class and a course textbook, that only 31% had experience of using a virtual learning environment (examples were provided) and that only 24% had experience of using a library to access electronic and hard-copy learning materials in their last place of study, how would you change your orientation activities to embed library and VLE sessions?
Example 4
There has been an increase in mental health declarations via UCAS in recent years. However, if you knew that only 6% of non-UK undergraduate and 5% of non-UK postgraduate taught respondents stated they had declared a mental health condition, compared with 17% and 14% of UK respondents respectively, what support would you put in place to encourage international students to reach out for support?
Example 5
Unsurprisingly, preference is often shaped by prior experience and what one is used to. So, if you knew that 49% of undergraduates preferred to receive feedback via individual in-person discussions with their teacher or tutor, how would you manage this expectation and preference, especially given that at university feedback is provided in numerous ways, though not all are made explicit?
Example 6
We know the cost-of-living crisis is leading more students to undertake paid work during their studies, and this is also an expectation on entry: 47% of undergraduates and 48% of postgraduate taught students expect to undertake part-time work during term time. However, the reality for undergraduates once they begin their studies is around 68% (from the HEPI-Advance HE Student Academic Experience Survey), so what support would you put in place to help students manage these likely increased demands on their time?
Example 7
And finally, we assume that students are experienced in using Generative AI. However, if you knew that only 65% of undergraduates and 71% of postgraduate taught students stated they had experience of Generative AI, and that this was primarily using ChatGPT and Grammarly to correct spelling or grammar mistakes, explore topics of interest, summarise information or text and explore concepts, how would you build guidance on the use of Generative AI into courses?
These are just a small handful of the informative findings that have been collected in this national pilot across 15 institutions. In the coming weeks, sector experts who are members of the National Pilot Working and Steering Groups will be analysing the data and producing blogs on the results in their specialist areas.
Phase one of the pilot has proved valuable for participating institutions.
We were delighted to participate in the PAQ pilot, which has yielded significant insights into the experiences of our latest cohort. While some data points confirmed our existing assumptions, the findings also highlighted several nuanced trends we hadn’t previously expected. The resulting report has been exceptionally well-received across the institution, and we are already moving to action these findings to ensure we are continuing to provide the highest level of support and encouragement for our students.
Clare Hutton, Data and Visa Manager, Norland College
Phase two next steps
A call for participation in Phase two will take place imminently. If you are interested, you can contact Jonathan Neves at [email protected]