Balancing the COVID-19 Disruption to Undergraduate Learning and Assessment with an Academic Student Support Package: Implications for Student Achievement and Engagement

In response to the COVID-19 pandemic-induced emergency pivot to online teaching and assessment, an Academic Safety Net was implemented at a regional Australian university to provide academic student support. Bayesian hierarchical models were used to compare student performance between 2019 and 2020. More students withdrew from subjects in 2020, while fewer students remained enrolled but failed. While there was no overall year effect for overall student achievement, exam achievement increased and on-course assessment achievement decreased in 2020. When achievement was analysed according to assessment task change, a year effect emerged, with the magnitude and direction of the effect dependent on the task changes. The results indicate that the enrolment component of the Academic Safety Net was an effective equity measure that gave students an extended opportunity to self-withdraw in response to general impacts of the pandemic, while the results component protected the integrity of results awarded during the emergency pivot.


Introduction
James Cook University (JCU) is a regional multi-campus Australian university. The undergraduate student population is diverse, with approximately half of the student cohort being the first in their family to attend university, and with substantial representation from low socio-economic status (SES) backgrounds and regional and remote areas. JCU's identity is that of an inclusive, teaching and research focussed university with the intent of "creating a brighter future for life in the tropics worldwide through students and discoveries that make a difference" (JCU, 2020a, p. 28). Reflective of this intent, the undergraduate course architecture is dominated by profession-based courses which are delivered internally.
As a consequence of the declaration of the COVID-19 coronavirus pandemic in early 2020, JCU instigated a one-week teaching pause to transition most face-to-face learning, teaching and assessment activities, including end-of-semester examinations, to online delivery. Academic staff were provided with a selection of online assessment types to choose from, with each option accompanied by recommendations regarding suitability, authenticity, and implementation. Changes to assessment activities were approved through college-level governance processes, and were communicated to students via the Learning Management System (LMS) and distribution of approved, revised subject outlines. Universities across Australia implemented similar responses in accordance with government restrictions (QILT, 2021). While universities have effectively used online tools and resources to maintain communication and provide support for staff and students during natural disasters (Dabner, 2012), the success of rapid transitions to online curriculum in response to disasters is less evident. Transitioning from face-to-face to online learning and teaching takes time, resourcing and detailed planning (Keegwe & Kidd, 2010) and requires the development of new skills, changed teaching practices, redevelopment of curriculum, and multi-faceted support from various stakeholders (Bennett & Lockyer, 2004). In the absence of purposeful, pedagogically-based design for online learning, the rapid transition to online learning in response to the COVID-19 pandemic was initially referred to as "emergency remote teaching" (Hodges et al., 2020, p. 1), a deliberate move to acknowledge the lack of design, planning, and resourcing behind such transitions. Sullivan (2020) warned that online teaching can exacerbate inequities in the enacted curriculum and argued that a critical pedagogy approach should be used for developments such as the move to an online teaching mode.
It is likely that the emergency pivot to online teaching and assessment disproportionately increased hurdles to access and success with many JCU undergraduate students reliant on the on-campus provision of internet access and study-based infrastructure. Reported student experience of online exams is mixed. Increases in exam anxiety and concomitant reduction in student grades were reported by Woldeab and Brothen (2019). Milone et al. (2017) found that 89% of students using live proctoring in an exam were satisfied with the experience; notably these students were enrolled in online courses, so may be expected to have a suitable internet and hardware setup. In a study examining the experience of students completing online proctored exams during the COVID-19 pandemic, Kharbat and Abu Daabes (2021) found the majority of students had concerns over privacy, technological issues and home suitability, in terms of noise and distractions, for completing exams. Both technological preparedness and home suitability are likely to disproportionately affect students in equity groups. The majority of students interviewed by Kharbat and Abu Daabes (2021) found it difficult to concentrate on study during the COVID pandemic. Isolation resulting from lockdown, distraction in the home environment and disruption to face-to-face classes may also have increased disparity in terms of access to education for some students (Lowitja Institute, 2021).
As there was no way of predicting the impact of the emergency pivot on student performance, JCU implemented an Academic Safety Net (JCU, 2020c) in June 2020 to provide an institution-wide mitigation for unexpected disadvantages to student success arising from staff's lack of preparedness for teaching online and from students' limited preparation and resources for online learning and assessment. To reduce deleterious impacts of the emergency pivot on course grade point average (GPA) during the foundational years, students enrolled in first and second year graded subjects received ungraded passing results of Satisfactory instead of pass/credit/distinction/high distinction. The ungraded results did not contribute to the course GPA; therefore, the course GPA after study period 1 (SP1) 2020 for these students remained at the GPA calculated at the end of the 2019 academic year. The non-GPA results were explained on the student academic transcripts as follows: "Due to the special circumstances affecting students' study in 2020, many subjects have 'S' (Satisfactory) as the highest result available. Results of 'S' are excluded from GPA calculations." Students in years three and above may require grades for employment and postgraduate study applications, so students completing these subjects received graded results but could individually opt in to non-GPA results after receiving their results. Additionally, students at all year levels were withdrawn from any failed subjects before result release so that the academic penalties of subject failure were removed. The deadline for withdrawal from subjects without financial penalty (census date) was extended by three weeks, and self-withdrawal without academic penalty remained open until the commencement of the end-of-semester examination period. The Academic Safety Net applied to enrolments and results for subjects/subject chains that commenced after 1 February 2020 and before 27 July 2020.
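The GPA mechanics described above can be illustrated with a short sketch. This is not the institution's actual calculation: the grade-point values and credit weighting below are illustrative assumptions; only the rule that 'S' results are excluded from the GPA calculation comes from the description above.

```python
# Illustrative sketch of a GPA calculation that excludes ungraded
# 'S' (Satisfactory) results, as described for the Academic Safety Net.
# The grade-point mapping is an assumption for illustration only,
# not the university's official scale.
GRADE_POINTS = {"HD": 7, "D": 6, "C": 5, "P": 4, "N": 1.5}

def gpa(results):
    """Credit-weighted GPA over graded results only.

    results: list of (grade, credit_points) tuples. Results with
    grade 'S' carry no grade points and are skipped entirely, so a
    semester of all-'S' results leaves the GPA unchanged.
    """
    graded = [(g, cp) for g, cp in results if g != "S"]
    if not graded:
        return None  # no graded results: GPA remains as previously calculated
    total_points = sum(GRADE_POINTS[g] * cp for g, cp in graded)
    total_credit = sum(cp for _, cp in graded)
    return total_points / total_credit

# A hypothetical first-year student: graded 2019 results followed by
# all-'S' results in SP1 2020 under the Safety Net.
year_2019 = [("C", 3), ("D", 3), ("P", 3)]
year_2020 = [("S", 3), ("S", 3), ("S", 3)]
print(gpa(year_2019))
print(gpa(year_2019 + year_2020) == gpa(year_2019))  # True: 'S' excluded
```

Under this sketch the student's GPA after SP1 2020 equals the GPA calculated at the end of 2019, mirroring the policy's intent.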
The Academic Safety Net Guidelines were approved by the Deputy Vice-Chancellor (Students) and published in the online institutional policy library. The Academic Safety Net was communicated to students through official email correspondence from the Director of Student Services, with simultaneous postings on the institutional COVID-response webpage for ongoing reference. The COVID-response webpage was regularly updated with FAQs on a range of matters relating to institutional operations, personal health and safety, and support.
The aim of this research was to determine whether, and to what extent, student academic performance was impacted by the COVID-19 pandemic and the concurrent emergency pivot to predominantly online learning and assessment, as compared to the academic performance of the previous year's students. The findings may assist universities in determining whether compensatory academic support processes such as an Academic Safety Net should be implemented when reacting to future unexpected interruptions to operations at an institutional level. Students enrolled in future subjects are expected to benefit from the implementation of the recommendations for mitigating disadvantage while maintaining assessment quality during emergency pivot situations. Factors influencing changes to assessment quality are also discussed.
The following research questions were explored:
1. Did undergraduate student academic performance in subjects decrease in 2020 in comparison to the 2019 student cohort?
2. Did the proportion of students enrolling in and maintaining enrolment but not completing the subject requirements increase in 2020 compared to the 2019 student cohort?
3. Did on-course and final examination achievement show similar changes to overall achievement between 2020 and 2019?
4. Did the magnitude of change in student achievement between 2020 and 2019 differ across the various types of changes to examination modes and conditions?
5. Did the implementation of the Academic Safety Net minimise detrimental effects on student academic performance outcomes?

Participants
Participants comprised undergraduate students enrolled at JCU's Australian tropical campuses in subjects owned by the College of Healthcare Sciences that were offered in SP1 of 2019 and 2020. There were 130 subjects offered in the College in SP1. Postgraduate and sub-degree subjects were excluded from the study (n=30). Because of the nature of the comparison between the years, any subjects with changes in the assessment profile (assessment genre and/or weighting) between the two years would comprise an invalid comparison and so were excluded from the analysis (planned assessment changes n=8; changes due to COVID-19 n=5). Subjects without a GPA grade and chained subjects would provide no achievement data and so were also excluded (n=40). New subjects and subjects in teach-out provide no comparison of the achievement of students over the two years and were excluded from the analysis (n=9). Thirty-eight subjects provided a valid comparison between the years and were included for analysis.
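The inclusion arithmetic above can be checked with a trivial sketch; the dictionary keys paraphrase the exclusion categories just listed.

```python
# Subject inclusion arithmetic for the study, paraphrasing the
# exclusion categories described in the Participants section.
total_subjects = 130
exclusions = {
    "postgraduate and sub-degree": 30,
    "planned assessment changes": 8,
    "assessment changes due to COVID-19": 5,
    "no GPA grade or chained subjects": 40,
    "new subjects or subjects in teach-out": 9,
}
included = total_subjects - sum(exclusions.values())
print(included)  # 38 subjects provided a valid comparison
```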

Research Design and Method
Approval from the institutional privacy officer and the institutional Human Research Ethics Committee to use archived student data was obtained for this study (Human Research Ethics #H8257). This study involves retrospective, census-type quantitative analysis of interval data. The entire cohort of student results was compared for all subjects meeting the inclusion criteria for 2019 and 2020. Final subject results were accessed from archived College assessment meeting files. Student achievement was summarised with the following calculated statistics for each student: overall subject achievement; on-course achievement; final exam achievement; and practical exam, Viva exam and Objective Structured Clinical Examination (OSCE) achievement. Students who withdrew from subjects before results finalisation were not included in the archived data. The change in mode or conditions of assessment tasks due to COVID-19 was captured with a categorical variable at subject level. The possible categories for "task change" depended on the types of assessment present and the changes that were made in response to COVID-19 (see Table 1). Online exams denote a change from a face-to-face invigilated written exam to an online test within the LMS with open-book conditions. Browser exams comprised an online test within the LMS with browser control software deployed, enabling open-book conditions while the computer is restricted to displaying only the test. Browser plus camera exams comprised an online test within the LMS with browser and webcam control software deployed, enabling closed-book conditions as the computer is restricted to displaying only the test and the student is recorded during the exam via webcam; video recordings are saved to the LMS and reviewed by staff. Timed assignment exams were time-limited open-book assignments within the LMS during the exam period. On-course assignments involved the replacement of the exam with an assignment during semester.
Table 1 (fragment; the opening rows were lost in extraction)

Task change   Final exam                 Mid-semester exam       Practical/Viva/OSCE
              Browser + camera (4)       Browser + camera (4)    No change (1)
5.0.0         Timed assignment (5)       None (0)                None (0)
5.5.0         Timed assignment (5)       Timed assignment (5)    None (0)
6.0.0         On-course assignment (6)   None (0)                None (0)

The student names and identification numbers were replaced with a 10-digit alphanumeric code that is unique to each individual, yet cannot be backwards-calculated to re-identify the student name or ID. This non-reversible, SHA-256 hash conversion was performed by running code with the "deidentifyr" package (Wilkins, 2020). This method allowed students completing more than one subject to be recognised as a source of variation within the analysis. As a significant portion of the total variation was expected to be student driven, this substantially improved the power of the analysis to recognise effects of year and task change within the combined student results in subjects.
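The hashing step described above can be sketched as follows. The authors used the R package "deidentifyr"; the Python function below is an illustrative stand-in rather than their actual code, with the truncation to 10 characters mirroring the 10-digit code described above.

```python
import hashlib

def deidentify(student_id: str) -> str:
    """Map a student ID to a 10-character alphanumeric pseudonym.

    SHA-256 is a one-way hash, so the original ID cannot be recovered
    from the code, yet the mapping is deterministic: the same student
    appearing in several subjects receives the same code, preserving
    student as a recognisable source of variation in the analysis.
    """
    return hashlib.sha256(student_id.encode("utf-8")).hexdigest()[:10]

# The same (hypothetical) student enrolled in two subjects receives
# one stable code; a different student receives a different code.
code_a = deidentify("jc123456")
code_b = deidentify("jc123456")
print(code_a == code_b)  # deterministic mapping
print(len(code_a))       # 10-character code
```

One design note: truncating the 64-character digest to 10 characters slightly increases the theoretical chance of two students colliding on the same code, which is negligible at cohort scale but worth stating for any reuse of this approach.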

Statistical Analysis
Bayesian hierarchical models (Gelman et al., 2013) were used to investigate the interactive effects of year and either task change category or student year level on student completion rates, pre- and post-census student withdrawal rates, student performance in final written exams, on-course assessment, practical exams, Vivas and OSCEs, and overall subject achievement scores. More specifically, all models included population-level effects of year crossed with either task change category or student year level, nested within subjects. Models that explored effects on completion and withdrawal rates were modelled against a beta-binomial distribution, whereas models that explored effects on student scores also included the additional varying effect of student, so as to accommodate the individual variation amongst students, and were modelled against a beta distribution (logit link). Hierarchical models provide a framework for analysing designs in which some levels are nested within other levels (e.g. students nested within subjects) whilst accounting for differences between subjects and students as well as the dependency structures introduced by repeated measurements of subjects and students. All models were fit using the brms package (Bürkner, 2017) with three Markov Chain Monte Carlo (MCMC) chains, each of which had 3000 iterations, a burn-in of 1500 and a thinning rate of 5; weakly informative priors were chosen to improve sampler stability and help constrain parameter estimates to within sensible domains. MCMC diagnostics suggested that all models converged on stable posteriors (R-hat values < 1.02). Typical regression model assumptions (pertaining to the structure of residuals) were explored using a combination of residual plots and Q-Q plots. All analytical procedures were completed within the R 4.0.4 Statistical and Graphical Environment (R Core Team, 2021).
Effects were calculated from the posterior likelihood and expressed as the median absolute difference (±95% highest posterior density credible interval) between years for each task change category separately, as well as pooling (averaging) over task change, weighted proportionally to the number of students. Exceedance probabilities (ExceedP), the proportion of posterior draws for a specific contrast that exceeded 0, were also calculated and interpreted as the degree of belief that the associated effect was greater than 0. As such, ExceedP values greater than 0.8 are considered weak evidence of an increase, and those greater than 0.95 are strong evidence of an increase. Similarly, ExceedP values smaller than 0.2 are weak evidence of a decrease and those less than 0.05 are strong evidence of a decrease. All data preparation and analysis code can be found at https://github.com/pcinereus/covid19_assessment
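The exceedance probability calculation is simple to express in code. The sketch below (an illustrative Python stand-in for the authors' R workflow) applies it to simulated posterior draws, with the evidence labels following the thresholds stated above.

```python
import numpy as np

def exceedance_probability(draws):
    """Proportion of posterior draws for a contrast that exceed 0."""
    draws = np.asarray(draws)
    return float((draws > 0).mean())

def interpret(p):
    """Evidence labels following the thresholds used in the study."""
    if p > 0.95:
        return "strong evidence of an increase"
    if p > 0.8:
        return "weak evidence of an increase"
    if p < 0.05:
        return "strong evidence of a decrease"
    if p < 0.2:
        return "weak evidence of a decrease"
    return "no clear evidence of change"

# Simulated draws for a hypothetical 2020-minus-2019 contrast,
# centred above zero (not real study output).
rng = np.random.default_rng(42)
draws = rng.normal(loc=1.5, scale=1.0, size=900)
p = exceedance_probability(draws)
print(p, interpret(p))
```

In the study itself, the draws would come from the brms posterior for each year-by-task-change contrast rather than a simulation.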

Results
A range of changes to assessment were used across subjects in response to the emergency pivot, although with variable frequency. Because cohort size varied across subjects, these factors resulted in an unequal distribution of subjects and students across the task change categories and for different year levels. In 2020, the total number of student results in the 38 included subjects was 4299, compared to 5238 in 2019, and this reduction occurred for all task changes (range across categories of 2020 as a % of 2019: 54-86%) except for the two 4th year subjects included in 2.0.2, where student numbers were 1.5 times greater in 2020 than 2019. Over 55% of results were in the 11 subjects (mostly at 1st and 2nd year) that had no change to assessment task; approximately 20% of results were for task change 4.0.0, which comprised 1st, 2nd and 3rd year subjects, and the remaining 25% of student results occurred across the other nine types of task changes, also including 1st, 2nd and 3rd year subjects.

Figure 1
Student achievement for each task change category in 2019 and 2020. Notes: (a-d) The mean and 95% credibility intervals for (a) total achievement (%), (b) exam score (%), (c) on-course score (%) and (d) Prac, Viva and OSCE score (%) in 2019 and 2020 for each task change category; (e-h) change in (e) total achievement, (f) exam score, (g) on-course score and (h) Prac, Viva and OSCE score (2020-2019) and 95% credibility intervals for each task change category.
Total achievement of students in the 10 first year subjects increased (Effect size=1.4%, ExceedP=1.0). There was no evidence of change in achievement for the 11 second year subjects (Effect size=3.2%, ExceedP=0.77). Total achievement across the 13 third year subjects decreased (Effect size=-1.5%, ExceedP=0.0). Fourth year subjects were represented by two subjects in one task change category, 2.0.2.

On-Course Achievement
There was an effect of year for on-course achievement pooling over students, with lower scores in 2020 (Effect size=-1.4%, ExceedP=0.0).

Practical Exam, Viva Exam and OSCE Scores
Task change categories showing evidence of increased scores on practical, Viva and OSCE exams comprised only 4.0.1 (Effect size=1.5%, ExceedP=0.86) and 2.0.2 (5.0%, ExceedP=0.93). Weak evidence of a decline in Viva exam scores was noted in task change category 1.0.1 (-1.7%, ExceedP=0.18). The remaining practical, Viva and OSCE exam categories showed no change in student performance.

Safety Net Withdrawal Date Extension
The decline in the number of student results between 2019 and 2020 is likely due to a combination of expected and unexpected events in 2020. An expected, substantial decrease in 2020 first year enrolments to university resulted from the reduced cohort graduating from Queensland high schools in 2019 (Queensland Audit Office, 2021). Pre-census withdrawals also account for the decline in cohort sizes at subject completion, particularly as the census date and the date for withdrawal without academic penalty in SP1 2020 were extended at JCU as part of the Academic Safety Net. In the college, pre-census and post-census withdrawal rates (either voluntary or as a conversion of a failing result under the Academic Safety Net) increased, while the proportion of students who remained enrolled but failed to complete subjects (submit all assessment) remained steady in 2020. This observation indicates that the Academic Safety Net parameters of extended timeframes for census and for withdrawal without academic penalty, and conversion of completed but failed results to withdrawals without academic penalty, in recognition of additional hurdles to study and success as a result of COVID-19 impacts, were an effective equity measure, mitigating COVID-19-related impacts on GPA and subject fee accrual. Increased withdrawal from subjects is consistent with increased student experience of hurdles to university engagement as a result of COVID-19. Increased difficulties relating to personal circumstances, mental health challenges, financial challenges, childcare and home-schooling commitments, in addition to study-specific challenges during COVID-19, have been widely reported (e.g. Baticulon et al., 2021; Lyons et al., 2020; Pokhrel & Chhetri, 2021). Although most classes in the college switched to online delivery, lockdowns enforced in some Australian states were not deployed in Queensland (Parliament of Australia, 2020), so students were able to access university spaces such as the library.
For example, despite a drop of 47% in in-person library use, over 400,000 in-person visits took place, supplemented by a 44% increase in online library visits, indicating success for the continued access to and provision of support for students (JCU, 2020b). This continuation of access to on-campus study spaces, plus the expansion of online access and support, was reflected in the increase from 77% to 79% (2019 to 2020) in student support ratings by JCU students in national survey data, contrasting with the national average of 74% across both years. Despite the positive impact on student support, JCU and sector-wide QILT data indicate that students experienced lower levels of engagement with their studies, teachers and peers (QILT, 2021), likely due to the emergency pivot to online learning, noting that the JCU decline was less than that of the overall sector (JCU 63% down to 51%; sector 60% down to 44%).

Non GPA Grades Component of Safety Net
The lack of overall effect on student achievement in 2020 was a result of competing trends within subsets of the data. Increases in total achievement were seen in five categories, decreases in one and no evidence for change in the remaining categories.
Competing trends were also seen between on-course and exam achievement. The lack of overall trends in total achievement may be explained by increases in exam score counteracting decreases in on-course achievement for some task change categories, in addition to different trends across different task change categories. While most students would not have received a lower grade had the safety net not been introduced, the reduced mean achievement (an effect size of 1.1%) seen in task change category 1.0.0 would have resulted in some students receiving a result in a lower grade band and therefore a decreased course GPA. As an equity measure, this turned out to be needed by fewer students than the other parameters in the Academic Safety Net. The increased total achievement reported for several task categories suggests that the Academic Safety Net also had the effect of protecting the integrity of subject results awarded during COVID-19.

Comparative Effect of Types of Assessment Change
The different patterns seen in different task change categories, and between on-course and exam achievement, reflect the complex nature of COVID-19 effects on assessment preparation, facilitation and performance. Reduced on-course performance in several categories, and reduced total achievement and exam score in task change 1.0.0 (which had no changes to assessment, owing to the lack of invigilated exams in these subjects), are consistent with the expected hurdles to learning and teaching during the COVID-19 affected semester. These results indicate that student performance was affected by general COVID-19 conditions and/or the teaching of content, as the assessment was unchanged in terms of weighting, mode and scope, and therefore in workload requirements and expectations.
The generally consistent nature of the practical, Viva and OSCE exam results may be attributed to the re-introduction of face-to-face practical classes and related assessment with reduced class sizes. So the students, while initially disrupted by the lack of classes, eventually received similar tuition to 2019, albeit with higher staff-to-student ratios and after completing the theoretical components that were taught online and on schedule. The delay in practical classes and the associated assessment may have allowed students to assimilate the theoretical knowledge before attempting its practical application. Task change category 2.0.2 bears individual discussion, as the increase in achievement was marked, with an effect size of 5%. This category comprised two closely related subjects in which the practical assessment was changed from face-to-face to online: students watched a video and made written responses, rather than making verbal responses and physically demonstrating skills on request. The increase in results for the practical exam, in conjunction with the lack of effect in on-course achievement, suggests that the students did not have greater acquisition of the subject content, but rather that this exam format has reduced validity. Use of this format requires careful consideration of the difficulty of the task, moderation of responses and alignment of the skills demonstrated, so that the exam does assure the intended learning outcomes of the subject.
In contrast to the reductions in on-course scores and steady achievement in practical, Viva and OSCE exams, written exam achievement increased, with large effects in some categories. This implies that exam design was less effective in 2020, and that reductions in exam validity and/or security allowed increases in score that may not reflect student knowledge and skills. Interestingly, the large improvements in exam score were not evidenced in subjects which changed from invigilated, closed-book conditions to open-book conditions, but in those that changed to timed assignments and those that employed the web browser and camera control software. Staff who moved exams from closed-book, invigilated conditions to open-book conditions (i.e. categories 2.2.0, 2.0.2 and 3.3.0) were cognisant of the ramifications of the changes and redeveloped the exam questions to emphasise application and synthesis in an open-book environment. In contrast, those subjects that used webcam and browser software to seemingly enforce closed-book, invigilated conditions did not appreciably change exam question types. The increases in exam score, and the larger effect size when mid-semester exams also utilised this software, raise concerns for the security of exams in these conditions, where invigilation via the lockdown browser and the webcam was assumed but not assured. The timed assignment exams (categories 5.0.0 and 5.5.0) also showed a larger effect when mid-semester exams were also changed to timed assignments. As on-course achievement decreased in both these categories, markedly in the case of 5.0.0, the elevated exam achievement is likely not reflective of increased student articulation of knowledge and skills of subject content. The exam design is possibly less valid in terms of question type or timing of the exam tasks and merits further investigation.
Overall, the results of this study provide mixed support for the specified research questions. While there was no overall year effect on subject-based academic performance when all student results were pooled, there were significant year effects of varying magnitude and direction when performance was analysed according to the assessment task change category. Furthermore, while exam performance increased, on-course assessment performance decreased in 2020, indicating a complex pattern of contributing factors related to content delivery and learning, assessment practices, student support and engagement, and more general impacts of COVID-19 on lifestyle. The Academic Safety Net non-GPA grading system nullified the year effect on passing subject performance and thus protected the student course GPA and the integrity of results awarded by the College, noting however that this whole-of-cohort strategy was not implemented across higher year levels. The Academic Safety Net enrolment support parameters of extended census and withdrawal dates and of converting 'completed but failed' results to withdrawals also benefited students through the provision of no/low academic and financial penalties for late enrolment decisions and poor academic performance, which coincided with fewer students remaining enrolled but failing due to non-completion. The enrolment-based component appears to have had a greater beneficial impact than the results-based component of the Academic Safety Net.
The authors provide the following equity-based recommendations for universities to academically support students and protect assessment integrity during continued COVID-19 impacts, and for future disaster declarations that require emergency pivots to teaching and assessment, and therefore to learning: i) provide avenues to withdraw late with minimal or no academic and/or financial penalty, in recognition of the complex factors that impact students during unexpected disasters; ii) carefully consider assessment design and examination conditions to optimise assessment integrity and assurance of learning; and iii) consider whether result-based safety net parameters should be implemented across all year levels and all types of subjects as a blanket equity approach, or whether a nuanced approach is warranted to optimise student support and future opportunities for success.
While this study provides observations and recommendations based on rigorous statistical analysis, the findings are limited to undergraduate health-based subjects that meet strict inclusion criteria and as such may not be representative of the whole-of-institution impact of the Academic Safety Net, and may not be transferable across all assessment genres. Furthermore, the study is limited to retrospective assessment scores and results and as such does not capture the views or experiences of students beyond what can be evidenced from the data presented.

Conclusion
Changes in student enrolment and performance that occurred between 2019 and 2020 were likely caused by the emergency pivot to online teaching and assessment, and the Academic Safety Net provided an institutional mechanism to address this impact. The enrolment components of the Academic Safety Net provided students with an extended opportunity to self-withdraw without penalty, and the results component protected the course GPA of students while simultaneously protecting the integrity of awarded results to minimise unintended consequences of the unplanned and un-designed emergency pivot. The authors recommend that universities consider using similar support mechanisms in the event of future disaster declarations, while highlighting the need for careful consideration of assessment design and examination conditions to optimise the integrity and authenticity of assessment.