The Power of the Nudge: Technology Driving Persistence

Providing timely nudges to students has been shown to improve engagement and persistence in tertiary education. However, many studies focus on small-scale pilots rather than institution-wide initiatives. This article assesses the impact of a pan-institution Early Alert System at the University of Canterbury that nudges students at risk of disengagement. Once flagged, students received an automated text message and email encouraging re-engagement with the learning management system. Students who received the nudge re-engaged at a higher rate and spent more time engaging with online material, and these benefits were sustained over two weeks. Unexpectedly, the nudge also improved persistence and engagement in other enrolled courses where no nudge was provided, demonstrating that its benefits transfer across courses. Although no significant differences in GPA were found between Test and Control groups, future development will enable further research.

The collection of learning analytics (LA) data within higher education enables the use of learner-produced data and analytics to model outcomes, which can be used to advise students in their learning. LA data is automatically tracked and recorded during students' use of the learning management system (LMS) as they engage with the content of their courses (Lawrence et al., 2021). Monitoring student engagement with the LMS allows stakeholders to use data to improve learning and teaching practices and to make informed decisions around educational goals (de Barba et al., 2016). Additionally, LA data allows institutions to quickly identify at-risk students and adopt institutional strategies to improve student outcomes (You, 2016). Such initiatives are in operation in many tertiary institutions and can effectively curtail attrition and improve student success (Faulconer et al., 2013; Jayaprakash et al., 2014; Tampke, 2013). However, many LA projects are focused on departments or faculties in small-scale pilots. An evaluation of LA publications found that many describe use cases and have only targeted online courses such as MOOCs (Leitner et al., 2017), which is understandable given the large volume of data collected in these courses. Institutional culture change is often a significant obstacle to the successful implementation of LA programs (Macfadyen & Dawson, 2012); buy-in to scale up within an institution requires the support of leadership to overcome barriers to change and reduce resistance.
An essential component of LA programs is a communication initiative to act when students are identified as requiring follow-up. Nudges, timely and intentional communications encouraging the recipient towards a particular behaviour or action, have driven behaviour change in multiple scenarios (Caraban et al., 2019; Thaler & Sunstein, 2008). A recent review indicated a limited understanding of the long-term effects of nudging, since most studies included in the analysis did not assess whether the benefits of a nudge are sustained after its removal (Caraban et al., 2019).
Tertiary education institutions have tried to leverage nudging to improve student outcomes with variable results. After a five-year effort to improve tertiary achievement, Oreopoulos and Petronijevic (2019) concluded that the impacts of nudging students, even paired with other interventions such as one-to-one coaching, were not large enough to result in improved academic benefits. Others have found that nudges are only effective when focusing on aspects with high consequences for inaction (Page et al., 2020). Additional research has seen positive results. For example, nudges encouraging students to engage with activities and resources have been found to influence student engagement and persistence within courses (Brown et al., 2023; Lawrence et al., 2021; Karaoglan Yilmaz & Yilmaz, 2022; Nelson et al., 2012). However, there can be difficulties in measuring the influence of LA programs, particularly when the program is part of a broader suite of initiatives in the institution to improve success (Sclater & Mullan, 2017). Such is the case in the authors' institution, where no measurable baseline existed before the intervention initiative started. Nonetheless, it is possible to assess the impact of automated nudge messaging on engagement by investigating post-intervention LMS data and final grades within courses to garner insights, which is the focus of this article.

Analytics for Course Engagement at UC
The University of Canterbury (UC) began its LA program in early 2020, primarily supporting first-year students as they transitioned into the tertiary environment. This program, dubbed 'Analytics for Course Engagement' (ACE), consists of two components: an alert system that identifies students at risk of disengaging, and a response process, a multi-level outreach workflow involving many areas of the university ecosystem. ACE proved vital as New Zealand went into lockdown during the COVID-19 pandemic; the system allowed UC to provide targeted outreach to students who may have been struggling. ACE monitors online student engagement with UC's LMS, capturing both log-on and interaction information, and students flagged as at risk of disengaging are proactively supported via an extensive organisation-wide response plan. This human-centred workflow involves student advisors, academic staff, halls of residence, and support services, who actively follow up with students who fail to re-engage with their courses online. ACE aims to increase retention rates across the institution, particularly within equity groups that have traditionally experienced lower success rates, and to connect students to appropriate support services if necessary.
Since 2020, both student- and teacher-facing LA dashboards have been developed. Each student has a personalised dashboard embedded within the LMS. The dashboard provides students with a score for their recent engagement to encourage self-directed engagement and learning. Prior research has shown that peers can influence the amount of effort tertiary students dedicate to their academic studies (Sacerdote, 2001). Therefore, as well as enabling self-direction in study behaviour, students can monitor their engagement relative to peers in the same course to provide context to their learning journey. This is particularly critical for students who are the first in their families to attend tertiary institutions. In addition, teachers can further their understanding of individual and collective student engagement and how students engage with the online material supplied to them via the LMS. The program has since expanded to target all undergraduate students across the institution in support of legislative requirements around pastoral care (NZQA, 2021). ACE flags students if they have not engaged with their course in Moodle, UC's LMS, within the last seven days. This process runs weekly during term time. First-year students can be flagged for non-engagement in one or more enrolled courses, whereas all other undergraduates are flagged only when they have entirely disengaged from all the courses they are enrolled in that semester. Flagged students are uploaded into UC's student management system (SMS), which triggers an automated nudge: a mobile text message and email. 1 Students are only flagged once per term to avoid the automated nudge message becoming a nag. Where students are flagged for multiple courses, only one message is sent to reduce the burden on students.
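The flagging rule described above can be sketched as follows. This is an illustrative sketch only; the function and field names are hypothetical, not UC's actual implementation.

```python
from datetime import date, timedelta

# Hypothetical sketch of the ACE flagging rule; names are illustrative.
DISENGAGEMENT_WINDOW = timedelta(days=7)

def flag_students(enrolments, last_activity, today, already_flagged):
    """Return student IDs to nudge this week.

    enrolments: student_id -> {"year": int, "courses": [course_ids]}
    last_activity: (student_id, course_id) -> date of last LMS action
    already_flagged: students nudged earlier this term (nudge once only)
    """
    flagged = set()
    for sid, info in enrolments.items():
        if sid in already_flagged:
            continue  # avoid the nudge becoming a nag
        inactive = [
            c for c in info["courses"]
            if today - last_activity.get((sid, c), date.min) > DISENGAGEMENT_WINDOW
        ]
        if info["year"] == 1 and inactive:
            flagged.add(sid)  # first-years: any one inactive course
        elif info["year"] > 1 and len(inactive) == len(info["courses"]):
            flagged.add(sid)  # other undergraduates: all courses inactive
    return flagged
```

The two branches capture the different thresholds: first-year students are flagged course-by-course, while other undergraduates must have disengaged from every enrolled course.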
Students are then monitored for activity or responses the following week to see whether they have either replied to the outreach messaging or reengaged with their respective courses online. Students who fail to respond or reengage within the LMS are moved to the next workflow stage (see Figure 1). Students move through each step of the workflow until they respond to the contact or re-engage with their courses as measured by online engagement.

Figure 1. ACE Workflow Following a Student Being Identified as Disengaged
UC has an undergraduate population of around 15,000, with approximately 4,000 first-year students. ACE touches around two-thirds of first-year students and about 15% of students at other undergraduate levels at some point each year. Around 50% of students re-engage with their courses via the LMS after the initial text message and email, reducing the need for resource-heavy human intervention. Teaching staff at UC are aware of ACE and the process used to follow up students for non-engagement, which ensures students are not repeatedly contacted. Staff are still able to provide contextual messaging about resources and assignments via direct email from the LMS should they wish to do so.
While around 50% of students who receive a message from ACE re-engage, we wanted to understand the extent to which the nudge intervention influenced students' re-engagement with their courses within the LMS. Some students will likely re-engage of their own accord without the nudge. It was anticipated that students who received the text message and email would re-engage with their courses significantly faster than those who did not receive the intervention.

Project Design and Method
We considered it unethical to withhold the ACE outreach from first-year students to assess the nudge's impact via test and control groups, as these students are targeted on a course-by-course basis. Therefore, undergraduate students in second-year courses were targeted due to their higher threshold for inclusion in the regular ACE outreach process. This project was deemed exempt by the university's Human Research Ethics Committee. 2 Ten second-year courses were selected for their reach across major undergraduate programs at the institution and their high enrolments. The researchers approached the course coordinator for each course to explain the process, and all agreed to be involved in the project. Course coordinators were responsible for informing their students that they may receive messaging regarding their engagement as part of a university-led initiative. This took the form of an email to the students or a post on the respective course page in the LMS. As this project focused on the initial 'nudge' impact, the students only received the initial text and email; the subsequent escalation workflow was not followed.
The initial number of students within the scope of the 10 courses was approximately 1,800; through the semester's duration, any student who formally withdrew from a course was automatically removed from the study. Approximately 30 students withdrew from one or more courses in scope after week 2. To ensure students who would usually be referred through the ACE system were not disadvantaged, and to comply with our pastoral code, those who met the criteria for the usual ACE process were also excluded from the trial. Throughout the trial, 82 students were excluded for meeting the full ACE criteria. These students had disengaged from all enrolled courses and were followed up as usual through the ACE system workflow. A total of four students without a valid New Zealand mobile number were also excluded from the project, as this was required for mobile messaging.
Each week, a report was extracted from the LMS providing activity details for students enrolled in the scoped courses. Students who had not accessed an enrolled course via the LMS within the previous seven days were separated into Test and Control groups via a random 50% split with no stratification. The Test group were sent a nudge, a text message and an email, via an automated workflow set up in the University's SMS. 3 The nudging process was repeated for the courses in scope each week of the second semester of 2022. However, it did not include the first week of each term (as students would not be expected to have engaged with material in the week prior) or the weeks of the mid-semester break. Test and Control groups were then studied to assess differences in online LMS activity. This included access to resources and activities, the timing and frequency of system logons, submission of assignments and forum postings, and time spent engaging with the system. Any activity within the course was considered a proxy measure of engagement. Performance in the course was also measured through students' final Grade Point Average (GPA).
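The weekly unstratified 50% split could be implemented along these lines. This is a sketch under stated assumptions; `assign_groups` and its inputs are hypothetical, not the authors' code.

```python
import random

# Illustrative sketch of the weekly 50% Test/Control split (no
# stratification); the function name and signature are hypothetical.
def assign_groups(inactive_students, seed=None):
    """Randomly split the week's inactive students into Test and Control halves."""
    rng = random.Random(seed)  # seedable for reproducibility
    students = list(inactive_students)
    rng.shuffle(students)
    half = len(students) // 2
    return students[:half], students[half:]  # (test, control)
```

Only the Test half would then be pushed to the SMS workflow that sends the text message and email.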
We expected those who received the nudge to re-engage with the LMS more quickly and at a higher rate than those who did not. It was expected that all students identified by the ACE process (regardless of placement in the Test or Control group) would receive a lower GPA than those not identified as at risk during the project. However, we expected those who received the nudge to achieve a higher GPA than the Control group.

Engagement
In this project, 319 students received interventions over the semester, and only two received the intervention message on more than one occasion. The level of LMS activity for each student in the Test and Control groups was studied for the courses in scope on each day before and after the intervention point. The number of days post-intervention (Intervention Days) is shown on the x-axis in Figure 2, where Day 0 is when the nudge occurred. The activity level (Engagement) is the total number of actions students performed within the LMS for the flagged course. In Figure 2, the Engagement measure was aggregated by intervention day, and the mean is shown for each group.
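The Figure 2 aggregation, mean actions per group by intervention day, might look like this in outline (the function and data shapes are illustrative assumptions, not the authors' pipeline):

```python
from collections import defaultdict

# Sketch of the Figure 2 aggregation: mean LMS actions per (group, day),
# where day 0 is the nudge date. Illustrative only.
def mean_engagement_by_day(events):
    """events: iterable of (group, intervention_day, n_actions) tuples."""
    totals = defaultdict(lambda: [0, 0])  # (group, day) -> [sum, count]
    for group, day, actions in events:
        t = totals[(group, day)]
        t[0] += actions
        t[1] += 1
    return {key: s / n for key, (s, n) in totals.items()}
```

Plotting the resulting means for the Test and Control keys against intervention day reproduces the shape of Figure 2.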

Figure 2. Change in LMS Activity Between the Test and Control Group Students for Selected Courses
As shown in Figure 2, the level of engagement for both the Test and Control groups is zero at intervention Day 0. This is expected, as this identifies the student as being within the scope of the analysis. It is clear from this figure that the Test group (those who received the nudge intervention) showed higher levels of engagement on average than the Control group over the following seven days. Test group engagement peaked around 39% higher than the Control group on Day 5; the differences then declined and became volatile from Day 9 onwards (not shown).
To assess the time students in these groups spent on the courses, a Time on Task (ToT) measure of engagement was also developed. This estimated the time (in minutes) a student spent within the course page on the LMS following the nudge intervention. Figure 3 shows the ToT results in the same format as Figure 2: the average time in minutes by intervention day. Figure 3 shows that the mean time spent on the courses by students in the Test group (i.e., those who received the nudge) was higher than for the Control group students, who did not. This is significant because it indicates that students were not simply clicking through the material to manage the intervention process but were investing more time engaging with online course materials. In this study, the Test group showed 35% more time spent on course material after seven days compared to the Control group. This margin narrowed to below 10% after 14 days (not shown), but nonetheless indicates a sustained increase in time spent engaging with LMS materials by the Test group.

Figure 4 shows the distribution of groups for each Grade. 4 The Control group were identified as disengaged but did not receive any intervention, and the Test group were identified as disengaged and received a nudge. Students not identified as disengaged are grouped in the plot as None. As can be seen, students with higher grades include a lower proportion of students identified as disengaged than students who received lower grades. This provides evidence that ACE helps identify students who are struggling in their courses compared with their peers. Of the students identified as disengaged in this project, 21.5% received a fail grade (D or E), and fewer than 6% achieved an A or higher grade. Furthermore, students identified as disengaged earned an average GPA 1.67 points lower than those who were not.
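The article does not specify how the ToT measure was estimated from LMS logs. One common heuristic, sketched here under that assumption, sums the gaps between consecutive clickstream events and caps long idle gaps at a threshold:

```python
# Hypothetical ToT estimator: not the authors' method, but a standard
# session-gap heuristic. Gaps longer than idle_cap_min minutes are
# treated as idle time and capped.
def time_on_task_minutes(timestamps, idle_cap_min=30):
    """Estimate minutes on task from a student's LMS event timestamps."""
    ts = sorted(timestamps)
    total = 0.0
    for prev, curr in zip(ts, ts[1:]):
        gap = (curr - prev).total_seconds() / 60
        total += min(gap, idle_cap_min)  # cap long idle periods
    return total
```

The choice of cap matters: a low cap undercounts slow readers, while a high cap counts walk-away time as engagement, so any such estimate is approximate.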
The hypothesis that identifying cohorts of students for intervention has no impact on the mean GPA of those cohorts is rejected with very high confidence: under this hypothesis, we would expect to see a difference larger than 0.53 points less than 0.1% of the time, using a one-tailed test with a normal approximation for the difference in cohort means.
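The one-tailed test with a normal approximation for the difference in cohort means can be sketched as follows; the function name and signature are illustrative, and the article's summary statistics are not reproduced here.

```python
import math

# Sketch of a one-tailed z-test for a difference in cohort means under
# H0: equal means. Inputs are per-cohort summary statistics.
def one_tailed_p(mean_a, var_a, n_a, mean_b, var_b, n_b):
    """P(observing a difference this large or larger) under the null."""
    se = math.sqrt(var_a / n_a + var_b / n_b)  # SE of the mean difference
    z = (mean_a - mean_b) / se
    # upper-tail normal probability 1 - Phi(z), via the complementary
    # error function
    return 0.5 * math.erfc(z / math.sqrt(2))
```

A p-value below 0.001, as reported above for the 0.53-point difference, corresponds to a z statistic beyond roughly 3.09 on this upper tail.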

Figure 4. Distribution of Students Across Interventions by Final GPA
There was, however, no statistically significant difference in GPA between the Test and Control groups. Assuming the intervention had no impact on final GPA, the absolute difference in GPA scores between the Test and Control groups (0.35 points) occurs at this level (or higher) with a probability of approximately 0.285. This study had a relatively small sample size, making broader implications of the interventions challenging to establish.

For example, while the impact of the nudge within the 10 courses in the scope of the analysis is clear, the students were also enrolled in courses outside the scope of the study. Given the large number of possible additional enrolled courses and significant variability in course structures in the LMS, assessing any broader impacts of the nudge proved challenging, as the many different combinations of courses were not comparable. As a result, the data required normalising so that the Test and Control groups were comparable at Day 0. Figure 5 shows engagement across all enrolled courses (excluding the 10 selected) and includes an additional data series in which the Test group is adjusted so that its Engagement activity at Intervention Day 0 matches the Control group's; this shows relative engagement only. Interestingly, the normalised Test group engagement remains higher than the Control group. This analysis suggests that nudges have a broader impact on students' engagement across all their courses, not just those from which they were identified as disengaged. Randomly reassigning students between the Test and Control groups for analysis removes this pattern, supporting the finding that receiving a nudge message has broader impacts on other courses.
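The Day 0 adjustment used for Figure 5 might be sketched as a simple rescaling of the Test group's engagement series to match the Control group at Day 0; this is an illustrative assumption, not the authors' exact procedure.

```python
# Hypothetical sketch of the Figure 5 normalisation: rescale the Test
# group's mean-engagement series so its Day 0 value equals the Control
# group's, leaving only relative engagement to compare.
def normalise_to_day0(test_series, control_series):
    """Each series maps intervention day -> mean engagement; Day 0 nonzero."""
    scale = control_series[0] / test_series[0]
    return {day: value * scale for day, value in test_series.items()}
```

After this rescaling, any remaining gap between the curves on later days reflects relative, not absolute, differences in engagement.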

Discussion
This study aimed to assess the impacts of a targeted nudge on second-year tertiary students. Results found that ACE is successful in identifying students who are disengaging from their studies. Receiving a text message nudge upon identification also changed students' behaviour, with the Test group's engagement peaking 39% higher than the Control group's five days after the nudge was sent. More importantly, students who received the nudge spent significantly more time engaging with online course materials than those who did not. This emphasises that appropriately providing a nudge to students creates behaviour change and drives re-engagement with critical online resources. This supports the findings of Lawrence et al. (2021), who found that nudging improves student engagement. Interestingly, the sustained impacts spanning two weeks are an important finding highlighting the power of the nudge, particularly as prior research has noted the lack of assessment of the sustained benefits of nudging (Caraban et al., 2019).
The analysis of students' GPAs found that those students identified as disengaged had lower final course results than those not identified as disengaged. In this study, disengagement, on average, is an overrepresented behaviour of lower achievers, a finding that mirrors prior research (e.g., Crampton et al., 2012; Hampton & Pearce, 2016). Although no significant differences were found between those who received the nudge and those who did not, this may be due to the small sample size.
Additionally, more information could be provided to students on their dashboards about what specific resources and materials would benefit their studies. This is a piece of ongoing work detailed in the following section. However, this study's most crucial finding highlighted the nudge's power in driving increased engagement across all enrolled courses. This transferable benefit shows that nudging a student even for one course has an impact more broadly across students' studies. Logically, without engagement, there cannot be success. Therefore, tools and strategies that target student engagement are a reasonable first step to enhancing student success in tertiary environments.
This study had limitations. Compared to first-year students, second-year students have already successfully navigated their first year and are arguably a more experienced and engaged cohort. Past research has shown the criticality of passing the first year of tertiary study for overall tertiary success (Nelson et al., 2012). The results presented in this paper might not trivially extend to students in their first year. Additionally, other measures could be used to capture engagement. For example, students may be fully engaged in class but less engaged online in the LMS. In some courses, such as physical skills courses like sculpture and performance music, online engagement is less important than physical attendance. As a result, a system such as ACE has less power to capture disengagement, and there is a need for more reliable and consistent data in this area. Furthermore, the large variability in the quality and presentation of learning resources within different courses in UC's LMS means that not all courses are created equal. Teaching staff are also able to message students via a course-specific dashboard should they wish. Although staff at UC are aware that ACE centrally manages and intervenes with students for non-engagement, there is a possibility that some well-meaning and passionate staff could be influencing the results if they are also following up with students. Future analysis will determine the extent to which staff members are messaging students directly for non-engagement.
While this study did not show any significant difference in average grades across the courses, planned development of the student-facing engagement dashboards could change this. Students' engagement dashboards encourage self-directed engagement and learning; improving them may improve the GPA of nudged students. Planned developments will show users the top five activities and resources accessed over the past seven days, guiding them to the work their classmates are engaging with. Previous literature has shown the link between engagement and grades (Campbell et al., 2006; Campbell & Oblinger, 2007; Macfadyen & Dawson, 2010). Therefore, providing students with specific information on what they should engage with, and when, could enhance grade outcomes. Further research will assess whether this is the case. Improvements to UC's SMS will also allow us to distinguish the types of information students seek when they disengage. It may be that different groups or cohorts are looking for specific support; an investigation will determine whether further personalisation of messaging based on such needs will improve engagement. The present study assumes a one-size-fits-all approach. More analysis is needed to assess differences in messaging for different cohorts based on data already held in internal systems. For example, if students have not engaged and have received a low grade on an early assignment, more targeted interventions could improve outcomes. Additional work will address the potential for this option.
ACE is a multi-unit, pan-institution program covering the breadth of the university's population. Although not discussed in detail in this paper, the nudge messaging students receive is supported by a resource-heavy human-centred workflow involving student advisors, academic staff, and support services who follow up with students who do not reengage with their courses online following the automatic nudge. Around half of the students picked up by ACE are supported through this extended network of staff. However, an automated nudge improves engagement behaviours for those students who take heed and reengage online. The benefit of an automated system, even without the human resource associated with an institution-wide network, may produce engagement gains when appropriately targeted and delivered. This could provide a solution to institutions that may not have the resources for a fully developed program such as ACE (although it must be remembered that nudging may have consequences and impacts on support services that need to be considered). Harnessing such automated technology frees time for educators to focus on the central mission of teaching and learning. The power of nudging, supported by a degree of automation, allows tertiary education providers to harness technology to improve engagement and ultimately impact student success.