Personalised Emails in First-Year Mathematics: Exploring a Scalable Strategy for Improving Student Experiences and Outcomes

Widening participation initiatives in higher education have grown overall student numbers while also increasing the diversity of student cohorts. Consequently, enhancing student experiences and outcomes has become increasingly challenging. This study implemented personalised emails in two first-year mathematics courses as a scalable strategy for supporting students with diverse needs. Impact on student experience and outcomes was assessed through surveying and statistical comparisons to previous cohorts. It was found that students perceived the personalised emails favourably and believed the intervention would contribute to them achieving better grades. This translated to a statistically significant improvement in both student experience and academic performance in one of the courses. The results imply that personalised emails are well-suited to courses taken in students' first semesters of university study, aiding those transitioning to the higher education environment by fostering feelings of belonging, supporting effective engagement, and easing navigation of university systems and processes.


Introduction
There has been a strong drive internationally toward improving access to and participation in higher education, given the social and economic benefits of more educated populations (Devlin et al., 2013). This article focuses specifically on the Australian context, where these agendas have been progressed via significant public policy changes over the last decade, including the lifting of the cap on student enrolments (Norton, 2014) and raised widening participation targets (Gale & Parker, 2014). Collectively, these developments have grown overall student numbers while also evolving the backgrounds of those pursuing higher education. Where in the past tertiary students were largely high-achieving school-leavers from affluent families, today's students are increasingly from lower socio-economic families, entering university later in life, gaining admission through alternative entry schemes, or becoming the first in their families to pursue a degree (Edwards & McMillan, 2015; Gale & Parker, 2014).
Research has shown that students from non-traditional backgrounds tend to have poorer outcomes across progression, retention and completion measures (Edwards & McMillan, 2015; Kahu & Nelson, 2018). This is attributed to non-traditional students experiencing greater challenges integrating into the social and academic spheres of university life (Tinto, 2006). These issues are most visible in the transition to university, where students need to form new personal connections, engage with new
technologies and ways of working, as well as assume greater responsibility for their learning (Kahu & Nelson, 2018; Wingate, 2007). Supporting students through this transition on an individual basis is resource-intensive and thus not feasible in a climate of reduced funding, while cohort diversity means a "one-size-fits-all" approach is inappropriate. Instead, technology-based solutions need to be considered, as these can be administered efficiently at scale and, when coupled with data, can be personalised to accommodate students with varying needs.
One technology-based approach to student support is personalised emails, where analytical insights from online engagement and course performance are used to develop messages customised to individuals (Lim et al., 2019). This gently nudges students toward actions tailored to their circumstances, easing navigation of the university environment (Lawrence et al., 2019), increasing their responsibility in the learning process (Fritz, 2017), and supporting learning self-regulation (Lim et al., 2019). However, limited research has investigated the approach's potential as a semester-long strategy for supporting students in course-based settings. Consequently, the present study aims to evaluate the potential of personalised emails through case studies performed in two first-year mathematics courses at a large Australian university. Impacts on course experience and academic performance are explored.

Higher Education Context
Widening participation has substantially increased the heterogeneity of student cohorts, both in terms of socio-demographic characteristics and educational backgrounds (Devlin et al., 2013). Of particular interest for the science, technology, engineering and mathematics (STEM) disciplines has been the decline in the mathematical skill levels of commencing students over the last decade, especially in fundamentals like algebra and calculus (King & Cattlin, 2015). This has been driven by the increase in mature-age students, who typically have limited recent exposure to such topics (Mulryan-Kyne, 2010), and by a decrease in the number of students studying intermediate and advanced mathematics in high schools (Barrington & Evans, 2016). These factors, combined with many Australian universities removing hard prerequisites for entry to STEM programs (in favour of assumed knowledge standards that are not verified at enrolment), have meant a growing proportion of students are attempting STEM degrees without the mathematical grounding demanded in years past (King & Cattlin, 2015).
The shift in student cohorts has also occurred alongside an increase in the complexity of higher education environments. This complexity is being propelled by funding pressures increasing staff-student ratios, technological developments disrupting how teaching is delivered and learning is consumed, as well as expanded governance impacting university systems and processes (Lawrence et al., 2019). Navigating this can be extremely challenging, especially for students working from a position of reduced preparation, competing priorities (like employment and caring), limited experience in contemporary learning environments, and less familiarity with university operations (Kahu & Nelson, 2018). Unfortunately, students from non-traditional backgrounds tend to be over-represented in these attributes, contributing to the significant discrepancy in outcomes (Edwards & McMillan, 2015; Gale & Parker, 2014).
Student belonging and engagement are widely acknowledged as underpinning student progression and retention (Masika & Jones, 2016). This is because students who feel valued, included, and cared for within their institution are more likely to persist in their studies (Masika & Jones, 2016), while engaged students (broadly defined as those spending time, effort and resources on learning activities) are more likely to be academically successful (Kahu & Nelson, 2018). Consequently, developing students' sense of belonging and facilitating effective engagement are considered mechanisms for enhancing students' higher education experiences and outcomes (Masika & Jones, 2016).

Students' sense of belonging is heavily influenced by their interactions and relationships with peers and instructors (O'Keeffe, 2013). However, those studying part-time, externally, and alongside other responsibilities can be disadvantaged due to fewer opportunities to develop these personal connections (Kahu & Nelson, 2018). Large cohort sizes, typical of first-year courses, can also be a barrier as informal interaction is greatly hindered in these contexts (Mulryan-Kyne, 2010). Consequently, students can develop anonymity and feel less empowered to seek timely help from their peers and instructors (Mulryan-Kyne, 2010).

Students entering higher education are also required to evolve their learning strategies to account for the heightened focus on self-directed learning and constructing higher-order knowledge (Wingate, 2007). As many students initially perceive learning as a passive process, shifting students' mindsets toward active engagement that requires them to think more critically about the subject matter is vital (Wingate, 2007). However, this tends to be a challenging adjustment, so scaffolding is needed to minimise the chance of students becoming overwhelmed and disengaged (Lawrence et al., 2019).

Advantages of Supporting Students through Communication and Feedback
To support students in the transition to university, many institutions have developed extra-curricular programs that target "at-risk" groups (Wingate, 2007). However, these are often based on a deficit approach, where referred students are regarded as lacking abilities, or limited by their cultural or economic backgrounds (McKay & Devlin, 2016). This can marginalise and alienate students within the targeted groups, impeding their participation (McKay & Devlin, 2016). Moreover, the extra-curricular nature means these programs only cater to a small proportion of students, and typically deliver generic content that students can perceive as irrelevant to their course-based learning (Wingate, 2007).
Communication and feedback embedded into routine learning and teaching practices can overcome the deficit, relevance and reach issues of extra-curricular models in supporting student transitions. Messaging can be developed using strengths-based language that acknowledges the unique contributions of individual students, and by delivering this within course-based environments the support is timely and highly contextualised (Wingate, 2007). Additionally, using this approach means all students can benefit, not just those who are deemed "at-risk" based solely on their background characteristics (Kahu & Nelson, 2018). Wingate (2007) argues that most university students (not just those from non-traditional backgrounds) need additional help in "learning how to learn" during their transition to university. Both Wingate (2007) and Lawrence et al. (2019) further contend that students benefit greatly from explicit communication and feedback from their instructors that work to manage expectations, unpack the complexity of university systems and cultures, guide students to appropriate supports, and facilitate the development of independent study skills. Interactions between students and instructors are subsequently central to driving engagement as well as retaining students in the learning process (Trowler, 2010). There is also evidence that high levels of student-instructor interaction combined with high levels of student-content interaction lead to satisfying course experiences and deep learning (Nieuwoudt, 2018).
Students use feedback to close the gap between their current and desired states of understanding and achievement. However, according to the model by Hattie and Timperley (2007), for feedback to be most effective, it needs to answer questions about where students are heading, how they are progressing, and what they should be doing next. Providing this promptly to a diverse group is challenging, especially given the declining time instructors have to dedicate to this on a per-student basis. Therefore, automated methods are needed to facilitate this feedback cycle.

Previous Studies on Nudging Student Behaviour
Nudging involves gently prompting individuals to behave in a certain way, with data sets often used to personalise these recommendations (Fritz, 2017). There are a small number of previous studies where nudges delivered via email have been coupled with learning analytics to support tertiary students. For example, researchers at Purdue University developed "Course Signals" which drew grade, learning management system (LMS) engagement, and background characteristic data into statistical models to predict students' course outcomes (Arnold & Pistilli, 2012). Based on this, a traffic light rating was displayed to students on the LMS, and instructors could develop personalised emails to guide students to supports. More recently, Lawrence et al. (2019) targeted disengaged students studying online with personalised nudges. This was shown to increase perceptions of support afforded by the instructors, positively influencing engagement. Pardo et al. (2019) delivered personalised emails to undergraduate engineering students during the early weeks of the semester based on levels of interaction with online resources. This was found to significantly increase students' course satisfaction while a marginal improvement in academic performance was recorded. Finally, Lim et al. (2019) used "OnTask" software that integrated with institutional datasets to create personalised emails. It was demonstrated in a first-year course that the intervention led to students achieving better final grades irrespective of their incoming skill level.
Although learning analytics-based systems for delivering feedback to students are becoming increasingly prevalent, there has been relatively little analysis of their impact on student outcomes (Lim et al., 2019). Moreover, there has been little investigation into how students perceive this intervention in practice, especially when implemented as a semester-long support strategy. Consequently, this study aims to address these gaps by introducing personalised emails into two first-year mathematics courses, forming case studies to explore student experiences of the intervention and impact on academic performance.

Participants and Setting
Two undergraduate mathematics courses run on-campus at a large Australian metropolitan university over a 13-week teaching semester were considered. These courses were Quantitative Methods in Science and Introductory Calculus and Algebra.
Quantitative Methods in Science was a compulsory course for science students covering algebra, calculus, probability and statistics. It required students to solve mathematical problems by hand, as well as apply learned concepts to large datasets using R software. It was designed for students to take in their first semester of study; however, those without a strong background in high school mathematics were advised to withdraw and complete a foundational course before re-enrolling the following semester. A diagnostic test, which asked students to solve a series of problems based on assumed knowledge, was used to assist students in making this decision. The course was perceived as challenging, reflected in a historical failure rate averaging 28%, while the nature of the cohort as science majors rather than pure mathematics majors also meant student resistance and poor confidence in mathematics posed significant barriers to engagement. The course was delivered through weekly lectures, computer labs, and workshops, while assessment consisted of problem-solving tasks (40% weighting), regular online quizzes (30% weighting), and a collaborative scientific article (30% weighting).
Introductory Calculus and Algebra covered differentiation, integration, basic differential equations, vectors, matrices and complex numbers. It was compulsory for science students in physics and chemistry majors (and optional for other science majors) and designed to be taken directly after Quantitative Methods in Science. The course also formed part of the mathematics minor that any undergraduate student could elect into, with business and IT the most common disciplines for this. In 2018, the course was also compulsory for secondary education students specialising in advanced mathematics. To attempt the course, students were assumed to have achieved a pass or better in senior high school mathematics. However, it was known that some students did not have this background or had not used these skills for many years. This combined with the wide range of discipline areas made cohort diversity a challenging aspect of the course. Introductory Calculus and Algebra also had a high failure rate averaging 29% historically. It was delivered via weekly lectures and workshops. The assessment consisted of problem-solving tasks (50% weighting) and a final exam (50% weighting).
For Quantitative Methods in Science, cohorts in Semester 2, 2018 and Semester 2, 2019 were exposed to personalised emails, while Semester 1, 2018 and Semester 1, 2019 cohorts served as controls. In Introductory Calculus and Algebra, cohorts from Semester 2, 2018 to Semester 2, 2019 were exposed to personalised emails, with Semester 1, 2018 serving as the control. Each course's instructor was consistent across the study period except for Semester 1, 2019 of Quantitative Methods in Science, where a substitute instructor stepped in. Turnover occurred between semesters for casual academics employed to run workshops and computer labs. Course structures were maintained throughout the study period. Overall, 346 students for Quantitative Methods in Science and 574 students for Introductory Calculus and Algebra were exposed to the personalised emails, with key cohort statistics summarised in Table 1.

Design and Delivery of Personalised Emails
The personalised emails aimed to improve student experiences and outcomes by fostering student belonging within the course, increasing the effectiveness of course engagements, and dispelling the complexity of university systems and processes. To support the belonging objective, students were addressed by name and a conversational tone was used to encourage the student to feel like the instructor was personally engaging with them. To promote effective engagement, actions were recommended based on online behaviour and performance data, the use of actions linked to success was reinforced, and referrals to support mechanisms were tailored. Finally, to aid navigation of the university environment, timely and explicit communication was used to support students in understanding university processes and their implications (such as withdrawing from courses, applying for extensions, and interacting with the LMS).
Personalised emails were generated using the mail merge functionality of Microsoft Office (specifically Excel, Word and Outlook). As illustrated in Figure 1, this involved:
1. Generating an initial spreadsheet in Microsoft Excel that included the identification number, name and email address of every student in the course. Raw data from systems such as the LMS and external quiz system were appended using lookup functions.
2. Developing if-then rules that linked the raw data to appropriate messaging.
3. Implementing the if-then rules as additional columns in the spreadsheet.
4. Developing a mail merge template in Microsoft Word (through the Mailings tab) that connected to the spreadsheet, allowing the personalised messaging to be interspersed throughout the standardised text.
5. Sending the personalised emails via Word's automatic connection to Outlook.
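The if-then rules at the heart of this process can be sketched in code. The study implemented these as Excel formulas feeding a Word mail merge; the Python sketch below is an illustrative equivalent only, and the field names, thresholds, and wording are all hypothetical, not those used in the courses.

```python
# Hypothetical sketch of the if-then rules that map raw engagement data
# to personalised message fragments (the study used Excel columns and a
# Word mail merge template; thresholds and messages here are invented).

def personalise(student):
    """Apply if-then rules to one student's record, returning the
    merge fields that would populate the email template."""
    fields = {"first_name": student["first_name"]}

    # Rule: diagnostic score determines advice about preparation.
    if student["diagnostic_pct"] < 50:
        fields["diagnostic_msg"] = (
            "Your diagnostic result suggests revising the bridging "
            "resources before the next topic.")
    else:
        fields["diagnostic_msg"] = "Great diagnostic result - well done!"

    # Rule: prolonged LMS inactivity triggers a gentle nudge.
    if student["days_since_lms_login"] > 7:
        fields["engagement_msg"] = (
            "We noticed you haven't logged in recently - the weekly "
            "quiz closes Friday.")
    else:
        fields["engagement_msg"] = "Keep up the regular engagement!"

    return fields


students = [
    {"first_name": "Ana", "diagnostic_pct": 42, "days_since_lms_login": 10},
    {"first_name": "Ben", "diagnostic_pct": 85, "days_since_lms_login": 2},
]

for s in students:
    merged = personalise(s)
    print(f"Dear {merged['first_name']}, {merged['diagnostic_msg']} "
          f"{merged['engagement_msg']}")
```

Each generated field corresponds to one spreadsheet column in the authors' workflow, so the standardised template text stays fixed while only these fragments vary per student.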

Process for Generating Personalised Emails
In Quantitative Methods in Science, personalised emails were sent intermittently throughout the semester, largely concerning the diagnostic test and summative assessment. For example, based on their score in each diagnostic quiz topic, students were directed to bridging resources or congratulated on a good result, and advice was given for whether the student should consider withdrawing to complete a foundational mathematics course. Similarly, summative assessment grades were communicated alongside messaging complimenting excellent results or reminding of supports. Other trigger points included assessment due dates, key dates in the university calendar, and students going extended periods without interacting with the LMS.
In Introductory Calculus and Algebra, personalised emails were sent weekly. These replaced traditional announcements, and thus largely focused on communicating administrative aspects of the course but with personalisation to improve relevance. In addition to the triggers used in Quantitative Methods in Science, the personalised text was used to introduce the specific tutor and their contact details, share formative quiz results with a comment such as "well done" or prompt to related supports, and to remind those yet to submit specific assignments of the process alongside instructions for applying for an extension. Similarly, assessment grades were communicated with instructions on how to access the associated feedback. Students who did not view this feedback were prompted to do so before submitting the next assignment while those who had viewed it were complimented to reinforce the behaviour.

Data Collection and Analysis
Data was collected with approval from Queensland University of Technology's Human Research Ethics Committee (1800000662). Multiple data sources were considered to more holistically evaluate the impact and triangulate the findings. Statistical tests were performed in R.
Firstly, the University's end-of-semester student satisfaction survey was used to compare course experiences with and without personalised emails (see Table 1 above for responses by cohort). Specifically, three questions asked on a 5-point Likert scale were considered. To assess whether responses varied significantly between the control and personalised email cohorts, statistical testing was applied. Visual inspection showed scores were not normally distributed but followed a similar shape, so the Mann-Whitney U test was employed. This is a non-parametric test that assumes independent samples (Nachar, 2008).
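To make the test concrete, the sketch below computes the Mann-Whitney U statistic in pure Python, averaging ranks over ties as is standard for ordinal Likert data. This is an illustration only: the study's tests were run in R, and the sample responses below are invented.

```python
# Minimal sketch of the Mann-Whitney U statistic used to compare Likert
# responses between cohorts (illustrative; the authors used R).

def mann_whitney_u(x, y):
    """Return the Mann-Whitney U statistic (the smaller of U1 and U2),
    averaging ranks over ties as is standard for ordinal data."""
    combined = sorted([(v, "x") for v in x] + [(v, "y") for v in y])
    n, n1, n2 = len(combined), len(x), len(y)

    rank_sum_x = 0.0
    i = 0
    while i < n:
        # Find the run of tied values starting at position i.
        j = i
        while j < n and combined[j][0] == combined[i][0]:
            j += 1
        avg_rank = (i + 1 + j) / 2  # mean of ranks i+1 .. j
        rank_sum_x += avg_rank * sum(
            1 for k in range(i, j) if combined[k][1] == "x")
        i = j

    u1 = rank_sum_x - n1 * (n1 + 1) / 2
    return min(u1, n1 * n2 - u1)


# Hypothetical 5-point Likert responses from two cohorts.
control = [3, 4, 2, 5, 3, 4]
treated = [4, 5, 5, 3, 5, 4]
print("U =", mann_whitney_u(control, treated))  # prints: U = 9.5
```

In practice one would also want the p-value; both R's `wilcox.test` and SciPy's `scipy.stats.mannwhitneyu` compute the same statistic together with an exact or normal-approximation p-value.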
Secondly, students' final percentages achieved in the course were used to assess the impact on academic performance. As with any intervention implemented in a real course environment, variability in student cohorts presents a challenge in measuring this directly (Lim et al., 2019). Thus, to account for differences, incoming student ability was considered. Given a large proportion of students did not enter via a tertiary entrance score after school (Table 1), this was not deemed an appropriate measure. Instead, in Quantitative Methods in Science, diagnostic test performance was used as a proxy for incoming student ability. In Introductory Calculus and Algebra, grade point average (GPA) before attempting the course was used as the proxy (noting prior maths course results could not be used due to diverse student pathways to the course). Students who had not recorded attempts on these measures were excluded from this analysis. Linear regression predicting final grade percentage based on exposure to the personalised emails intervention and incoming student ability was subsequently applied. Before performing this analysis, the data was checked to ensure the assumptions of linear regression were met.
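The regression model can be written as final_pct = b0 + b1*ability + b2*exposed + b3*(ability*exposed), where the interaction term b3 tests whether the intervention's benefit varies with incoming ability. The sketch below fits this specification by ordinary least squares on invented data; it is purely illustrative (the study fitted its models in R) and the six hypothetical students are constructed, not drawn from the paper.

```python
# Illustrative OLS fit of final_pct = b0 + b1*ability + b2*exposed
# + b3*(ability*exposed), solving the normal equations in pure Python.
# Data is hypothetical; the study's analysis was performed in R.

def ols(X, y):
    """Solve (X'X) b = X'y by Gaussian elimination with partial pivoting."""
    p = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(p)] for i in range(p)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(p)]
    for col in range(p):                      # forward elimination
        piv = max(range(col, p), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, p):
            f = A[r][col] / A[col][col]
            for c in range(col, p):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coef = [0.0] * p                          # back substitution
    for r in range(p - 1, -1, -1):
        s = sum(A[r][c] * coef[c] for c in range(r + 1, p))
        coef[r] = (b[r] - s) / A[r][r]
    return coef


# Hypothetical students: (incoming ability %, exposed flag, final %).
data = [(40, 0, 45), (60, 0, 55), (80, 0, 65),
        (40, 1, 58), (60, 1, 66), (80, 1, 74)]
X = [[1.0, a, e, a * e] for a, e, _ in data]
y = [float(g) for *_, g in data]
b0, b1, b2, b3 = ols(X, y)
# b3 < 0 here: weaker-prepared students gain more from exposure,
# mirroring the direction of the interaction the study reports.
print([round(c, 3) for c in (b0, b1, b2, b3)])
```

In R the equivalent one-liner would be `lm(final ~ ability * exposed)`, which reports the same coefficients along with their significance levels.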
Finally, an end-of-semester survey about the personalised emails was administered to three selected cohorts who experienced the intervention (see Table 1). This aimed to provide context for the quantitative results through gauging student perceptions of the intervention. The survey was anonymous, estimated to take about five minutes to complete, and asked seven Likert-item, two open-ended, and four demographic questions. In total, 41 students responded to the survey (see Table 1 for breakdown by cohort). The demographic backgrounds of survey respondents aligned with those present at the course level, except for female Introductory Calculus and Algebra students, who were over-represented. Responses to "how did the personalised emails impact your experience in this unit?" were thematically coded according to whether they related to student learning experiences or academic performance (the areas examined quantitatively). Using the qualitative analysis framework described in Braun and Clarke (2006), one researcher applied a data-driven inductive approach to identify patterns in the codes. This "bottom-up" method of analysis enabled an unrestrained examination of student perceptions of impact, and supported interpretation of the quantitative findings.

Impact on Students' Course Experiences
The impact on students' course experiences was assessed using a mixed methods approach. This involved statistical comparisons to control cohorts for student course satisfaction as well as results from the personalised email survey. Figure 2 compares course satisfaction survey results for the control and personalised email cohorts. This shows students who experienced the intervention perceived better learning opportunities, which they were more likely to exploit. Students in the personalised email cohorts also felt more positively about their overall course experiences. Applying the Mann-Whitney U test, the difference between the control and personalised email cohorts was highly statistically significant across all three statements for Quantitative Methods in Science (p<0.010, U = 3628.5, 3592.5, 3126.5 respectively). However, in Introductory Calculus and Algebra, the difference was insignificant for taking advantage of learning opportunities (p=0.269, U = 1206.5), and borderline for the provision of learning opportunities and overall satisfaction (p=0.064, U = 1101, and p=0.087, U = 1124 respectively).
It must be highlighted that a weakness of these comparisons is the variability in underlying student cohorts and other circumstantial factors that could not be directly controlled because of the live course environment. In particular, for Quantitative Methods in Science, the reduced cohort size in personalised email semesters may have played a role in the increased satisfaction, given the literature shows smaller courses typically face fewer challenges providing high-quality student experiences (Mulryan-Kyne, 2010). Nonetheless, the indicated improvements are consistent with other similar studies such as Lawrence et al. (2019) who showed personalised emails positively influenced perceptions of support and engagement, while Pardo et al. (2019) also measured a statistically significant increase in course satisfaction. Additionally, these findings align with student sentiments expressed in the personalised email survey such as, "The personalised emails definitely positively impacted my learning experience!! Loved them!" (Introductory Calculus and Algebra Student). Student perceptions can be further explored to understand how belonging and engagement strategies were impacted.

Average Likert Item Responses (1=strongly disagree and 5=strongly agree) for Quantitative Methods in Science (QMS) and Introductory Calculus and Algebra (ICA)
Firstly, 90% of students agreed (78% strongly) that the personalised emails heightened their course connection. This was also a theme that emerged through the thematic analysis, reflected in comments such as, "[personalised emails] made me feel cared about and a part of the family" (Quantitative Methods in Science Student), and, "personalised emails made me feel more at home within the classroom, assuring me that I was not alone" (Introductory Calculus and Algebra Student). Factors that contributed to this were addressing students by their first name in each email to counter feelings of anonymity, as well as the instructors using a conversational tone that encouraged perceptions of personal engagement. This can also be related to 83% of students agreeing (66% strongly) the intervention made them feel more comfortable interacting with the teaching team, further supported by responses such as, "[personalised emails] made me feel more comfortable approaching [the lecturer] with any questions or help as it took the anxiety and formality away" (Introductory Calculus and Algebra Student). This shows students perceived the emails as genuine and authentic, despite them being generated through an automated process. Moreover, it implies the strategy was effective in empowering students to seek help, which is known to be a challenge, especially in large cohorts (Mulryan-Kyne, 2010). Collectively, this evidence indicates the intervention fostered student belonging within the course environment, contributing to the enhanced course experience (Kahu & Nelson, 2018; Masika & Jones, 2016).
Secondly, in terms of engagement, 83% of students agreed (56% strongly) that personalised emails encouraged more regular interaction with the course, consistent with similar previous studies (Lawrence et al., 2019). Students describing using the emails to adjust their learning strategies was also a theme that arose through the thematic analysis, such as:

[Personalised emails] helped me know which topics I was falling behind in and wasn't just a general email to the cohort. For example; I forgot to do the integration weekly quizzes and the personalised email helped reminded me that I needed to do it. (Introductory Calculus and Algebra Student)

In Introductory Calculus and Algebra, the regularity of the emails was complimented as being useful for keeping on track:

It gave me a checklist of things to do for the week for the particular unit. It helped me keep track of what needed to be done when (such as assignments and quizzes). It encouraged me to do the extra practice questions. (Introductory Calculus and Algebra Student)

This can be tied to the model of effective feedback by Hattie and Timperley (2007), which states feedback must communicate where students are heading, how they are progressing, and what they should do next. These needs were addressed through the emails as they provided reminders of key deadlines, fed back progress on formative and summative assessment, and explicitly communicated learning activities that were scaffolded to encourage active engagement. Lawrence et al. (2019) highlight that the use of feedback to ease navigation of the university environment is particularly important for students transitioning to university due to their inexperience interacting with the LMS, prioritising tasks, and engaging with university processes.
Although personalised emails had a motivating effect on engagement for the vast majority of students, there was one negative response that stood out. This student described themselves as a "scraper-by" and said they felt "embarrassed" to receive the emails as they wanted to "blend in". Interestingly, although the student was critical, they acknowledged the potential value of the personalised emails: "I do see how they are very personal and if I had done really well I would have the opposite opinion, I would love the personal feedback" (Quantitative Methods in Science Student). This response shows instructors need to be particularly mindful of how messaging may be perceived by lower-performing students. This has been highlighted previously by Arnold and Pistilli (2012) who found some students disliked receiving negative feedback, as well as messaging that was repetitive or nagging in tone.

Impact on Students' Academic Performance
The impact on students' academic performance was investigated through a regression model focused on predicting students' final grades. Results from the personalised email survey were also considered to provide context around how the personalised emails influenced outcomes.
Results of the regression analyses predicting the final percentages are shown in Table 2. Firstly, for Quantitative Methods in Science, both diagnostic test score and exposure to personalised emails were highly significant predictors of course performance. The beta coefficient for personalised email exposure can be interpreted as the intervention increasing students' final course percentages by 12.5% on average. Interaction between diagnostic score and personalised emails had a negative beta coefficient, which translates to students with poorer diagnostic scores experiencing a greater benefit compared to students with better diagnostic scores. However, its significance was borderline. Collectively, these findings imply that all students benefitted from the personalised emails in terms of their academic performance, but there are indications that students who had the least preparation experienced the greatest gain. These results are consistent with Pardo et al. (2019) and Lim et al. (2019) who each observed a small increase in student performance from similar personalised email interventions. The equivalent analysis for Introductory Calculus and Algebra showed no significant difference in performance from the personalised emails exposure (Table 2). Moreover, there was no evidence of interaction between incoming GPA and the personalised emails. Instead, incoming GPA was the only significant predictor of the final percentage.
Student perceptions of the intervention's impact on academic performance were explored in the personalised email survey. This showed that 78% of students believed (56% strongly) that the personalised emails would contribute to them achieving a better course grade. This was also supported by the thematic analysis, which found many students perceived the emails would improve their performance, such as, "[personalised emails] reminded me to stay on top of things and I truly believe I got a better grade because of it" (Introductory Calculus and Algebra Student), and, "the personalised emails can help and motivate me to get a good grade (strong correlation)" (Quantitative Methods in Science Student). This can be related to the provision of individualised feedback and advice from the instructor enabling students to engage more effectively with course content, both in terms of the frequency and quality of interactions (Trowler, 2010). However, this effect was not uniformly experienced by all students, given support needs varied greatly based on individuals' incoming skills. This is supported by a third-year student's comment: "For me personally, the emails didn't really impact my experience in this unit… However, I believe that these emails would be really useful to students in their first year or those without a maths background, helping them feel more involved in the unit and up to date with content and assessment items" (Introductory Calculus and Algebra Student). This implies personalised emails are well-suited to supporting students transitioning to university but have a reduced impact on those who have already adapted.
There is a clear discrepancy between the perceived and actual impact on academic performance in Introductory Calculus and Algebra: although no statistically significant change was recorded, the survey showed students strongly believed the personalised emails positively influenced their performance, and this belief was slightly stronger in Introductory Calculus and Algebra than in Quantitative Methods in Science. This divergence may be explained by students having at least one prior semester of university experience when they attempted Introductory Calculus and Algebra. By this time, students may have partially integrated into the university's social and academic systems, reducing the effect of the intervention. The lack of a significant finding may also reflect underlying differences between the control and personalised email cohorts that were not sufficiently captured by incoming GPA as a measure of student ability. In contrast to the diagnostic test in Quantitative Methods in Science, which assessed students' skills in mathematics, incoming GPA captures performance across all subject areas. This may be problematic in mediating the effect of the intervention given the significant diversity in this course (especially across discipline areas). Thus, inclusion of a subject-specific measure of student ability should be considered in similar future studies to more definitively isolate the impact of the intervention from confounding factors, especially where cohort diversity presents a challenge. Similarly, results may have been obscured by changes in students' demographic attributes between semesters (see Table 1), which were not taken into account in the modelling.

Summary
This study implemented personalised emails in two first-year mathematics courses as a semester-long support strategy. These emails were designed to personally engage students through using their names and a conversational tone, as well as incorporate individualised feedback and advice to nudge engagement strategies and ease navigation of the university environment. Student experiences and impact on academic performance were subsequently explored using surveying and statistical analysis. It was found that students perceived the personalised emails favourably and believed the intervention would contribute to them achieving better grades. This translated to a statistically significant increase in both student experience and academic performance in Quantitative Methods in Science but not Introductory Calculus and Algebra. These results imply personalised emails are well-suited to courses taken in students' first semesters, as the intervention has the greatest impact on those transitioning to university by fostering feelings of belonging, supporting effective engagement, and easing navigation of university systems and processes.

Limitations and Future Work
The two case studies considered in this paper have illustrated the significant potential of personalised emails as a support strategy in course-based settings. However, a key limitation of this research is the confounding factors introduced by the live course environment. In particular, variations in teaching staff and enrolment sizes could not be controlled, and thus may have played a role in students' course experiences and performance outcomes, in turn limiting the strength of the conclusions. Similarly, differences in underlying student characteristics between semesters (including demographic and educational backgrounds) complicate statistical comparisons between the control and personalised email cohorts. Another constraint of the study was the small sample size obtained for the personalised email survey, which restricts the generalisability of the findings.
Future work should aim to build on these findings by addressing the key limitations. In the absence of randomised controlled experimental designs (which can have negative ethical implications in live course environments), more complex models and analysis techniques should be employed to account for confounding factors, such as propensity score matching (Lim et al., 2019). It is also recommended that incoming student ability be measured in terms of the specific subject matter rather than a general measure of student aptitude such as GPA. Future work should also focus on expanding the personalised email concept beyond mathematics given its applicability to other discipline areas. Collecting evidence of impact across further contexts would broaden understanding of how subject area, course level, and educational culture influence outcomes. Moreover, it would be worthwhile investigating whether specific groups of students experience greater benefit from the personalised emails: these could include international versus domestic students, school-leavers versus mature-age students, and specific equity groups. Finally, it would be interesting to quantify the effect of using students' names and a conversational tone (targeting sense of belonging) versus the provision of personalised feedback (targeting engagement), to gauge the extent to which each of these aspects contributes to experiences and learning outcomes.
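To illustrate the suggested analysis direction, propensity score matching pairs each student in the intervention cohort with a control-cohort student who had a similar estimated probability of receiving the intervention, then compares outcomes within matched pairs. A minimal sketch follows, assuming propensity scores have already been estimated (for example, via logistic regression on incoming GPA and demographic attributes); all data and names here are hypothetical, not drawn from the study.

```python
# Illustrative nearest-neighbour propensity score matching (hypothetical data).
# Each student is represented as a (propensity_score, final_percentage) tuple.

def nearest_neighbour_match(treated, controls):
    """Match each treated unit to the control with the closest propensity score.

    treated, controls: lists of (propensity_score, outcome) tuples.
    Returns a list of (treated_outcome, matched_control_outcome) pairs.
    """
    pairs = []
    for score, outcome in treated:
        # Select the control whose propensity score is nearest to this
        # treated student's score (matching with replacement, for simplicity).
        _, ctrl_outcome = min(controls, key=lambda c: abs(c[0] - score))
        pairs.append((outcome, ctrl_outcome))
    return pairs

def average_treatment_effect(pairs):
    """Mean outcome difference (treated minus matched control) across pairs."""
    return sum(t - c for t, c in pairs) / len(pairs)

# Hypothetical cohorts: (propensity score, final course percentage).
treated = [(0.8, 72), (0.6, 65), (0.4, 58)]   # personalised-email cohort
controls = [(0.75, 64), (0.55, 60), (0.35, 55), (0.9, 70)]  # control cohort

pairs = nearest_neighbour_match(treated, controls)
print(average_treatment_effect(pairs))  # mean gain across matched pairs
```

In practice, matching is typically done without replacement and within a caliper on the score difference, with covariate balance checked after matching; this sketch only conveys the core idea of comparing like-with-like students across cohorts.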