
Christopher Douce

AL development day March 2026: personalised learning and assessment

Edited by Christopher Douce, Thursday 19 March 2026 at 20:57

On 12 March 2026, I attended a cross-faculty AL professional development day that was all about ‘Personalised Learning and Assessment in Online Tuition’. The sessions were intended to provide ‘opportunities for peer learning and for sharing experiences and best practices that support teaching excellence’.

I’ve been to quite a few AL development events over the years; I always try to go if I can. I feel there are always useful points I can pick up. Two important aims of this event (amongst others) were to ‘understand the importance of personalised tuition in distance higher education’ and to engage in ‘peer-to-peer learning’ through ‘sharing effective strategies for personalised support and fostering a collaborative community of practice’.

Throughout the day, I attended four sessions. The notes that I have collated from the sessions are shared below. The first two sessions relate to my role as an associate lecturer (AL), the second two relate to my role as a practice tutor (PT). If you’re internal to the OU, you should be able to access some resources that accompany each of the sessions through the ALSPD website.

Session 1: What makes good written feedback

This first session was facilitated by Daniel Russell, Student Experience Manager from the Faculty of Business and Law (FBL), and Allan Mooney, Senior Lecturer, who is also from FBL. An important aim of the session, as described in their event abstract, was to ‘highlight strategies for balancing encouragement with developmental guidance, fostering student confidence and engagement and its impact on motivation and progression’. Breaking this down further, a key aim was to help us to understand ‘what constitutes good TMA feedback and feed forward guidance’.

Considering feedback

We were asked a question: what is feedback? I noted down a definition (which I have loosely paraphrased): “Feedback is information given to students to help them to learn about their performance relative to learning goals and outcomes”. Feedback is a subject that has much wider relevance. It is an issue that is familiar within the HE sector. It was highlighted that in the annual national student survey, students consistently give their lowest scores for the effectiveness of feedback.

Tutor-Student feedback is also important internally, and contributes to important quality control measures. From an institutional perspective, external examiners report a lack of consistency in feedback. A personal reflection is that as an external examiner, the comprehensiveness of feedback is an incredibly useful indicator.

Some useful points were shared. Sometimes students might not have the personal capacity to respond to all the feedback that is provided (in my own practice, I try to avoid leaving more than three main points, but I often leave more). Students also need to have an appetite to receive and work with feedback. Also, tutors can leave feedback that helps students to connect different TMAs together.

Academic sources

During the session, we were directed to some articles. Sadler (1989) suggested that ‘students must know what good performance is, how current performance relates to good performance, how to close the gap’. Here’s a reference the presenter shared:

Sadler, D. R. (1989). Formative assessment and the design of instructional systems. Instructional Science, 18(2), 119–144. Available at: https://doi.org/10.1007/BF00117714.

Another source was Nicol and Macfarlane-Dick (2006), who presented a model of self-regulated learning. Again, here’s the article:

Nicol, D.J. and Macfarlane-Dick, D. (2006) ‘Formative assessment and self-regulated learning: a model and seven principles of good feedback practice’, Studies in higher education (Dorchester-on-Thames), 31(2), pp. 199–218. Available at: https://doi.org/10.1080/03075070600572090

Reflections on feedback

Digging into the Nicol and Macfarlane-Dick resource, which shares seven feedback principles, here are three that I consider to be the most significant in the OU context:

  1. Deliver high quality feedback information.
  2. Encourage positive motivation and self-esteem.
  3. Provide opportunities to close the gap between current and desired performance.

The third point is particularly helpful. It is important to share something with students about what good performance looks like. From a practical perspective, it is useful to share examples, and to explain what is interesting or important about those examples. Regarding the second point, the tone in which points of feedback are shared is also really important. This links to the idea of student self-efficacy – sometimes there may be a mismatch between what someone believes they are able to do, and what they actually can do. It is important to foster self-confidence.

As a tutor, an important question I ask is: where can I add value? I remember some debate about the balance between the script and the eTMA summary page. My own view is to provide feedback on the script, and feedforward on the eTMA summary. I also might try to include additional teaching points, and link what is studied in one TMA to other TMAs, or other aspects of the module.

I also remember being encouraged to consider how students receive their feedback. On more than one occasion I’ve found out my TMA score by using my phone. I think I learnt about my last TMA score when I was travelling on a train.

Further Resources

The PowerPoint that accompanies this session contained a number of resources in addition to the ones that I have shared above.

Session 2: Taking Feedback forward

The next session, facilitated by Claire Malcolm, continued the theme of feedback.

We were asked some questions: how long do you spend marking each TMA? How is this time split between script comments and the eTMA summary? And how often do you repeat or rephrase? For a typical module the OU budgets that it should take a tutor 45 minutes to mark an assessment. In my own tutor practice, I only ever manage to achieve this when I’m thoroughly ‘warmed up’ and I have managed to ‘get into the head’ of the module team through the tutor notes and guidance they have prepared. This means that there can be a difference between how much time the university expects us to spend, and how much time we (as tutors) actually do spend marking assignments and composing feedback.

We were asked further questions, such as: do students read, understand, and apply the feedback? Also, what techniques do you use to ensure that marking does not become overly time-consuming?

One of the techniques that could help us is the idea of creating and using a comment bank; a document that we can use to store comments that we may reuse (and then customise). Another term for this that I’ve heard is a ‘copperplate’, which can be thought of as a template. Comment banks can be used to capture feedback or feedforward comments, but their use should not replace personalisation.

A good example of feedforward personalisation is where a tutor takes a bit from a student’s essay, and offers an example of how it might be presented in another way. In some cases, it might be useful to follow up with a discussion afterwards. Exemplars can be helpful.

Another practice is to use something that was called a ‘feedforward grid’, which offers examples and potentially useful exercises (to help students understand what they need to do to improve) that are related to learning outcomes. The exercises that may be suggested are, of course, not compulsory and won’t be marked. The feedforward grid could also be used to share examples too. It could also be something that could be prepared by a module team. A final point I noted down was, of course, that a grid is not a substitute for on-script comments or eTMA comments.

A variation of this kind of grid is used on one of the modules that I tutor, the undergraduate computing project module. The difference between what was suggested and what I use is that tutors have a lot of scope to prepare their own comments.

Session 3: Steps to Holistic KSB Assessment

The third session, facilitated by Helen Sanson, STEM practice tutor, reflected a change of focus, and one that relates to my involvement with the degree apprenticeship programme. The overview of the session mentioned the apprenticeship End-Point Assessment (EPA), emphasising that it ‘is not a multiple-choice test; it is a demonstration of professional competence’, which is assessed through a professional conversation. The session was also described as one that ‘challenges the traditional "tick-box" culture and offers a rapid guide to holistic assessment for Practice Tutors’, which is something that really appealed, especially since I have been thoroughly tiring of the way that many interactions are necessarily mediated through forms.

Checklist vs Holistic

Practice tutors must ensure that apprentices capture evidence to meet a defined apprenticeship standard, which is defined using descriptions of certain knowledge, skills and behaviours (KSBs). A point was made that evaluating the progress of an apprentice is more than just ticking off whether (or not) an apprentice has met the necessary criteria. A checklist approach can assess whether an apprentice has performed, rather than how well they have performed.

During an EPA, the assessor gives one of two main outcomes: the apprentice passes (demonstrating their competence) or fails. The EPA should show what they can do, and that they know how to act. Whilst the KSBs can appear like ‘mastery statements’, the EPA should be thought of as a holistic assessment that looks at the bigger picture. An EPA is all about communication about real-world scenarios.

Sharing steps

What follows is a simple framework that was shared, that I have vigorously paraphrased:

Step 1: Spot it – identify naturally occurring evidence. Discourage apprentices from relying on academic work to evidence their KSBs; instead, focus on real-world evidence: witness statements, screenshots. Encourage apprentices to collect evidence from day one.

Step 2: Map it – apprentices should only submit evidence (to their e-portfolio) if it relates to a KSB. Do reflect on the following principles of assessment: validity, authenticity, sufficiency, currency and reliability (VASCR). Consider creating a document that shows ‘clustered contextual evidence’ as a way to summarise evidence.

Step 3: Feed it back – a framework to help think about how apprentices have developed in their practice. What? Can the evidence be used to describe a workplace situation or problem (and what KSBs can be demonstrated)? So what? What process was followed, and what was done? Now what? What was the output? Has there been some personal or organisational benefit?

A phrase that I noted down towards the end was: ‘Assess the cake rather than the ingredients – to make sure they are ready within their careers’.

Session 4: KSBs and OU apprenticeship programmes: a panel discussion

This final session was different to the others. It was described as follows: ‘featuring a panel of faculty representatives from WELS, FBL and STEM, this session will unpack how KSBs underpin programme design, teaching, and assessment across apprenticeship pathways’. What follows is a set of brief headings which reflect some of the discussions, under which are some accompanying points.

What is the purpose of KSBs in supporting apprentice learning?

Knowledge, skills and behaviours (KSBs) are things that apprentices must know, be able to do, and demonstrate.

They can be useful to bridge theory and practice.

They can help apprentices to understand where to develop skills and abilities.

Which KSBs are the most challenging to evidence?

Those that relate to work-based experiences, especially if the apprentice’s organisation is subject to a significant amount of change.

What questions could explore the apprentice and employer’s understanding of the KSBs?

What can you tell us about what you have done?

What else have you done?

Can you give me an example?

What was the impact?

What did you learn from doing that?

What do you need to gather to show you have achieved this?

What are the next steps?

What does higher quality OTJ training look like within your programme?

OTJ means off-the-job training time. It includes time allocated to academic study.

A challenge is to make sure that apprentices record their hours, and make a note of how they have used their time.

In some cases, the employer may allocate time for the attending of conferences or events that relate to the role.

How can university and employers work together to support KSB progression?

It is necessary to interpret what KSBs mean in the context of a particular role. It is also necessary to relate them to the roles within the sector in which they are working.

Think of the KSBs as a guide.

Reflections

I remember the time when there used to be large cross-faculty events that were organised from a regional centre. Along with a nice lunch, meeting fellow associate lecturers was one of the motivations to go to these events. Not only would they have cross-faculty sessions, but there would also be faculty-specific meetings. There would also be informal coffee-point chats, where you might share experiences and pick up on new ideas whilst chatting with colleagues over a coffee and a biscuit.

I feel that these online cross-faculty events work well for the university, but less so for the delegates. I find online events tiring, but I also appreciate that online is accessible, as well as being cost-effective. The sessions I attended were quite short, and sometimes felt very formal. In my experience, online always takes longer (but, of course, it does depend on what is done, and how everything is facilitated).

In the sessions I attended there was very little opportunity to chat. Discussions, of course, work better, if you know more about those who you are speaking with, and a big variable is, of course, the experience and confidence of the facilitator. When I used to facilitate sessions, I was always really aware of there being a lot of experience in the room. More often than not, there were colleagues who were even more experienced than I was. I kind of felt that it was my task to find those people, and to get them talking.

Due to the level of interactivity being very attenuated, the objective of fostering a community of practice was a long way from being met. I didn’t recognise many of the names of the colleagues who were also attending (other than, of course, a very nice surprise of noticing someone whom I hadn’t seen for a good few years).

I’ve always asserted that the most important thing in education is, of course, people. When it comes to academic professional development, there is always a risk that technology can act as a barrier to communication, just as much as it has the potential to bring people together. A panel discussion without questions isn’t a panel discussion; it is a presentation.

There were some positives: the sessions about feedback helped me to reflect on my practices, and the references to the academic articles were welcome. From the practice tuition perspective, the critical approach that was taken in the third session was appreciated. It also offered a useful reminder about the purpose and significance of the end-point assessment.

Acknowledgements

Many thanks to the ALSPD team who set up the event, and all the facilitators who worked hard to both design and run the sessions. 
