OU blog

Personal Blogs

Christopher Douce

AL development day March 2026: personalised learning and assessment

Visible to anyone in the world
Edited by Christopher Douce, Thursday 19 March 2026 at 20:57

On 12 March 2026, I attended a cross-faculty AL professional development day that was all about ‘Personalised Learning and Assessment in Online Tuition’. The sessions were intended to provide ‘opportunities for peer learning and for sharing experiences and best practices that support teaching excellence’.

I’ve been to quite a few AL development events over the years; I always try to go if I can. I feel there are always useful points I can pick up. Two important aims of this event (amongst others) were to ‘understand the importance of personalised tuition in distance higher education’ and to engage in ‘peer-to-peer learning’ through ‘sharing effective strategies for personalised support and fostering a collaborative community of practice’.

Throughout the day, I attended four sessions. The notes that I have collated from the sessions are shared below. The first two sessions relate to my role as an associate lecturer (AL), the second two relate to my role as a practice tutor (PT). If you’re internal to the OU, you should be able to access some resources that accompany each of the sessions through the ALSPD website.

Session 1: What makes good written feedback

This first session was facilitated by Daniel Russell, Student Experience Manager from the Faculty of Business and Law (FBL), and Allan Mooney, Senior Lecturer, who is also from FBL. An important aim of the session, as described in their event abstract, was to ‘highlight strategies for balancing encouragement with developmental guidance, fostering student confidence and engagement and its impact on motivation and progression’. Breaking this down further, a key aim was to help us to understand ‘what constitutes good TMA feedback and feed forward guidance’.

Considering feedback

We were asked a question: what is feedback? I noted down a definition (which I have loosely paraphrased): “Feedback is information given to students to help them to learn about their performance relative to learning goals and outcomes”. Feedback is a subject that has much wider relevance. It is an issue that is familiar within the HE sector. It was highlighted that in the annual national student survey, students consistently give their lowest scores for the effectiveness of feedback.

Tutor-Student feedback is also important internally, and contributes to important quality control measures. From an institutional perspective, external examiners report a lack of consistency in feedback. A personal reflection is that as an external examiner, the comprehensiveness of feedback is an incredibly useful indicator.

Some useful points were shared. Sometimes students might not have the personal capacity to respond to all the feedback that is provided (in my own practice, I try to avoid leaving more than three main points, but I often leave more). Students also need to have an appetite to receive and work with feedback. Also, tutors can leave feedback that can help students connect different TMAs together.

Academic sources

During the session, we were directed to some articles. Sadler (1989) suggested that ‘students must know what good performance is, how current performance relates to good performance, how to close the gap’. Here’s a reference the presenter shared:

Sadler, D. R. (1989). Formative assessment and the design of instructional systems. Instructional Science, 18(2), 119–144. Available at: https://doi.org/10.1007/BF00117714.

Another source was Nicol and Macfarlane-Dick (2006), who presented a model of self-regulated learning. Again, here’s the article:

Nicol, D.J. and Macfarlane-Dick, D. (2006) ‘Formative assessment and self-regulated learning: a model and seven principles of good feedback practice’, Studies in higher education (Dorchester-on-Thames), 31(2), pp. 199–218. Available at: https://doi.org/10.1080/03075070600572090

Reflections on feedback

Digging into the Nicol and Macfarlane-Dick resource, which shares 7 feedback principles, here are three that I consider to be the most significant in the OU context:

  1. Deliver high quality feedback information.
  2. Encourage positive motivation and self-esteem.
  3. Provide opportunities to close the gap between current and desired performance.

The third point is particularly helpful. It is important to share something with students about what good performance looks like. From a practical perspective, it is useful to share examples, and to explain what is interesting or important about those examples. Regarding the second point, the tone in which points of feedback are shared is also really important. This links to the idea of student self-efficacy – sometimes there may be a mismatch between what someone believes they are able to do, and what they actually can do. It is important to foster self-confidence.

As a tutor, an important question I ask is: where can I add value? I remember some debate about the balance between script and eTMA summary page. My own view is to provide feedback on the script, and feedforward on the eTMA summary. I also might try to include additional teaching points, and link what is studied in one TMA to other TMAs, or other aspects of the module.

I also remember being encouraged to consider how students receive their feedback. On more than one occasion I’ve found out my TMA score by using my phone. I think I learnt about my last TMA score when I was travelling on a train.

Further Resources

The PowerPoint that accompanies this session contained a number of further resources that might be useful, in addition to the ones that I have shared above.

Session 2: Taking Feedback forward

The next session, facilitated by Claire Malcolm, continued the theme of feedback.

We were asked some questions: how long do you spend marking each TMA? How is this time split between script comments and eTMA summary? And how often do you repeat or rephrase? For a typical module the OU budgets that it should take a tutor 45 minutes to mark an assessment. In my own tutor practice, I only ever manage to achieve this when I’m thoroughly ‘warmed up’ and I have managed to ‘get into the head’ of the module team through the tutor notes and guidance they have prepared. This means that there can be a difference between how much time the university expects us to spend, and how much time we (as tutors) actually spend marking assignments and composing feedback.

We were asked further questions, such as do students read, understand, and apply the feedback? Also, what techniques do you use to ensure that marking does not become over time-consuming?

One of the techniques that could help us is the idea of creating and using a comment bank; a document that we can use to store comments that we may reuse (and then customise). Another term for this that I’ve heard is a ‘copperplate’, which can be thought of as a template. Comment banks can be used to capture feedback or feed forward comments, but their use should not replace personalisation.

A good example of feedforward personalisation is where a tutor takes a bit from a student’s essay, and offers an example of how it might be presented in another way. In some cases, it might be useful to follow up with a discussion afterwards. Sharing exemplars can be helpful.

Another practice is to use something that was called a ‘feedforward grid’ that offers examples and potentially useful exercises (to help students understand what they need to do to improve) that are related to learning outcomes. The exercises that may be suggested are, of course, not compulsory and won’t be marked. The feedforward form could also be used to share examples too. It could also be something that could be prepared by a module team. A final point I noted down was, of course, that a grid is not a substitute for on-script comments or eTMA comments.

A variation of this kind of grid is used on one of the modules that I tutor, the undergraduate computing project module. The difference between what was suggested and what I use is that tutors have a lot of scope to prepare their own comments.

Session 3: Steps to Holistic KSB Assessment

The third session, facilitated by Helen Sanson, STEM practice tutor, reflected a change of focus, and one that relates to my involvement with the degree apprenticeship programme. The overview of the session mentioned the apprenticeship End-Point Assessment (EPA), emphasising that it ‘is not a multiple-choice test; it is a demonstration of professional competence’, which is assessed through a professional conversation. The session was also described as one that ‘challenges the traditional "tick-box" culture and offers a rapid guide to holistic assessment for Practice Tutors’, which is something that really appealed, especially since I have been thoroughly tired of the way that many interactions are necessarily mediated through forms.

Checklist vs Holistic

Practice tutors must ensure that apprentices capture evidence to meet a defined apprenticeship standard, which is expressed using descriptions of certain knowledge, skills and behaviours (KSBs). A point was made that evaluating the progress of an apprentice is more than just ticking off progress, of whether (or not) an apprentice has met the necessary criteria. A checklist approach can assess whether an apprentice has performed, rather than how well they performed.

During an EPA, the assessor gives one of two main outcomes: a pass (the apprentice demonstrates their competence) or a fail. The EPA should show what they can do, and that they know how to act. Whilst the KSBs can appear like ‘mastery statements’, the EPA should be thought of as a holistic assessment that looks at the bigger picture. An EPA is all about communication about real world scenarios.

Sharing steps

What follows is a simple framework that was shared, that I have vigorously paraphrased:

Step 1: Spot it – identify naturally occurring evidence. Discourage apprentices from relying on academic work to evidence their KSBs, and instead focus on real-world evidence: witness statements, screenshots. Encourage apprentices to collect evidence from day one.

Step 2: Map it – apprentices should only submit evidence (to their e-portfolio) if it relates to a KSB. Do reflect on the following principles of assessment: validity, authenticity, sufficiency, currency and reliability (VASCR). Consider creating a document that shows ‘clustered contextual evidence’ as a way to summarise evidence.

Step 3: Feed it back – a framework to help to think about how apprentices have developed in their practice. What?: can the evidence be used to describe a workplace situation or problem (and what KSBs can be demonstrated)? So what?: what process was followed, and what was done? Now what?: what was the output? Has there been some personal or organisational benefit?

A phrase that I noted down towards the end was: ‘Assess the cake rather than the ingredients – to make sure they are ready within their careers’.

Session 4: KSBs and OU apprenticeship programmes: a panel discussion

This final session was different to the others. It was described as follows: ‘featuring a panel of faculty representatives from WELS, FBL and STEM, this session will unpack how KSBs underpin programme design, teaching, and assessment across apprenticeship pathways’. What follows is a set of brief headings which reflect some of the discussions, under which are some accompanying points.

What is the purpose of KSBs in supporting apprentice learning?

Knowledge, skills and behaviours (KSBs) are things that apprentices must know, be able to do, and demonstrate.

They can be useful to bridge theory and practice.

They can help apprentices to understand where to develop skills and abilities.

Which KSBs are the most challenging to evidence?

Those that relate to work-based experiences, especially if the apprentice’s organisation is subject to a significant amount of change.

What questions could explore the apprentice and employer’s understanding of the KSBs?

What can you tell us about what you have done?

What else have you done?

Can you give me an example?

What was the impact?

What did you learn from doing that?

What do you need to gather to show you have achieved this?

What are the next steps?

What does higher quality OTJ training look like within your programme?

OTJ means off-the-job training time. It includes time allocated to academic study.

A challenge is to make sure that apprentices record their hours, and make a note of how they have used their time.

In some cases, the employer may allocate time for the attending of conferences or events that relate to the role.

How can university and employers work together to support KSB progression?

It is necessary to interpret what KSBs mean in the context of a particular role. It is also necessary to relate them to the roles within the sector in which they are working.

Think of the KSBs as a guide.

Reflections

I remember the time when there used to be large cross-faculty events that were organised from a regional centre. Along with a nice lunch, meeting fellow associate lecturers was one of the motivations to go to these events. Not only would they have cross-faculty sessions, but there would also be faculty specific meetings. There would also be these informal coffee-point chats, where you may share experiences and pick up on new ideas whilst chatting with colleagues over a coffee and a biscuit.

I feel that these online cross-faculty events work well for the university, but less so for the delegates. I find online events tiring, but also appreciate that online is accessible, as well as being cost effective. The sessions I attended were quite short, and sometimes felt very formal. In my experience, online always takes longer (but, of course, it does depend on what is done, and how everything is facilitated).

In the sessions I attended there was very little opportunity to chat. Discussions, of course, work better, if you know more about those who you are speaking with, and a big variable is, of course, the experience and confidence of the facilitator. When I used to facilitate sessions, I was always really aware of there being a lot of experience in the room. More often than not, there were colleagues who were even more experienced than I was. I kind of felt that it was my task to find those people, and to get them talking.

Since the level of interactivity was very attenuated, the objective of fostering a community of practice was a long way from being met. I didn’t recognise many of the names of the colleagues who were also attending (other than, of course, a very nice surprise of noticing someone who I hadn’t seen for a good few years).

I’ve always asserted that the most important thing in education is, of course, people. When it comes to academic professional development, there is always a risk that technology can act as a barrier to communication, just as much as it always has the potential to bring people together. A panel discussion without questions isn’t a panel discussion; it is a presentation.

There were some positives: the sessions about feedback helped me to reflect on my practices, and the references to the academic articles were welcome. From the practice tuition perspective, the critical approach that was taken in the third session was appreciated. It also offered a useful reminder about the purpose and significance of the end point assessment.

Acknowledgements

Many thanks to the ALSPD team who set up the event, and all the facilitators who worked hard to both design and run the sessions. 

Christopher Douce

Apprentice End Point Assessment (EPA) workshop


On 10 December 2025, I attended a short online workshop to help OU Digital Technology Solutions (DTS) degree apprentices become familiar with what was required for their End Point Assessment (EPA).

What follows are a set of notes I’ve taken during the session which I’m sharing on the off chance they might be useful for any of the apprentices I’m supporting.  I’ve also taken a few moments to share my own practical tips, which I hope are helpful. I’ve written it as if I was speaking with an apprentice (which reflects the workshop).

A professional discussion

The EPA is what is called a professional discussion. It is a formal assessment to determine your “occupational competence” but should also be considered as “a celebration of your apprenticeship journey”. It is a conversation about all the experience and learning gained from your apprenticeship, drawing on evidence that has been uploaded to your ePortfolio. The evidence will, of course, demonstrate that you meet all the knowledge, skills and behaviours (KSBs) that combine together to form the DTS standard.

The EPA is expected to take 60 minutes, and is likely to contain 4 key critical questions (which are related to themes). Each question is likely to lead to follow-up questions. The first question is likely to be quite broad. A practical recommendation is to give clear examples that relate to the evidence that you have uploaded and the KSBs.

Your portfolio

The evidence that you upload to your ePortfolio must be your own evidence; it must relate to work-based activities that you have done yourself, and the learning that you have gained from that work.

Every piece of evidence that you add must relate to one or more of the standard’s KSBs. A practical recommendation is that every piece of evidence should ideally relate to a group of KSBs. A minimum of 6 pieces of evidence is required, but typically about 10 pieces are submitted. Evidence could take the form of module assignments (tutor marked assessments), information about work-based products, narrative summaries of work done (with screenshots), witness testimonies, and even video materials.

How do you make decisions about what to include in the ePortfolio? An important question to ask is “what is a particular piece of evidence trying to achieve?” Two accompanying questions are: what does it show, and how does it relate to the KSBs? Also, does the piece of evidence have a clear filename and title? Is it well structured? Does it show clear evidence of learning and development having taken place?

To help everyone to prepare evidence, we were introduced to something called the STAR method, a simple framework that uses four words to encourage reflection. The words are: situation (what is the context in which something was done?), task (why was it needed?), action (what did you do?), and result (what was the outcome or impact?).

After a piece of evidence is submitted to the ePortfolio, your practice tutor reviews what has been submitted, and assigns it a grade. There are two possibilities: pass, or distinction. The criteria for each of these are described in the DTS standard. What typically distinguishes between a pass and a distinction is evidence of impact. One clear and direct way to evidence impact is through numbers. If you have made some fixes to software, how many users does this positively impact? If there have been some efficiency savings, what are these? Numbers represent a really powerful and concise way to evidence impact.

Useful tips

When it comes to preparing and writing evidence that you upload into your ePortfolio, it is important not to leave it to the very last minute. When you begin to contribute to your workplace, begin to evidence what you do and the impact you have, as soon as you can. When you get to your third year, you may well have forgotten about some of the good stuff that you have done in the first six months of your apprenticeship.

My own practical tip is: if you have difficulty writing or preparing evidence, do consider preparing a witness statement as a practical alternative. Speak with your line manager. Sometimes your line manager will be able to offer a wider perspective about the work you are doing and the contributions you are making.

When it comes to your EPA, here are some simple and practical tips:

  • Make sure you know the contents of your ePortfolio. You may be asked about anything you have uploaded.
  • It is okay to have notes. Before your EPA, take a bit of time to prepare some notes to bring into your meeting. Write down examples of work that you are most proud of, and know how these examples relate to the KSBs.
  • If you are asked a really difficult question, it is okay to pause for a few moments to allow you to collect your thoughts. Equally, it is okay to ask the assessor some clarifying questions. It is, after all, a conversation.
  • Remember that you are approaching all this from a position of strength. You are the expert in your own ePortfolio and what you have done. The assessor is not there to trip you up; the assessor is there to be guided through a story of what you have achieved.

The project module

In my diary, this event was listed under the heading ‘TMXY476 workshop’. TMXY476 is the apprenticeship project module. The EPA is, of course, a professional discussion about your entire apprenticeship journey. By way of contrast, the project is a “deep dive” into skills, and has its own set of KSBs. Your project and work-based learning tutors will help you to work through these. During your project you may, of course, carry out some activities that can also evidence some of your apprenticeship KSBs.

Reflections

When I started as a practice tutor it took me quite a bit of time to understand what the KSBs were all about. It is impossible to understand them all in one go. The only way to do it is to gradually chip away at it, and to understand different ways they can be related to what you do.

In the middle of all the work activities and the academic study, it is easy to forget about them, but it is important to keep clear sight of them. In one way or another, they should guide what you do, and also help you to relate the academic study to the industrial work. I think of them as a bridge between the two.

From my perspective, there are two significant take away points. The first is the question “what is a particular piece of evidence trying to achieve?” Clarity is important. It helps the discussion. The second is that both the EPA and your ePortfolio are about showing off, and celebrating what you do. The STAR framework looks like a really useful tool that can help with this.

Acknowledgements

The event was facilitated by Martin Rothwell and was attended by OU colleagues and apprentices. Words and phrases in quotes have been noted directly from Martin’s presentation. Many thanks to other OU colleagues in the apprenticeship team who may have contributed to this helpful workshop.

Christopher Douce

Practice tutor spotlight session: Tripartite meeting fundamentals

Edited by Christopher Douce, Thursday 6 November 2025 at 13:53

Another day, another CPD session. In addition to being a tutor, I’m also what is called a practice tutor (PT), where I help to support the school’s apprenticeship programme. On 6 November 2025 I went along to a relatively short session about something called the tripartite apprenticeship meeting. It is called this, since it involves, perhaps unsurprisingly, three parties: the apprentice, their line manager, and the OU practice tutor.

What follows are some notes that reflect both the session, and the role of the practice tutor.

Session summary

During the initial meeting, there is an intention of sharing information about roles and responsibilities. It is an opportunity for the PT to gain an understanding of the apprentice’s starting point, and to set initial targets. It is also an important opportunity to offer a useful summary of the apprenticeship programme to both the apprentice and their line manager. It is also important to introduce the knowledge, skills and behaviours (KSBs) which are integral to an apprenticeship standard.

All the other tripartite meetings will be progress meetings with a priority on making sure the apprentice feels supported. A key objective is to discuss progress regarding existing objectives, to set new objectives, and to address any issues that may have arisen. As with the first meeting, an important focus is the KSBs, and finding practical ways to evidence their attainment.

During the session, we were asked to reflect on:

  • What steps did we follow to prepare for our meeting?
  • How did I manage the timing and organisation of the meeting?
  • What was my approach to supporting goal setting (and how they relate to KSBs)?
  • What might I do differently next time?

We didn’t really get to discuss our reflections in depth, but there was some sharing of views. One useful tip that I picked up was: do consider sending an apprentice examples of goals in advance of a meeting.

We then moved onto the importance of goal setting, which uses SMART goals. This is, of course, an abbreviation for: specific, measurable, achievable, relevant, and time-bound. When it comes to measurable, this could relate to the completion of a certain number of TMAs by the next review, or uploading certain elements of evidence.

The Skills Scan was mentioned. This is a detailed questionnaire, which should be completed every 6 months, that is used to identify gaps in the apprentice’s KSBs. In turn, the results from the Skills Scan can be used to help to guide the creation of goals.

SMART goal targets can be short term, or long term. A short term goal might be as simple as completing a Skills Scan. A long term goal might be meeting career goals and necessary regulatory standards. Goals can also relate to skills, certification, professional development, time management, or even team working.

Reflections

I am really very familiar with the notion of SMART goals. By attending this session, I realised that I wasn’t applying the SMART framework as rigorously as I could have been. I found it helpful to think of them in terms of short-term and long-term goals.

In advance of a tripartite meeting, one of the things I always do is carefully review all the records that I have about an apprentice. It takes me a bit of time: I look at where they are in the programme and what their most recent TMA scores have been. If available, I also have a look at a completed Skills Scan questionnaire that is available through the e-portfolio tool that we use. I will also look at any evidence that has been submitted, to determine whether there is anything I need to sign off. What I really need to do more of is to review the previous objectives that have been set.

I also feel that the objectives need to speak more directly to any gaps that I see, and also the characteristics of the degree apprenticeship programme.

Acknowledgements

Many thanks to the facilitators Jennifier Hillman and Kelly Guilfoy.


