
Henry James Robinson

Student and Teacher data to improve learning

Edited by Henry James Robinson, Saturday, 11 July 2020, 17:23


[Image source: jisc.ac.uk]

My last two blog posts tried to summarise my views on, and knowledge of, the ways big data is used in learning and teaching (where it can be referred to as educational learning analytics) and the ethical practices related to it.

This post tries to highlight the relatedness of, but also the differences between, learning analytics (LA) that is learner- and teacher-focused and analytics that mainly informs the wider body of educators: managers, administrators, the institution, government and other funding bodies. It also looks at the reasons why LA emerged in the 2000s.

Reasons for the emergence

Campbell et al. (2007) and Norris et al. (2009) examine the need for analytics from the viewpoint of the USA, where, at the time of writing, they suggested that the proportion of degree-qualified workers was falling overall, as was the quality of learning. In the UK in 2020 (Guardian, 2020), 'nearly three-quarters of the country's universities slipped down the rankings in the UK's worst-ever performance in the table compiled by data and research group QS.' Studies had long shown a steady decline in higher education standards in the UK since the 90s (Cameron and Smart, 1998). Analytics offers a way of 'responding to internal and external pressures for accountability in higher education (and) improved learning outcomes and student success' (Campbell et al., 2007). It should be noted, however, that since the turn of the century the UK has made big strides in Maths, Science and Reading, and currently stands in the top 20 in these areas in the OECD (2018) PISA standings, which refer only to school learning outcomes.

The US has also seen a sharp improvement in school results over the same period, which could in part be put down to the increase in, and improvement of, learning analytics initiatives and theory.

Used to benefit educators

Institutions are of course very interested in student performance, because it reflects on the institution's reputation and popularity, and therefore on funding and staff jobs. They are trying to reach certain external and internal key indicators. For those reasons, they are also interested in things that are not about the individual learner: increasing student numbers, administrative and academic productivity, and cost-cutting for profitability, together with the data related to these. This focus is known as academic analytics. They are also interested in general account analytics, which let them see what students and teachers are doing within the account. Activity by date allows an administrator to view student participation in Assignments, Modules and Discussions, and teachers' completion of Grades, Files, Collaborations, Announcements, Groups, Conferences and so on. In general, one can view how users are interacting with the courses in the term. This means the content can be adapted to improve efficiency, productivity and adherence to policy and practice, based on deductions and predictions made from the use of content. The short sketch below gives a sense of what this kind of 'activity by date' aggregation involves.
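To make the 'activity by date' idea concrete, here is a minimal sketch in Python of the sort of aggregation an administrator's dashboard performs behind the scenes. The CSV export and its column names (date, user_id, role, activity) are my own assumptions for illustration only, not a real LMS report format or API.

```python
# A minimal sketch, assuming a hypothetical LMS activity export with columns:
# date, user_id, role ('student' or 'teacher'), activity ('assignment',
# 'discussion', 'grade', 'announcement', ...). Illustrative only.
import csv
from collections import defaultdict

def activity_by_date(path):
    """Count activities per (date, role), broken down by activity type."""
    counts = defaultdict(lambda: defaultdict(int))  # (date, role) -> activity -> count
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            counts[(row["date"], row["role"])][row["activity"]] += 1
    return counts

if __name__ == "__main__":
    for (date, role), activities in sorted(activity_by_date("lms_activity_export.csv").items()):
        summary = ", ".join(f"{name}: {n}" for name, n in sorted(activities.items()))
        print(f"{date} ({role}): {summary}")
```

In practice the LMS would expose this through its own reporting tools, but the underlying idea is the same: counting who did what, and when, across the whole account.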

Used to benefit students

I categorise all of the above (listed under benefitting the educator) as also benefitting the learner. It would reach the learner via tutorials the teacher has with them and via reports, if any. However, the analytics that students are likely to actually access mainly include only their grades and records of their assignments. Students can use these to track and assess their progress, so that they can see where they need to improve as they go along. Teachers should use educational data mining (EDM), the 'analysis of logs of student-computer interaction' (Ferguson, 2012), to improve learning and teaching. Romero and Ventura (2007, cited in Ferguson, 2012) identified the goal of EDM as ‘turning learners into effective better learners’ by evaluating the learning process, preferably alongside and in collaboration with learners. These data are often only available if the teacher makes them viewable. The sketch that follows illustrates what this can look like at classroom level.
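As an illustration of EDM in the sense Ferguson describes, the Python sketch below summarises a hypothetical interaction log per student and flags possible disengagement so the teacher can follow up in a tutorial. The log format (student_id, timestamp, event) and the threshold are assumptions I have made for illustration; real EDM work would use richer features and validated models.

```python
# A small sketch of analysing logs of student-computer interaction.
# Assumes a hypothetical CSV log with columns: student_id, timestamp (ISO format), event.
import csv
from collections import defaultdict
from datetime import datetime

LOW_ENGAGEMENT_EVENTS = 5  # assumed threshold; tune for the course

def summarise_interaction(path):
    """Return per-student event counts and the timestamp of their last activity."""
    events = defaultdict(int)
    last_seen = {}
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            sid = row["student_id"]
            events[sid] += 1
            ts = datetime.fromisoformat(row["timestamp"])
            if sid not in last_seen or ts > last_seen[sid]:
                last_seen[sid] = ts
    return events, last_seen

if __name__ == "__main__":
    events, last_seen = summarise_interaction("interaction_log.csv")
    for sid, count in sorted(events.items(), key=lambda kv: kv[1]):
        flag = " <- follow up" if count < LOW_ENGAGEMENT_EVENTS else ""
        print(f"{sid}: {count} events, last seen {last_seen[sid]:%Y-%m-%d}{flag}")
```

The point is not the code itself but where the output goes: if summaries like this are shared with students and discussed with them, the analysis serves learning rather than just monitoring.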

In my situation, the LMS we use is compatible with various apps such as Turnitin (a plagiarism checker). Students also have access to its analytics, which are used at the drafting phase of the writing course, and at the end of the course only if students query their report-writing score and plagiarism has some relevance.

Challenges

Some of the educational challenges in the environment I work in include adapting to online as opposed to face-to-face (f2f) teaching. One way to sum up the challenge is that it is completely different, because the contact and communication have technology running through them. One of the challenges involved in implementing learning analytics is mistrust of how data will be used. Students I work with sometimes avoid leaving digital traces for fear of these ending up being a means of covertly 'assessing' them. This is why their focus is on final tests, where they are fully aware of what performance data will be collected and how it will be used. 'Therefore, it is necessary to make the goals of the LA initiative transparent, clarifying exactly what is going to happen with the information and explicitly' (Leitner et al., 2019, p.5). Researchers also point to privacy and ethical issues.

Recommendations 

1. Teachers should get training in the use of data analytics in the classroom.

2. There should be an open dialogue about what learners' rights to their own learning analytics should be; what learning analytics should be available to them; and how to give them access (including training in accessing their own data).

3. Analytics is currently too management-centred (data mining and academic analytics) and is often not shared with teachers. Learning analytics needs to be much more centred on the classroom and on the teacher/student relationship.

I think the adoption of these recommendations could improve engagement, ownership and motivation to learn, and could also improve learning directly.

References

Cameron, K. and Smart, J. (1998) ‘Maintaining effectiveness amid downsizing and decline in institutions of higher education’, Research in Higher Education, vol. 39, no. 1, pp. 65–86.

Campbell, J.P., DeBlois, P.B. and Oblinger, D.G. (2007) ‘Academic analytics: a new tool for a new era’, Educause Review, vol. 42, no. 4, pp. 40–57 [Online]. Available at http://www.educause.edu/ero/article/academic-analytics-new-tool-new-era (Accessed 11 July 2020).
Ferguson, R. (2012) ‘Learning analytics: drivers, developments and challenges’, International Journal of Technology Enhanced Learning (IJTEL), vol. 4, nos. 5/6, pp. 304–17 [Online]. Available at http://oro.open.ac.uk/36374/ (Accessed 11 July 2020).
Norris, D., Baer, L. and Offerman, M. (2009) ‘A national agenda for action analytics’, paper presented at the National Symposium on Action Analytics, 21–23 September 2009, St Paul, Minnesota, USA [Online]. Available at http://lindabaer.efoliomn.com/uploads/settinganationalagendaforactionanalytics101509.pdf (Accessed 11 July 2020).
OECD (2018) PISA 2018 Results: Combined Executive Summaries [Online]. Available at https://www.oecd.org/pisa/Combined_Executive_Summaries_PISA_2018.pdf (Accessed 11 July 2020).
Romero, C. and Ventura, S. (2007) ‘Educational data mining: a survey from 1995 to 2005’, Expert Systems with Applications, vol. 33, no. 1, pp. 135–146.
