
Henry James Robinson

Overcoming the barriers to learning analytics implementation

Edited by Henry James Robinson, Sunday, 26 Jul 2020, 06:11


Image source: Jisc, 'Learning analytics going live'


The following is my reaction to an H817 activity in which we were asked to create a short report on implementing a policy of learning analytics to improve teaching and learning. The request was for help developing a plan for rolling out learning analytics, either across the institution or across one section of it. My response is an elucidation of the institutional strategy involved in implementing a change of this sort, which departments would be involved and/or affected, and what changes might be required to help ensure success. The stated purpose was to provide a basis for the claim "learning and teaching at this institution are supported by learning analytics", which the university could place on its website for promotional purposes.

The following is my response to your request, based on my knowledge of implementing technological innovations of this kind, the accompanying institutional change policies that have been found necessary in such cases, and current TEL research.

Overall approach or strategy
The first point to note is that institutional change, whether within one department or across the whole institution, still requires looking at the whole workings of that institution. This is because, as Scanlon et al. (2013, p. 28) argue, "TEL should be considered as… made up of a series of interconnected elements that cannot be changed in isolation…centred on a vision of educational change." The management's proposal, that it should be able to claim on its website that 'learning and teaching at this institution are supported by learning analytics', is not on that basis a viable vision, as it is not one of educational change. The same authors propose that "TEL innovation should be…design-based research" (p. 28). A more appropriate vision statement would reflect a research-driven approach that considers the full context in which the innovation takes place. Given that, the vision statement would recognise the growing evidence that learning analytics can be a basis for improved quality and sustainability in education, and the need to involve all stakeholders for its potential benefit to be realised.


Which barriers will need to be overcome?
Some of the biggest barriers to TEL innovation, and more specifically to learning analytics, have been identified by, for example, Macfadyen and Dawson (2012), Mosadeghrad and Ansarian (2014) and Ferguson et al. (2014). Ferguson et al. identify one of the challenges as complexity: the need to work with learners, educators, administrators and support staff in ways that suit "the practices of those groups, their understandings of how teaching and learning take place, the technologies they use and the specific environments within which they operate". Ferguson et al. note how this requires "explicit and careful consideration during the process of implementation, in order to avoid failure and maximise the chances of success". Mosadeghrad and Ansarian (2014) are in accord with this viewpoint. They cite poor management, deficient leadership, and lack of strategic planning (p. 193).

The cultural barriers identified in Macfadyen and Dawson (2012) can be categorised as education's resistance to innovation and change, the tendency of educational organizations to add resources rather than strategically allocate them, and the tendency to assume learner homogeneity rather than to explore diversity. Mosadeghrad and Ansarian (2014) identify "strategic barriers, structural barriers, human resources barriers, contextual barriers, and procedural barriers" (p. 193). These include resistance to change that impinges on faculty autonomy, especially if it is perceived to derive from a "cost-consciousness-and-efficiency" agenda, as well as unwillingness to accept the extra workload of learning to use complex new tools.

Not exclusive of any of the above cultural, operational and strategic barriers, I also wish to add the psychological domain. Freud's theories have been applied to the workplace. In particular, his theory of psychogenic disturbance of vision could be applied to sentiments like 'It could negatively affect me'; 'Does it fit with how I do things now?'; 'Does it help me?'; 'It will make me look bad'; and 'I get nothing out of this', which from personal experience I can attribute to reactions to forthcoming or proposed institutional changes.


References:
Ferguson, R., Clow, D., Macfadyen, L., Essa, A., Dawson, S. and Alexander, S. (2014) 'Setting learning analytics in context: overcoming the barriers to large-scale adoption', in Proceedings of the Fourth International Conference on Learning Analytics and Knowledge, pp. 251–253 [Online]. Available at: https://dl.acm.org/doi/abs/10.1145/2567574.2567592 (Accessed 23 July 2020).
Freud, Sigmund. (1910). The psychoanalytic view of psychogenic disturbance of vision. SE, 11: 209-218.
Macfadyen, L.P. and Dawson, S. (2012) ‘Numbers are not enough. Why e-learning analytics failed to inform an institutional strategic plan’, Educational Technology & Society, vol. 15, no. 3, pp. 149–63 [Online]. Available at: https://www.open.ac.uk/libraryservices/resource/article:106516&f=28635 (Accessed 19 July 2020).
Mosadeghrad, A. and Ansarian, M. (2014) 'Why do organisational change programmes fail?', International Journal of Strategic Change Management, vol. 5, p. 189, doi: 10.1504/IJSCM.2014.064460 [Online]. Available at: https://www.researchgate.net/publication/275590169_Why_do_organisational_change_programmes_fail/citation/download (Accessed 19 July 2020).

Henry James Robinson

Review of 'Current State and Future Trends: A Citation Network Analysis of the Learning Analytics Field'

Edited by Henry James Robinson, Tuesday, 21 Jul 2020, 18:00


We were asked to review this article in order to expand our understanding of social learning analytics. Rather than examining interactions by students in an online learning environment such as an institutional LMS, however, or another commonly analyzed data trail (start-up, demographic, disciplinary, course and log-in data), we focussed on research network analytics of some of the leaders in the field: a very different subject to what we had been looking at previously. Another unusual thing was the way we approached the reading. We began by looking at the abstract of the paper, Dawson et al. (2014), 'Current state and future trends: a citation network analysis of the learning analytics field', and noted its aims. That was the end of the conventional approach to reading, as we were then asked to skip to the section of the paper that listed its practical implications (Section 4.3). These were that the analysis ...

  • provides an understanding of how key papers, thematics, and authors influencing a field emerge 

  • raises awareness about the structure and attributes of knowledge in a discipline and the development of curriculum in the growing number of academic programs that include learning analytics as a topic  

  • promotes under-represented groups and research methods to the learning analytics community 

  • fosters the development of empirical work and decreased reliance on founding, overview and conceptual papers  

  • improves connections to sister organizations such as the International Educational Data Mining Society 

(Dawson et al 2014, 238)

The interesting thing about the paper was its crossover: it is at once an example of learning analytics and a paper about learning analytics.

Some of its figures and tables were, thankfully, understandable even for a non-expert. Table 1 identifies the ten most-cited papers in the field. Interestingly, the numbers of citations within the learning analytics literature ranged between 10 and 16, whilst the Google Scholar citation counts differed vastly. This can be attributed to the equal currency placed on both old and new publications among specialist members of the field, compared with the much wider range of fields and interests among the Google Scholar audience.
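As an illustration of the kind of analysis behind Table 1, a paper's within-field citation count can be read as its in-degree in a citation network. The sketch below uses invented paper IDs and edges purely for illustration, not Dawson et al.'s actual data:

```python
from collections import Counter

# Hypothetical citation edges: (citing_paper, cited_paper). The IDs are
# invented; Dawson et al. built their network from real LA publications.
citations = [
    ("paper_c", "paper_a"), ("paper_d", "paper_a"), ("paper_e", "paper_a"),
    ("paper_d", "paper_b"), ("paper_e", "paper_b"),
    ("paper_e", "paper_c"),
]

# A paper's within-field citation count is its number of incoming edges.
in_degree = Counter(cited for _citing, cited in citations)
most_cited = in_degree.most_common()
print(most_cited)  # [('paper_a', 3), ('paper_b', 2), ('paper_c', 1)]
```

Ranking by in-degree inside the field is exactly what makes the specialist counts diverge from Google Scholar's, which counts citations from every discipline.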

In sum, the article is a reminder of how much more complex the learning analytics landscape is than simply a means to improve teaching and learning. In this case, it was used to aid a complex understanding of how research gains prominence. A systemic and integrated response is required for the approach to do justice to its subject. As the authors note: 'while it is helpful to note that (more active) students...perform better than their less active peers, this information is not suitable for developing a focused response to poor-performing students' (p. 231).

A more in-depth reading of the article would certainly have made the basis of that point much clearer. However, what I did gain from reading a paper in this way was an impression I could take with me to other readings, and to my general knowledge of the breadth of the learning analytics field.

Dawson, S., Gašević, D., Siemens, G. and Joksimovic, S. (2014) 'Current state and future trends: a citation network analysis of the learning analytics field', in Proceedings of the Fourth International Conference on Learning Analytics and Knowledge, pp. 231–240 (Accessed 20 July 2020).

Henry James Robinson

Student and Teacher data to improve learning

Edited by Henry James Robinson, Saturday, 11 Jul 2020, 17:23


image source: jisc.ac.uk

My last two blog posts tried to summarise my views on, and knowledge of, the ways big data is used in learning and teaching (where it can be referred to as educational learning analytics) and the ethical practices related to it.

The following tries to highlight the relatedness, but also the differences, between learning analytics (LA) that is learner/teacher focussed and LA that instead informs the wider body of educators, such as managers, administrators, the institution, government and other funding bodies. It also looks at the reasons why LA emerged in the 2000s.

Reasons for the emergence

Campbell et al. (2007) and Norris et al. (2009) examine the need for analytics from the viewpoint of the USA, where, at the time of writing, they suggest that levels of degree-qualified workers were falling overall, as was the quality of learning. In the UK in 2020, 'nearly three-quarters of the country's universities slipped down the rankings in the UK's worst-ever performance in the table compiled by data and research group QS' (Guardian, 2020). Studies had long since shown a steady decline in higher education standards in the UK since the 90s (Cameron and Smart, 1998). Analytics offers a way of 'responding to internal and external pressures for accountability in higher education (and) improved learning outcomes and student success' (Campbell et al., 2007). It should be noted, however, that since the turn of the century the UK has made big strides in Maths, Science and Reading, and currently stands in the top 20 in these areas in the OECD (2018) PISA standings, which only refer to school learning outcomes.

The US has also seen a sharp improvement in school results over the same period, and this could in part be put down to the increase in, and improvement of, learning analytics initiatives and theory.

Used to benefit educators

Institutions are of course very interested in student performance because it reflects on the institution, its popularity, its funding and staff jobs. They are trying to reach certain external and internal key indicators. For those reasons, they are also interested in things that are not about the individual: increasing student numbers, administrative and academic productivity, cost-cutting for profitability, and the data related to these, a focus known as academic analytics. They are interested in general account analytics, which allows them to see what students and teachers are doing within the account. Activity by date allows an administrator to view student participation in Assignments, Modules, Discussions, and teachers' completion of Grades, Files, Collaborations, Announcements, Groups, Conferences, etc. In general, one can view how users are interacting with the courses in the term. This means the content can be adapted to improve efficiency, productivity, and adherence to policy and practice, based on deductions and predictions made from the use of content.
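To make the 'activity by date' idea concrete, here is a minimal sketch of how such a view could be computed from raw event logs. The log format and records are invented for illustration, not taken from any particular LMS:

```python
from collections import defaultdict

# Hypothetical LMS event log: (date, role, activity) records. Real platforms
# expose similar data through their admin-facing "activity by date" views.
events = [
    ("2020-07-06", "student", "Assignments"),
    ("2020-07-06", "student", "Discussions"),
    ("2020-07-06", "teacher", "Grades"),
    ("2020-07-07", "student", "Assignments"),
    ("2020-07-07", "teacher", "Announcements"),
]

# Count how many actions of each kind occurred on each day.
activity_by_date = defaultdict(lambda: defaultdict(int))
for date, _role, activity in events:
    activity_by_date[date][activity] += 1

print(dict(activity_by_date["2020-07-06"]))
# {'Assignments': 1, 'Discussions': 1, 'Grades': 1}
```

The same aggregation, split by role or course, is what lets an administrator see whether students are engaging with Assignments and whether teachers are keeping Grades up to date.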

Used to benefit students

I categorize all of the above (under benefitting the educator) as also benefitting the learner; it would reach the learner via tutorials the teacher has with them, and via reports, if any. However, the analytics that the student is likely to actually access mainly include only their grades and records of their assignments. Students can use these to track and assess their progress; this way, they can see where they need to improve as they go along. Teachers should use educational data mining (EDM), the 'analysis of logs of student-computer interaction' (Ferguson, 2012), to improve learning and teaching. Romero and Ventura (2007, cited in Ferguson, 2012) identified the goal of EDM as ‘turning learners into effective better learners’ by evaluating the learning process, preferably alongside and in collaboration with learners. These data are often only available if the teacher makes them viewable.
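As a small illustration of how a student (or teacher) might turn gradebook records into a progress signal, here is a sketch using invented scores; real LMS dashboards compute richer versions of this:

```python
# Hypothetical gradebook records a student might see: (assignment number, score %).
scores = [(1, 62), (2, 58), (3, 71), (4, 75), (5, 80)]

# A crude progress signal: compare the average score of the earliest
# assignments with the average of the most recent ones.
half = len(scores) // 2
early_avg = sum(s for _, s in scores[:half]) / half
late_avg = sum(s for _, s in scores[-half:]) / half
trend = late_avg - early_avg
print(f"early {early_avg:.1f}%, recent {late_avg:.1f}%, change {trend:+.1f}")
# early 60.0%, recent 77.5%, change +17.5
```

Even something this simple shows a learner where they started, where they are now, and whether the gap between the two is closing.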

In my situation, the LMS we use is compatible with various apps such as Turnitin (a plagiarism checker). Students also have access to its analytics, which are used at the drafting phase of the writing course, and at the end of the course only if students query their report-writing score and plagiarism has some relevance.

Challenges

Some of the educational challenges in the environment I work in include adapting to online as opposed to f2f teaching. One way to sum up the challenge is that it is completely different because the contact and communication have technology running through them. One of the challenges involved in implementing learning analytics is mistrust of how data will be used. Students I work with sometimes avoid leaving digital traces for fear of them ending up being a means of covertly 'assessing' them. This is why their focus is on final tests, where they are fully aware of what performance data will be collected and how it will be used. 'Therefore, it is necessary to make the goals of the LA initiative transparent, clarifying exactly what is going to happen with the information and explicitly' (Leitner et al., 2019, p. 5). Researchers also point to privacy and ethical issues.

Recommendations 

1. Teachers should get training in the use of data analytics in the classroom.

2. There should be an open dialogue about what learners' rights to their own learning analytics should be; what learning analytics should be available to them; and how to give them access (including training in accessing their own data).

3. Analytics is too management-centred (data mining and academic analytics), and it is often not shared with teachers. Learning analytics needs to be much more centred on the classroom and on the teacher/student relationship.

I think the adoption of these recommendations could improve engagement, ownership and motivation, and also improve learning directly.

References

Cameron, K. and Smart, J. (1998) 'Maintaining effectiveness amid downsizing and decline in institutions of higher education', Research in Higher Education, vol. 39, no. 1, pp. 65–86.

Campbell, J.P., DeBlois, P.B. and Oblinger, D.G. (2007) ‘Academic analytics: a new tool for a new era’, Educause Review, vol. 42, no. 4, pp. 40–57 [Online]. Available at http://www.educause.edu/ ero/ article/ academic-analytics-new-tool-new-era (Accessed 11 July 2020).
Ferguson, R. (2012) ‘Learning analytics: drivers, developments and challenges’, International Journal of Technology Enhanced Learning (IJTEL), vol. 4, nos. 5/6, pp. 304–17 [Online]. Available at http://oro.open.ac.uk/ 36374/ (Accessed 11 July 2020).
Norris, D., Baer, L. and Offerman, M. (2009) ‘A national agenda for action analytics’, paper presented at the National Symposium on Action Analytics, 21–23 September 2009, St Paul, Minnesota, USA [Online]. Available at http://lindabaer.efoliomn.com/ uploads/ settinganationalagendaforactionanalytics101509.pdf (Accessed 11 July 2020).
OECD (2018) PISA 2018 Results: Combined Executive Summaries [Online]. Available at: https://www.oecd.org/pisa/Combined_Executive_Summaries_PISA_2018.pdf (Accessed 11 July 2020).
Romero, C. and Ventura, S. (2007) ‘Educational data mining: a survey from 1995 to 2005’, Expert Systems with Applications, vol. 33, no. 1, pp. 135–146.

Henry James Robinson

Learning analytics for teachers

Edited by Henry James Robinson, Sunday, 2 Aug 2020, 10:42

Learning Analytics for Teachers
image: by pxfuel

Learning analytics in learning and teaching

Huh?
For your average overworked, underpaid member of the teaching staff, who isn't a statistics buff (but ought by now to be realizing the huge part computers can play, and have played, in our careers, whether in the foreground or background), learning analytics is something you may hope stays in the background: something for the middle and senior academic staff and administrators, all the way up to education authorities and grants bodies, to be involved in - not you.
Think again. Learning analytics just hit me with a sock (with a cold bar of soap in it)! It is time to wake up to the fact that COVID-19 has changed things. In an era when you don't have the expressions of your students' faces, or the way your co-ordinator stirs his coffee while he remotely observes your class, to go on, every advantage that a new laptop and a TEL course can give you is worth it. And education has been screaming about learning analytics for ages. It's time to retool if you are going to keep your finger on the pulse of your teaching effectiveness, the satisfaction levels of your clients, and your fast-becoming-obsolete career. I know very little about LA and the different forms of data analytics relevant to education out there, so we need a definition. First, however, we probably need to acknowledge how it is becoming such a prominent force in education and why teachers need to get a handle on it.


Why is it?
The Internet, and every type of technological device in our everyday lives, has become a constant, explicit and quite unstoppable transmitter of our every action, thought and utterance, because we have taken it for granted and implicitly accepted that our personal information is open for use by commercial, governmental and other entities unless, in most cases, we purposely and systematically opt out. All digital devices leave a trail, a digital footprint. Education, however, is one of the few areas where that open data can have the least malign effect on our lives. Few people doubt the value of data that is used to enhance our educational experiences, though it is wise to remember that in the current global economic environment, education institutions are much more commercialized. Nevertheless, for the teacher, 'these learner-produced data trails can provide ...valuable insight into the learning process' (Long and Siemens, 2011). Teachers who find ways to get access to those trails, and gain permission to use them to make improvements, are at the very least impressing on learners and all other stakeholders their investment in the idea of improving educational outcomes.


What is it?
Now for some definitions. As recently as 2010, non-specialists in data analytics were going onto various sites and platforms such as Wikipedia with simple definitions of learning analytics, such as 'the use of data and models to predict student progress and performance, and the ability to act on that information' (Blackall, 2010), which sounds a bit naive. What if they do not know how to 'act on it', never mind analyze it? Does it stop being learning analytics? A criticism of this definition posited by Siemens (2010) was its implied limitation to extrapolating trends. Could it also be used to transform learning outside of the 'box'? Of course, this was when learning analytics in education was still not being utilized to anywhere near its potential, which changed as software, hardware and theories developed.


The current Wikipedia definition, one which stands back from any assumed use of analytics and the analytical capabilities of its collectors, is: 'Learning analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs.' 
Since 2010, theorists have separated areas of education data analysis into several areas: educational data mining, academic analytics and learning analytics. All of these areas utilize similar or the same data - just for different purposes and/or from different perspectives.


The area of most concern for the teacher is learning analytics because it focuses on the learning process and the relationship between the learner and educator, what is being learned, and the learner's perception of the institution in which they learn.
Adopting the teacher's perspective, then, I have simply adapted the learner-focussed definitions I have read into the following:


Learning analytics is digital data research, aimed at enhancing the student learning experience, looked at from both the teaching and learning effectiveness perspectives.

Long and Siemens (2011, p. 36) put forward the following cycle, which is useful for focusing on the levels of learning that go on in that three-way relationship:


1. Course-level: learning trails, social network analysis, discourse analysis
2. Educational data mining: predictive modelling, clustering, pattern mining
3. Intelligent curriculum: the development of semantically defined curricular resources
4. Adaptive content: adaptive sequence of content based on learner behaviour, recommender systems
5. Adaptive learning: the adaptive learning process (social interactions, learning activity, learner support, not only content).


Whatever tool is used for learning, a data trail is left, and many tools (e.g. learning management systems, virtual learning environments, and communication and collaborative software) have analytics facilities built in for use by their owners, but these are seldom exploited.


Back to why in teaching/learning
Learning analytics is only really in the early stages of implementation and experimentation by teachers and learners. There are concerns about its use by teachers: at what age can learners give consent? What about issues of privacy, the teacher's undue influence on students' decisions concerning their privacy and profiling, data security and (yes) the ability of the teacher to exploit in a positive way the potential value of the data without succumbing to 'deterministic modelling'? There are dangers, but whatever happened to the classroom as a place to experiment and carry out action research? Should big data be the sole remit of business managers concerned with profitability, long-term financial sustainability and meeting so-called performance benchmarks? When used for those purposes, learning analytics often becomes just a means of predicting market behaviours and where economies can be made. Learning is multi-dimensional, and that is why data has to be utilized at every level and sphere of the education process, where different types of data, not simply behavioural, are examined. Sharing that information with learners, so they are able to gain another form of feedback on their performance and a comparison of different learning methods, can be highly motivational and improve teacher and learner efficacy.


References

1st International Conference on Learning Analytics and Knowledge, Banff, Alberta, February 27–March 1, 2011. [Online]. Available at: https://dl.acm.org/doi/proceedings/10.1145/2090116 (Accessed 05 July 2020).

pxfuel (2020) 'person typing'  [Online]. Available at: https://www.pxfuel.com/en/free-photo-jrtyv (Accessed 5 July, 2020).

Siemens, G. and Long, P. (2011) 'Penetrating the fog: analytics in learning and education', EDUCAUSE Review, vol. 46, no. 5, p. 30 [Online]. Available at: https://eric.ed.gov/?id=EJ950794 (Accessed 5 July 2020).

