
Deploying a vision of learning analytics: Activity 23 Block 4


· Read about two implementations and make notes on how the visions that underpinned these impacted on the implementation of learning analytics. You may find that the vision is not clearly defined. If this is the case, state it as clearly as you can, based on the information that you have.

The two chosen implementations are:

· Case Study 1A: The Open University Data Wranglers (Ferguson et al., 2015: 131ff.)

· Cluster 2 – Australia (Colvin et al., 2015: 19ff.)

The two publications serve quite different functions: whereas Ferguson et al. attempt to relate the efficacy of an implementation strategy to ROMA, Colvin et al. (in this general section) are much more interested in the concept of vision as a progenitor and/or emergent product of a project. Indeed, it is difficult not to see Cluster 2 as a representation of a ‘model’ form of LA implementation, with Cluster 1 as its inadequate sidekick and foil.

Case Study 1A

As the first iteration of ROMA proceeded, the goals of the project (Steps 3 & 5) became concerned with curriculum development and quality enhancement. Though apparently specified, these goals have a very high level of generality and appear to have given report writers very little guidance (Step 7) about what anyone in particular expected this implementation to do. This may suggest an absence of vision, as does the statement:

There was no integrated, systematic view being developed to inform and enhance teaching and learning practice.

The general drive appears to have been based on a supposition that a lot of unused data existed and that it might be a good idea to find out whether there was value in that data. The whole thing therefore feels ‘data-driven’ rather than driven by pedagogic purposes, other than a vague feeling that we could probably do this better.

This is apparent in the name ‘Data Wrangler’ given to operatives and in the fact that the motivation appears to have come entirely from a ‘technical context’ (the university’s ICT department).

It is unfair to expect an elaborate vision to guide pioneer studies, and it certainly appears that later iterations of ROMA have enabled the Project to begin to bring together a vision it might originally have lacked, based on agreements between senior faculty managers and the Data Wranglers about the kinds of reporting tools that might help them and on what schedule (p. 133).

The implicit learning in this Project, perhaps standing as an emergent vision that justifies the endeavour post hoc, is that data might be useful in guiding the university’s pedagogy, but not until the technical context had engaged more thoroughly with the ecology of pedagogical practices throughout the university (Step 6).

Cluster 2

Although Colvin et al. (2015: 19) carefully confine themselves to describing ‘differences’ between the two Clusters in terms of three variables (concept, readiness and implementation), it is difficult not to read between the lines a perception that, relative to Cluster 2, Cluster 1 was deficient in all three. I read these pages as a comparison between a model of a potential VISION for the introduction of learning analytics and a deficit model.

It is not a bad start to see vision as a means of balancing variables like CONCEPT, READINESS & IMPLEMENTATION since they can easily equate with, or at least parallel, other ‘holy’ intellectual and research-based trinities like THEORY, REFLECTION & ACTION.

Whereas Cluster 1 appears to have started with a vague sense that administrative ‘efficiency’ (‘reporting no or nascent strategy development’) could be improved by LA, this ‘vision’ was insufficient to bring together significant alliances or to create or deploy leaders from its senior echelons. The opposite was true (it is claimed) of Cluster 2, where ‘efficiency was typically not mentioned by Cluster 2 institutions as a driver for LA’.

Although not stated as a conclusion supported by inferential statistics, vision is, it appears, highly perceptible in alliances of stakeholders, especially pedagogic and technical ones: ‘in essence, communication, human resource and development processes and systems were in place for the commencement of implementation initiatives’ (p. 20).

Colvin, C., Rogers, T., Wade, A., Dawson, S., Gasevic, D., Buckingham Shum, S., Nelson, K., Alexander, S., Lockyer, L., Kennedy, G., Corri, L. and Fisher, J. (2015), Student Retention and Learning Analytics: A Snapshot of Australian Practices and a Framework for Advancement: Final Report 2016, Australian Government: Office for Learning and Teaching; also available online at http://he-analytics.com/ (accessed 15 June 2016).

 

Ferguson, R., Macfadyen, L.P., Clow, D., Tynan, B., Alexander, S., and Dawson, S. (2015) ‘Setting learning analytics in context: overcoming the barriers to large-scale adoption’, Journal of Learning Analytics, vol. 1, no. 3, pp. 120–44; also available online at http://oro.open.ac.uk/42115/ (accessed 15 June 2016).

