H809: Activity 12.8 & 12.9 Reading 14

H809: Activity 12.8/9: Reading Paper 14 (2 hours)

Read the paper by Cox (2007). When making notes, think about the following questions.

  1. Clearly, as Cox says, 'It is relatively easy to collect voluminous amounts of process data', but what else has been added here?

Technology enhanced research (TER) allows the process to be modelled so that the best points and forms of intervention can be determined. Cox suggests a framework devised by Chi (1997) for determining the granularity of the units being analysed and then developing a coding scheme based on those units.
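
To make this concrete for myself, here is a minimal sketch (in Python, with entirely invented event names and codes) of what that might look like once a unit of analysis and a coding scheme have been chosen. This is not Cox's or Chi's actual scheme, just an illustration of the idea of mapping raw process data onto analytic codes:

```python
# A toy coding scheme: each logged event type is mapped to an analytic
# code. The event names and codes here are hypothetical; a real scheme
# would be derived from the research question, as Chi (1997) describes.
CODING_SCHEME = {
    "open_hint": "HELP_SEEKING",
    "submit_answer": "ATTEMPT",
    "revise_answer": "SELF_CORRECTION",
    "idle_timeout": "DISENGAGEMENT",
}

# Hypothetical process data: a stream of (learner, exercise, event) tuples.
process_log = [
    ("learner_1", "ex_3", "open_hint"),
    ("learner_1", "ex_3", "submit_answer"),
    ("learner_2", "ex_3", "submit_answer"),
    ("learner_2", "ex_3", "revise_answer"),
]

def code_events(log, scheme):
    """Attach an analytic code to each raw event (the chosen unit)."""
    return [(learner, exercise, event, scheme.get(event, "UNCODED"))
            for learner, exercise, event in log]

for row in code_events(process_log, CODING_SCHEME):
    print(row)
```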

  2. What do you think he means by 'triangulated' and 'multiple data source approaches'?

Triangulation is a method used by qualitative researchers to check and establish validity in their studies.

Data triangulation: a key strategy is to categorise each stakeholder group in your study and include a comparable number of people from each group.

Investigator triangulation: several investigators are used, each applying the same method; findings are compared and, if they are similar, validity is established.

Theory triangulation: multiple perspectives, e.g. from different disciplines or from different theoretical positions within a discipline.

Methodological triangulation: multiple qualitative and/or quantitative methods are used.

Environmental triangulation: different locations, settings and other key factors are varied.

[From: http://www.rayman-bacchus.net/uploads/documents/Triangulation.pdf Accessed 2nd May 2011]

I thought Cox (2007) had a slightly different interpretation of triangulation from the one I had come across before. He describes his approach as data triangulation because the data are retrieved from a variety of sources, but I would have considered this methodological triangulation, as multiple methods are required to collect the data. I suppose the terminology does not really matter: the whole point of triangulation is to validate the data by approaching them from various angles and comparing the results to see whether they are consistent.

I had a few problems reading this paper. I kept getting confused between granularity as used in this context, granularity as referred to in IEEE LOM (Learning Object Metadata), and granularity as used in data management for databases. All very similar, with enough overlap to confuse me!

My thoughts so far are as follows:

Learner performance is a very high level of granularity.
Learner performance on one exercise is a lower level of granularity.
Learner performance on one aspect of an exercise is an even lower level of granularity.

I think Cox was saying that a researcher needs to determine the lowest level of granularity required to answer the research question and then use this as the basic unit for modelling the process, drawing on data gleaned from the various collection methods.
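
Purely as a sketch of how I picture it (Python again, with invented field names, not anything from the paper itself), the same coded process data could be aggregated at each of the three granularities above; the researcher picks the finest level the research question needs and works upward from there:

```python
from collections import Counter

# Hypothetical coded events: (learner, exercise, aspect, code) tuples.
coded_events = [
    ("learner_1", "ex_3", "part_a", "ATTEMPT"),
    ("learner_1", "ex_3", "part_a", "SELF_CORRECTION"),
    ("learner_1", "ex_3", "part_b", "ATTEMPT"),
    ("learner_1", "ex_4", "part_a", "HELP_SEEKING"),
]

# Coarsest unit: overall learner performance.
by_learner = Counter(learner for learner, _, _, _ in coded_events)

# Finer unit: performance on one exercise.
by_exercise = Counter((learner, ex) for learner, ex, _, _ in coded_events)

# Finest unit here: one aspect of one exercise.
by_aspect = Counter((learner, ex, aspect)
                    for learner, ex, aspect, _ in coded_events)

print(by_learner, by_exercise, by_aspect, sep="\n")
```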

Is this what others understood from the paper?