For this activity I had to select two implementations of learning analytics and apply the ROMA framework to each:
- Ferguson et al. (2015), Setting learning analytics in context – Case Study 2;
- Colvin et al. (2015), Student retention and learning analytics – Cluster 2.
Case Study 2: University of Technology Sydney, Australia (UTS)
Vision: To become a world-leading university of technology. In 2011 UTS set up a project to become a “data-intensive university” (DIU).
Their operational definition of a DIU is as follows:
A university where staff and students understand data and, regardless of its volume and diversity, can use and reuse it, store and curate it, and apply and develop the analytical tools to interpret it.
Step 1: Define a clear set of overarching Policy Objectives
To use learning analytics to improve student learning and the student experience of university, with all stakeholders having the capacity to understand and interpret data-rich environments.
Step 2: Map the context
The project was led by the senior executive of the university: the Deputy Vice-Chancellor and the Vice-President (Teaching & Learning). It initially acquired pilot funding and later secured ongoing funding, which allowed the creation of a dedicated Connected Intelligence Centre and the recruitment of a renowned learning analytics professor.
Step 3: Identify the Key Stakeholders
A working party was set up, made up of the Deputy Vice-Chancellor (Teaching & Learning), the Deputy Vice-Chancellor (Research), the Deputy Vice-Chancellor (Corporate Services), and representatives from the library service, each faculty and each administrative area.
Step 4: Identify the Learning Analytics Purpose
Learning analytics are used or will be used to:
- provide information that can be used to decrease student attrition;
- provide a more detailed understanding of the factors behind persistently high failure rates in particular subjects;
- provide students with more information about their own study and engagement patterns through a personalised dashboard;
- enable a more fine-grained understanding of the influence of a range of possible interventions on pass rates and completions;
- provide valuable input to future learning projects encompassing personalisation of learning through adaptation and intervention.
(I’ve listed these because they are clear and concise, and they fit with my vision of learning analytics and how it could be used within my institution.)
Step 5: Develop a Strategy
Ensure engagement and buy-in from key stakeholders. Invest in pilot projects, infrastructure and expertise, and provide leadership while engaging institutional leaders.
Step 6: Analyse Capacity, Develop Human Resources
Ensure that analytics stakeholders have the capacity to understand data, make judgements about its meaning and engage in evidence-based decision making. This was trialled twice with students before being made compulsory.
Step 7: Develop a Monitoring and Learning System (Evaluation)
In the early pilots, analytic techniques identified the students considered most at risk, who then received a telephone call. A “killer subjects” project, drawing on poor-performance data, identified subjects that have since been redesigned. A systematic strategic-planning approach contributed to the success of the project and to the integration of analytics into the institutional culture.
Colvin et al. (2015), Student retention and learning analytics
This paper discussed two clusters – cluster 1 (student retention) and cluster 2 (understanding teaching and learning processes).
Vision: Learning analytics will be used to support our pursuit of understanding, with the emphasis on learning and the recognition that retention is a consequence of students’ broader teaching, learning and engagement experiences.
Step 1: Develop a Clear Set of Overarching Policy Objectives
See vision. Cluster 2 institutions were also developing organisational technical readiness, extending their technology strategy beyond their data warehouse. In comparison with cluster 1, cluster 2 saw improved retention as a consequence of teaching, learning and the student experience.
Step 2: Map the Context
Cluster 2 institutions had significant senior leadership input and secured sponsorship at Vice-Chancellor and Deputy Vice-Chancellor levels.
Step 3: Identify the Key Stakeholders
Cluster 2 consisted of 17 institutions.
Step 4: Identify the Learning Analytics Purpose
The goal of learning analytics was to develop insight into improving student learning outcomes.
Step 5: Develop a Strategy
To develop organisational technical readiness through consistent communication and HR and development processes, through reflection on vendor tools and products, and by being more aware of the constraints and limitations of learning analytics.
Step 6: Analyse Capacity, Develop Human Resources
Nine of the 17 institutions had a strategic plan for building capacity, with a further seven recognising the need for one. Eleven of the 17 had extended their technology strategy to ensure they were ready to commence implementation initiatives. The development of leaders committed to learning analytics was also considered important.
Step 7: Develop a Monitoring and Learning System (Evaluation)
I’ve re-read this several times, but can only conclude that this project was concerned with the concept, readiness and implementation stages; as such, there was no evidence of a monitoring or learning system within the paper. I believe this stage follows on from implementation, but I would imagine that monitoring and evaluation methods would be built into the action plan at the outset.
I feel that aspects of each of the two selected implementations fit with my view of how learning analytics should be used. Case Study 2 fits my view more holistically; however, I think that establishing readiness for implementation by reflecting on vendor tools and products would play a major part in any pilot projects prior to rolling out learning analytics across the institution.
Colvin, C., Rogers, T., Wade, A., Dawson, S., Gasevic, D., Buckingham Shum, S., Nelson, K., Alexander, S., Lockyer, L., Kennedy, G., Corri, L. and Fisher, J. (2015), Student Retention and Learning Analytics: A Snapshot of Australian Practices and a Framework for Advancement: Final Report 2016, Australian Government: Office for Learning and Teaching; also available online at http://he-analytics.com/ (accessed 30 July 2016).
Ferguson, R., Macfadyen, L.P., Clow, D., Tynan, B., Alexander, S., and Dawson, S. (2015) ‘Setting learning analytics in context: overcoming the barriers to large-scale adoption’, Journal of Learning Analytics, vol. 1, no. 3, pp. 120–44; also available online at http://oro.open.ac.uk/42115/ (accessed 19 December 2015).