Block 4 Activity 25: Evaluating Frameworks (2)

An evaluation of SNAPP, as described in Bakharia et al. (2009), against the quality indicators for learning analytics proposed by Scheffel et al. (2014).

Objectives

Awareness, reflection, motivation and behavioural changes of students and educators during the learning process (the educational aims).

Awareness: Learners and teachers reflect on contributions within a forum and/or with materials; a shared learning approach.
Reflection: Increased participation.
Motivation: Shared learning.
Behavioural Change: Social; encourages students to participate within the forum and/or course materials. Includes behavioural changes within teachers as they approach the interventions needed by individual students.

Learning Support

Support for students and teachers during the learning process (while using the LA tools).

Perceived Usefulness: Identifies user interaction within the social network.
Recommendation: Can visualise contributions made by students. Teachers can approach students to offer interventions where needed to ensure completion of study/activities. Students can identify issues with resources.
Activity Classification: Shows all user interactions within the social network forum.
Detection of students at risk: Identifies students who make little or no contribution to discussions, thus flagging possible at-risk students for interventions (see the sketch below).
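To make the "detection of students at risk" idea concrete: it comes down to counting each student's forum contributions and flagging those with few or none. The following is only a rough Python sketch of that rule, not SNAPP's own code; the post data, the enrolment list and the cut-off of one post are all invented for illustration.

from collections import Counter

# Hypothetical forum export: one (author, thread_id) pair per post.
# In practice this would come from the LMS or a SNAPP export, not be hard-coded.
posts = [
    ("alice", "t1"), ("alice", "t2"), ("bob", "t1"),
    ("carol", "t1"), ("carol", "t2"), ("carol", "t3"),
]
enrolled = {"alice", "bob", "carol", "dave"}  # dave never posted at all

AT_RISK_THRESHOLD = 1  # assumed cut-off: flag anyone with this many posts or fewer

post_counts = Counter(author for author, _ in posts)

# Students with little or no contribution are flagged for a possible intervention.
at_risk = sorted(s for s in enrolled if post_counts[s] <= AT_RISK_THRESHOLD)
print(at_risk)  # ['bob', 'dave']

A tutor would of course review such a list before acting on it; the point is only that the flagging rule itself is very simple.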

 

Learning Measures and Output

Results at the end of the learning process: output, consequence, performance and outcome. This does not deal with individual student performance (e.g. grades) but refers to the results and outcomes of the LA tool itself.

Comparability: Educators can consider whether the LA tool has worked better than a previous analytical method/tool.
Effectiveness: Did using the SNAPP data result in higher student participation within the forum? (Sketched below.)
Efficiency: Did the SNAPP analytics tool provide the required data to meet the needs of the learners, educators and institution?
Helpfulness: Was the SNAPP data helpful for users, and did it provide positive outcomes?
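The effectiveness question is, in the end, a before-and-after comparison of participation. A tiny sketch of how that could be checked once the tool has been in use for a while; the numbers are invented and posts per student is just one possible proxy for participation.

# Invented counts of forum posts per student, before and after the tutors
# started acting on the SNAPP diagrams (illustrative numbers only).
before = {"alice": 2, "bob": 0, "carol": 3, "dave": 1}
after = {"alice": 4, "bob": 2, "carol": 3, "dave": 2}

mean_before = sum(before.values()) / len(before)
mean_after = sum(after.values()) / len(after)

print(f"mean posts per student: {mean_before:.1f} -> {mean_after:.1f}")
# mean posts per student: 1.5 -> 2.8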

 

Data Aspect

Anything related to data, algorithms, transparency and privacy.

Transparency: SNAPP makes data visible for students and teachers; tool metrics are numerous (a small sketch of the underlying network idea follows this table).
Data Standards: Raw data.
Data Ownership: SNAPP data can be exported from the LMS.
Privacy: (left blank)
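To unpack what "making the data visible" means here: SNAPP-style analysis treats the forum as a social network, adding a link whenever one participant replies to another. The sketch below is my own illustration of that idea rather than anything exported from SNAPP, and the reply records are invented.

from collections import defaultdict

# Hypothetical reply records from a forum: (replier, original_poster) pairs.
replies = [
    ("alice", "bob"), ("carol", "alice"), ("bob", "alice"),
    ("carol", "bob"), ("alice", "carol"),
]

# Build a directed reply network as adjacency sets: who replied to whom.
network = defaultdict(set)
in_degree = defaultdict(int)  # how many distinct people replied to each poster
for replier, poster in replies:
    if poster not in network[replier]:
        network[replier].add(poster)
        in_degree[poster] += 1

# A simple per-person summary of the kind a network diagram would visualise.
for person in sorted(set(network) | set(in_degree)):
    print(f"{person}: replied to {len(network[person])} people, "
          f"received replies from {in_degree[person]} people")

The network diagram in SNAPP is essentially a picture of this structure, which is why it makes contribution patterns so visible to both students and teachers.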

 

Organisation Aspects

Indicators of organisational issues important to ensure acceptance and uptake of LA.

 

Availability: Costs? Does our IT support it?
Implementation: SNAPP has a simple installation process and is straightforward to use.
Training of Educational Stakeholders: Staff will require training. IT will be required to install the tool and support users.
Organisational Change: SNAPP could contribute to changes in data collection and usage and make data more accessible and up to date. It could contribute to raising standards in teaching and learning, and it has options for upgrading to meet the future needs of the organisation as technology advances.

Compare this with the evaluation of SNAPP using Cooper’s (2012) ‘A framework of characteristics for analytics’ in part one of this activity.

I think that the Quality Indicators for Learning Analytics framework would be time-consuming to apply, but it would build the reassurance of quality into your design. Cooper’s (2012) framework of characteristics for analytics makes it clearer who the beneficiary is and which underlying theories the LA supports. I would suggest adding a section within the Data Aspect to highlight the analysis subject, client and object, as I feel this would help user buy-in across all levels. Technical support could be included in the Learning Support criterion.

I would probably try this out first before suggesting further amendments. The SNAPP comparison did not fill all of the boxes in the QI framework, and this may also be the case with further comparisons of learning analytics tools.

References:

Bakharia, A., Heathcote, E. and Dawson, S. (2009) ‘Social networks adapting pedagogical practice: SNAPP’ in Atkinson, R.J. and McBeath, C. (eds) Same Places, Different Spaces, Proceedings ascilite 2009, 26th Annual ascilite International Conference, Auckland, 6–9 December 2009, Auckland, The University of Auckland, Auckland University of Technology, and Australasian Society for Computers in Learning in Tertiary Education (ascilite); also available online at http://www.ascilite.org/conference/auckland09/procs/ (accessed 31 July 2016).

Cooper, A. (2012) ‘A framework of characteristics for analytics’, CETIS Analytics Series, vol. 1, no. 7, Bolton, JISC CETIS; also available online at http://publications.cetis.ac.uk/wp-content/uploads/2012/12/A-Framework-of-Characteristics-for-Analytics-Vol1-No7.pdf (accessed 31 July 2016).

Scheffel, M., Drachsler, H., Stoyanov, S. and Specht, M. (2014) ‘Quality indicators for learning analytics’, Educational Technology & Society, vol. 17, no. 4, pp. 117–32; also available online at http://www.ifets.info/journals/17_4/8.pdf (accessed 31 July 2016).

 
