Can learner data help inform your teaching? It can, but there are a few things to consider when planning your learning analytics strategy.
Learning is a complex process that unfolds on timescales from milliseconds to years, as Reimann suggests. Educators can now use learning analytics to help build a bigger picture of that learning. Learners constantly produce data as they use higher education technologies and tools, and this data can shed light on their progress and completion. Educators may also find the data from these online tools helpful for evaluation.
The field of Learning Analytics (LA) has been around for a while now, but it is still an imperfect art (see Penetrating the Fog: Analytics in Learning and Education, for example), and several potential hurdles remain for academics.
Below are some issues I have encountered while evaluating learner data for a large-scale educational research project.
1. Licensing and Access
Learning management systems may keep track of student logins and interactions with the site, but not all access is equal. Can you access the data? Platforms such as Vimeo and YouTube also measure engagement. Do you or other staff need access or a license? Will another collaborator be able to provide the data, and if so, to what degree and in what timeframe?
2. Analysing Data
What do the analytics produced by a platform mean? Engagement data may not always indicate that learning has occurred (Macfadyen and Dawson, 2010). What further analysis and interpretation is needed for meaningful, pragmatic data?
3. Utility
Does the data make sense? How does it relate to what you want to evaluate? Is it just a fancy graph, or could it usefully inform course design?
4. Processing
How will you get the data out of the tool or platform? It will probably need processing and ‘cleaning’ to be useful. How will you collate it so it’s easy to interpret? A minimal sketch of this kind of tidying follows below.
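If you work in Python, for example, a short pandas sketch of cleaning and collating a hypothetical LMS activity export might look like the following. The file name and column names (student_id, timestamp) are assumptions; every platform’s export is different.

```python
import pandas as pd

# Load the raw export (file and column names are illustrative assumptions)
raw = pd.read_csv("lms_activity_export.csv")

# Basic cleaning: drop exact duplicates and rows missing a student identifier
clean = raw.drop_duplicates().dropna(subset=["student_id"]).copy()

# Parse timestamps, discarding anything that fails to parse
clean["timestamp"] = pd.to_datetime(clean["timestamp"], errors="coerce")
clean = clean.dropna(subset=["timestamp"])

# Collate: count events per student per week, a form that is easier to interpret
weekly = (
    clean.groupby(["student_id", pd.Grouper(key="timestamp", freq="W")])
    .size()
    .rename("event_count")
    .reset_index()
)
print(weekly.head())
```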
5. Ethics
Ethics and privacy matter. Formal approval processes apply in research, and ethics committees can advise. For evaluative purposes, think about ethics and student agency in the planning stages. Drachsler and Greller (2016), for example, have developed the DELICATE checklist for trusted learning analytics.
6. Teamwork
If you are lucky enough to be working in a team, who will collate the analytics? Will different parts of the process be handled by different members of the team? Understanding these issues will help with planning practical data collection strategies.
7. Duration
Maybe you want to compare data across different semesters. How will this help inform future developments? How does the different context of each semester influence how you make sense of the analytics? What duration is best for comparing data? A simple semester-to-semester comparison is sketched below.
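Continuing the hypothetical pandas example, a comparison across semesters might be sketched like this (the file names and semester labels are assumptions):

```python
import pandas as pd

# Label each export with its semester (file names and labels are assumptions)
s1 = pd.read_csv("events_semester_1.csv").assign(semester="S1")
s2 = pd.read_csv("events_semester_2.csv").assign(semester="S2")
events = pd.concat([s1, s2], ignore_index=True)

# Events per student within each semester, then the semester averages,
# so cohorts of different sizes can be compared like for like
per_student = events.groupby(["semester", "student_id"]).size()
print(per_student.groupby(level="semester").mean())
```

Averages alone won’t capture the different contexts of each semester, of course; they are only a starting point for sense-making.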
8. Triangulating data
Digital traces of student activity are worth comparing with other evaluative data, especially qualitative data. Interviews, focus groups and student surveys complement the analytics and give a more holistic understanding. A sketch of joining trace data with survey responses follows.
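As a final illustration in the same hypothetical pandas style, trace data can be joined to survey responses on a shared student identifier (the files and columns here are assumptions):

```python
import pandas as pd

# Both files and their columns are illustrative assumptions: collated
# engagement counts (e.g. from the processing sketch above) and a survey export
engagement = pd.read_csv("weekly_engagement.csv")  # student_id, event_count
survey = pd.read_csv("student_survey.csv")         # student_id, satisfaction

# Total engagement per student from the digital trace
totals = engagement.groupby("student_id")["event_count"].sum().reset_index()

# Join the digital trace to the self-reported data
combined = totals.merge(survey, on="student_id", how="inner")

# One simple triangulation: does reported satisfaction track engagement?
print(combined[["event_count", "satisfaction"]].corr())
```

Remember that a correlation like this is a prompt for further questions, not an answer in itself.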
The future
Big data research in Artificial Intelligence and Blockchain has the potential to deepen our understanding of teaching and learning. Our understanding of data will need to evolve with it.
References
Drachsler, H., & Greller, W. (2016). Privacy and analytics: It’s a DELICATE issue. A checklist for trusted learning analytics. Proceedings of the Sixth International Conference on Learning Analytics & Knowledge, 89–98. https://doi.org/10.1145/2883851.2883893
Griffiths, D. (2020). The ethical issues of learning analytics in their historical context. In D. Burgos (Ed.), Radical Solutions and Open Science (pp. 39–55). Springer Singapore. https://doi.org/10.1007/978-981-15-4276-3_3
Macfadyen, L. P., & Dawson, S. (2010). Mining LMS data to develop an “early warning system” for educators: A proof of concept. Computers & Education, 54, 588–599. https://doi.org/10.1016/j.compedu.2009.09.008
Taylor, M., & Vallis, C. (2021). The dual lens of analytics usage to inform learning: It’s all about evaluation. HERDSA Conference 2021, Brisbane Convention and Exhibition Centre, Brisbane, Australia, 7–10 July 2021.
Banner photo by Mathew Schwartz on Unsplash