Learning Analytics

What is learning analytics?

Learning Analytics refers to the specific adaptation of social analytics tools to enhance teaching and learning.

Learning analytics is an emergent area of interest in educational technology ventures.
The Horizon Report (2011) forecasts learning analytics as a long-term trend that will become prevalent in education within four to five years. EDUCAUSE and the Bill and Melinda Gates Foundation have targeted learning analytics as one of five categories for funding initiatives under their $20 million (USD) Next Generation Learning Initiative. A primary objective for EDUCAUSE and the Gates Foundation in promoting learning analytics is its potential to increase high school completion rates (Kolowich 2010).

Online learners continually leave digital footprints in their virtual learning environments, particularly with the increasing adoption of course/learning management systems (LMSs). At its most basic level, learning analytics involves using a web analytics program, such as Google Analytics, to track students’ usage of their LMS and other digital learning objects as one way to gauge learner engagement. These analytics are built into some learning management systems, including Vista, which is used at UBC. Educators can use this data to:
  • make real-time decisions about how they might modify a course to better suit learners.
  • identify potential ‘at-risk’ students who may need an intervention to avoid failing a course module or an entire course.
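As a minimal sketch of the second use, an instructor might flag students whose recent LMS activity falls below a baseline. The data, function name, and thresholds below are all hypothetical illustrations, not any particular LMS's actual method:

```python
# Hypothetical weekly LMS activity counts (logins + page views) per student.
activity = {
    "alice": [14, 12, 15, 13],
    "bob":   [9, 4, 2, 0],      # declining engagement
    "carol": [11, 10, 12, 11],
}

def flag_at_risk(activity, window=2, threshold=5):
    """Flag students whose average activity over the last `window`
    weeks falls below `threshold` events per week."""
    at_risk = []
    for student, counts in activity.items():
        recent = counts[-window:]
        if sum(recent) / len(recent) < threshold:
            at_risk.append(student)
    return at_risk

print(flag_at_risk(activity))  # only "bob" falls below the threshold
```

In practice the threshold would be set relative to the course as a whole, and a flag would prompt a human follow-up rather than an automatic judgment.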


At a macro level, administrators at the school and district levels use learning analytics to gauge students’ performance and to compare how schools are performing vis-à-vis each other. Learning analytics programs such as Almalogic (a Vancouver-based venture that UBC works with) and Global Scholar's Pinnacle Insight also help administrators plan interventions if a school is perceived to “underperform” in one or more areas. This is especially relevant, and controversial, in the US in the wake of the passage of the No Child Left Behind Act (2002), which measures schools’ overall performance through students’ performance on state standardized tests.


A core concern is whether using learning analytics as a primary measure of a school’s performance focuses too heavily on quantifiable performance indicators, particularly test scores, at the expense of other forms of performance assessment.

Just as newer social media analytics programs gauge an organization’s activities and influence within social media, learning analytics is increasingly turning to the quantitative analysis of social networks in digital learning environments. SNAPP (Social Networks Adapting Pedagogical Practice), developed at the University of Wollongong in Australia, is one university-based learning analytics program that analyses the social networks that form within learning management systems. SNAPP records not only which students participate in the LMS, and how frequently, but also which students respond to which other students’ comments and posts. This allows it to identify emerging leaders, whose posts are frequent and elicit much discussion, and outliers, who contribute little. SNAPP also provides visualizations of these social networks to instructors and course administrators.
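The kind of analysis described above can be sketched as building a directed reply graph and counting replies sent and received. This is a simplified illustration with made-up data, not SNAPP's actual implementation:

```python
# Hypothetical forum reply records: (responder, original poster).
replies = [
    ("bob", "alice"), ("carol", "alice"), ("dan", "alice"),
    ("alice", "bob"), ("alice", "carol"), ("bob", "carol"),
]

def degree_counts(replies):
    """Count replies sent (out-degree) and replies received (in-degree)
    for each student in the discussion network."""
    out_deg, in_deg = {}, {}
    for responder, poster in replies:
        out_deg[responder] = out_deg.get(responder, 0) + 1
        in_deg[poster] = in_deg.get(poster, 0) + 1
    return out_deg, in_deg

out_deg, in_deg = degree_counts(replies)

# Students whose posts draw many replies look like emerging leaders;
# enrolled students with no edges at all are potential outliers.
students = {"alice", "bob", "carol", "dan", "erin"}
leaders = [s for s in students if in_deg.get(s, 0) >= 3]
outliers = [s for s in students if out_deg.get(s, 0) + in_deg.get(s, 0) == 0]
```

Here "alice" emerges as a leader (three students reply to her posts) while "erin" is an outlier with no recorded participation; a tool like SNAPP would render the same graph visually rather than as counts.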


http://research.uow.edu.au/learningnetworks/seeing/snapp/index.html

Continue to Learning Analytics and Personalized Learning