When you want to start building or applying learning analytics solutions, or are interested in doing research in the area, you should first get a good overview of what can be achieved with learning analytics in the first place. Learning analytics focuses on learning and teaching activities in digitally supported learning environments such as the learning management system Moodle. In general, such systems collect digital traces that can be analyzed. This in turn enables learners and teachers to receive feedback and status reports about their own learning or teaching track. In this sense, learning analytics can make individual key performance indicators for learning and teaching much more transparent and easier to understand.

In the long run, the main goals of learning analytics are to build better pedagogies, empower students to take a more active part in their studies, prevent at-risk students from failing a course or dropping out, and assess the factors that affect students' performance. For learners, educators, and researchers, learning analytics has already shown great examples of how it can offer insights into student progress, interaction with online courseware, and the learning environments used for teaching. If your goal is to apply data-driven learning and assessment in practice, you should start with activities such as looking for patterns in data, testing algorithms, or developing analytics models to understand what contributes to learning.

To give a few examples of empirical research studies carried out on learning analytics, the focus can be on

  • predicting learning behavior and output (e.g., identifying students at risk of dropping out or failing a course)
  • discovering structures and relationships among students (e.g., clustering similar learner types)
  • distilling data for human judgement (e.g., monitoring and visualizing learning situations)

Literature review

In order to offer a holistic understanding of the kinds of objectives that can be set for learning analytics projects, we conducted a systematic literature review that provides an overview of the objectives of applying learning analytics in education. First, we created a predefined categorization schema for identifying objectives for learning analytics from the literature. Thereafter, we assessed the objective categories and updated them during the literature review. The literature used for this classification consisted of 362 conference papers published at the “Learning Analytics & Knowledge” conference and 39 journal publications in the “Journal of Educational Measurement”.

The literature review took the following objectives into account, with the purpose of identifying how often each objective appeared in the research literature. In general, these results also give some guidance on where the focus of learning analytics research has been in recent years.

During the literature review we also identified one more inclusive objective often used in learning analytics. This more encompassing objective was to understand students' learning paths. For example, when learning paths are made transparent, it is easier to sequence learning activities or to improve a curriculum. Moreover, automatic assessment, or digitally supported assessment of students' work, was also identified as one of the learning analytics objectives, even though it was not used as a category of its own for this literature review.

Objectives

  • Monitoring student performance and reflecting current status: The objective is to track students' activities and accomplishments, for example, in order to support the current or future design of a learning activity. This type of monitoring also enables continuous evaluation of the learning process with the purpose of being able to make improvements.
  • Distillation of data for human judgement (e.g., preparing dashboards and visualizations): When applying learning analytics in practice, the first objective is normally something other than merely producing graphical visualizations of data (e.g., classifying students into certain groups or predicting students' performance), and the second objective is to visualize the results. Therefore, this objective is very often included, but it has not been the main research focus in the literature.
  • Predicting student performance: The goal is to develop a model that attempts to predict a learner's knowledge or future performance based on his or her past data about activities and accomplishments. Such a predictive model can then, for example, be used to provide proactive interventions for students who may need additional assistance, or to help suggest future actions that can improve the learner's performance (a minimal prediction sketch follows this list).
  • Grouping, classifying, and clustering students: This objective focuses on identifying and grouping students based on similar data patterns. In the same way that customer segmentation can deepen the understanding of an existing customer base, segmenting students makes it possible to find meaningful patterns among groups of students with similar characteristics, in order to better support their needs (see the clustering sketch after this list).
  • Improving the physical learning environment: This is also an objective of learning analytics, with the goal of improving the place of learning. Factors such as the size and structure of the classroom, the equipment needed for certain learning purposes, air quality, and temperature affect students' learning.
  • Discovering and understanding student-network structures (e.g., applying social network analysis): Social network analysis can, for example, help to understand relations, roles, and network characteristics among students. This is again related to finding patterns, but the focus here is on discovering network patterns that can support educators in their teaching activities, or students in their learning activities (see the network-analysis sketch after this list).
  • Skill and competency estimation: This goal is straightforward: it refers to estimating the current skills and competencies of students. This enables, for example, adjusting learning activities to suit a student's skills and competencies.
  • Detecting and understanding learners' cognitive states and behavior (e.g., tracking motions, attention, and movements with sensors or video capture): This objective refers to detecting patterns in learners' affective states, engagement, or attention levels. On a higher level, the goal is to understand in what state students actually learn best. For example, certain types of sensor or video data could indicate when attention levels have dropped, which would be a good signal for taking a break or activating the students in some other way. Analyzing this type of data often requires more computing power due to its large volume, and a large part of the gathered data is uninformative (i.e., no significant changes in the data are identifiable).
  • Implementing adaptive learning (e.g., systems that adapt according to a certain skill level): The goal here is to implement a system that automatically modifies a learner's learning path by adaptively organizing learning resources and instructional activities according to the needs or skills of the individual learner. Therefore, this could also be a follow-up objective after estimating the learner's skills and competencies. Such systems are based on a set of pre-defined rules (a rule-based sketch follows this list).
  • Predicting retention and detecting attrition risk (e.g., for intelligent tutoring, adaptation, and intervention): By analyzing students' behavior and performance, the risk of students dropping out of courses can be detected, and certain measures can be implemented to retain these students. For example, such measures can be to automatically notify students that they are at risk of dropping out or of not passing a course. Alternatively, tutors could intervene, offer support to the students, and point out ways to mitigate the risk (see the intervention-rule sketch after this list).
  • Sequencing courses and constructing courseware: This goal aims at understanding different learning paths in order to optimize or suggest relevant courses or courseware for students based on their interests or past performance. For example, if a student has finished the courses “statistics I” and “data management”, the next relevant course could be “predictive analytics”. However, “statistics II” and “Python programming” might also be relevant courses to take before enrolling in “predictive analytics”, as these provide supportive knowledge that has led students to better performance in the past. Another example would be to take an approach similar to Netflix's for sequencing courses and choosing courseware based on certain interests, automatically offering recommendations (e.g., “students interested in these topics have chosen these courses”; see the co-enrollment sketch after this list).
  • Intelligent feedback (e.g., sending reminders or suggestions for content improvement): The idea behind intelligent feedback is to provide an automatic response to a student's input in a given context based on past data, which can, for example, be used to support (self-)assessment and to improve performance. Intelligent feedback can refer to algorithms or systems developed for certain purposes or situations. For example, such algorithms or systems could be used to remind students about a certain activity or deadline, or to improve efficiency when writing an essay (a reminder-rule sketch follows this list).
  • Understanding students' learning paths
  • Assessing students' work
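
To make a few of the objectives above concrete, the sketches that follow illustrate possible starting points in Python. First, predicting student performance: a minimal sketch using scikit-learn's logistic regression. It assumes a hypothetical activity table student_activity.csv; the column names (logins, forum_posts, quiz_avg, passed) are illustrative, not taken from any particular system.

    # Minimal performance-prediction sketch; feature columns are hypothetical.
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    df = pd.read_csv("student_activity.csv")  # hypothetical LMS trace features
    X = df[["logins", "forum_posts", "quiz_avg"]]
    y = df["passed"]  # 1 = passed the course, 0 = did not pass

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=0)

    model = LogisticRegression(max_iter=1000)
    model.fit(X_train, y_train)

    # Predicted pass probabilities can drive proactive interventions.
    probs = model.predict_proba(X_test)[:, 1]
    print("AUC:", roc_auc_score(y_test, probs))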
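
For grouping and clustering students, a sketch of k-means segmentation over the same hypothetical activity features; choosing four clusters is an illustrative decision that would normally be validated, for example with silhouette scores.

    # Student segmentation sketch with k-means; k and features are assumptions.
    import pandas as pd
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    df = pd.read_csv("student_activity.csv")  # hypothetical activity features
    features = df[["logins", "forum_posts", "quiz_avg"]]

    # Standardize so no single feature dominates the distance metric.
    scaled = StandardScaler().fit_transform(features)

    kmeans = KMeans(n_clusters=4, n_init=10, random_state=0)
    df["segment"] = kmeans.fit_predict(scaled)

    # Inspect each segment's averages to interpret the learner types.
    print(df.groupby("segment")[["logins", "forum_posts", "quiz_avg"]].mean())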
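
For discovering student-network structures, a sketch using networkx on a made-up forum-reply edge list; degree and betweenness centrality hint at roles such as hubs or brokers.

    # Social network analysis sketch; the reply data is purely illustrative.
    import networkx as nx

    replies = [("alice", "bob"), ("bob", "carol"), ("carol", "alice"),
               ("dave", "alice"), ("dave", "bob")]  # (replier, replied-to)

    G = nx.DiGraph()
    G.add_edges_from(replies)

    # Central students may act as hubs; brokers bridge otherwise separate groups.
    print("In-degree centrality:", nx.in_degree_centrality(G))
    print("Betweenness centrality:", nx.betweenness_centrality(G))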
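
For adaptive learning, a toy sketch of a pre-defined rule set that maps an estimated skill level to the next learning resource; the thresholds and resource names are assumptions.

    # Rule-based adaptive step: resource selection by estimated skill level.
    def next_resource(skill_estimate: float) -> str:
        """Pick the next learning resource from a pre-defined rule set."""
        if skill_estimate < 0.4:
            return "remedial_exercises"   # assumed resource names
        elif skill_estimate < 0.7:
            return "standard_module"
        return "advanced_project"

    for skill in (0.2, 0.55, 0.9):
        print(skill, "->", next_resource(skill))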
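
For retention and attrition-risk detection, a sketch of a notification rule layered on top of a predictive model's output (such as the pass probabilities above); the threshold and message are illustrative.

    # Intervention rule on top of predicted pass probabilities (assumed input).
    AT_RISK_THRESHOLD = 0.3  # below this predicted pass probability, intervene

    def intervene(student_id: str, pass_probability: float) -> None:
        """Notify the student (and possibly a tutor) when risk is detected."""
        if pass_probability < AT_RISK_THRESHOLD:
            print(f"Notify {student_id}: at risk of not passing "
                  f"(predicted pass probability {pass_probability:.2f}).")

    for sid, p in [("s001", 0.15), ("s002", 0.82), ("s003", 0.28)]:
        intervene(sid, p)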
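
For sequencing courses, a sketch of a simple “students who took these courses also took …” recommender based on co-enrollment counts; the transcripts are made up.

    # Co-enrollment recommender sketch; the course histories are invented.
    from collections import Counter
    from itertools import combinations

    histories = [
        ["statistics I", "data management", "predictive analytics"],
        ["statistics I", "statistics II", "predictive analytics"],
        ["data management", "Python programming"],
    ]

    # Count how often each pair of courses appears in the same transcript.
    co_counts = Counter()
    for courses in histories:
        for a, b in combinations(sorted(set(courses)), 2):
            co_counts[(a, b)] += 1

    def recommend(taken):
        """Rank unseen courses by co-enrollment with the student's history."""
        scores = Counter()
        for (a, b), n in co_counts.items():
            if a in taken and b not in taken:
                scores[b] += n
            elif b in taken and a not in taken:
                scores[a] += n
        return [course for course, _ in scores.most_common()]

    print(recommend({"statistics I", "data management"}))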
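
Finally, for intelligent feedback, a toy sketch of an automatic deadline reminder: students who have been inactive shortly before a deadline get a nudge. The inactivity window and sample dates are assumptions.

    # Reminder rule sketch; thresholds and sample data are illustrative.
    from datetime import date, timedelta

    def reminders(today, deadlines, last_active):
        """Remind students who are inactive close to a deadline."""
        messages = []
        for student, deadline in deadlines.items():
            inactive = (today - last_active[student]) > timedelta(days=3)
            due_soon = (deadline - today) <= timedelta(days=2)
            if inactive and due_soon:
                messages.append(f"Reminder for {student}: assignment due {deadline}.")
        return messages

    print(reminders(date(2024, 3, 10),
                    {"alice": date(2024, 3, 11), "bob": date(2024, 4, 1)},
                    {"alice": date(2024, 3, 4), "bob": date(2024, 3, 9)}))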

Figure: Categorization schema with the frequency of the different objective categories in the learning analytics literature (includes the categorization of all 401 papers).

The categorization used for both Methods and Datasets is based on the following sources:

Baker, R., Siemens, G. (2015). Educational data mining and learning analytics. In: Sawyer, R.K. (ed.) The Cambridge Handbook of the Learning Sciences, 2nd edn, pp. 253–274. Cambridge University Press, Cambridge. http://www.columbia.edu/~rsb2162/BakerSiemensHandbook2013.pdf

Nistor, N., Derntl, M., Klamma, R. (2015). Learning analytics: Trends and issues of the empirical research of the years 2011–2014. In: EC-TEL 2015, Toledo, Spain. https://de.slideshare.net/klamma/learning-analytics-trends-and-issues-of-the-empirical-research-of-the-years-20112014

Chatti, M. A., Dyckhoff, A. L., Schroeder, U., Thüs, H. (2012). A reference model for learning analytics. International Journal of Technology Enhanced Learning, 4(5–6), 318–331.

Papamitsiou, Z. K., Economides, A. A. (2014). Learning analytics and educational data mining in practice: A systematic literature review of empirical evidence. Educational Technology & Society, 17(4), 49–64.

Sin, K., Muthu, L. (2015). Application of big data in education data mining and learning analytics – A literature review. ICTACT Journal on Soft Computing, 5(4), 1035–1049.

Greller, W., Drachsler, H. (2012). Translating learning into numbers: A generic framework for learning analytics. Educational Technology & Society, 15(3), 42–57.