Description of work

T2.1 – Tracking infrastructure for user data collection and aggregation (GIO) [M3-M21]

This task is dedicated to:

  • Compilation of a list of tracking points in virtual labs
  • Integration of tracking components on the client side
  • Extension of the existing infrastructure to handle learning analytics data

Initially, current practices in behavior modeling of virtual lab users have to be analyzed in order to identify suitable tracking points. These tracking points typically depend on the particular lab; however, best practices can be worked out to speed up the integration process. Once these tracking points have been defined, GIO's service needs to be integrated into the different learning apps or virtual labs so that a data stream to the backend is enabled. On the backend, GIO's existing infrastructure has to be prepared to collect the user data from the learning apps. Typically, a raw stream of event data is stored and, in a pre-processing step, aggregated at the user level. After aggregation, this data is stored in an appropriate database that supports simple as well as complex queries. Depending on the tracking points, the data processing pipeline will be adapted to meet the requirements of learning analytics within ENVISAGE.
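The Python sketch below illustrates the kind of raw event stream and user-level aggregation step described above. The event fields, event names and aggregation logic are purely illustrative assumptions and do not describe GIO's actual service or API.

    # Minimal sketch of a raw event stream and its user-level aggregation.
    # All field names, event types and the aggregation logic are assumptions.
    from collections import defaultdict
    from datetime import datetime, timezone

    # Raw events as they might arrive from a virtual lab client.
    raw_events = [
        {"user_id": "u1", "event": "session_start", "ts": "2017-03-01T10:00:00Z"},
        {"user_id": "u1", "event": "experiment_completed", "ts": "2017-03-01T10:12:30Z"},
        {"user_id": "u1", "event": "session_end", "ts": "2017-03-01T10:15:00Z"},
        {"user_id": "u2", "event": "session_start", "ts": "2017-03-01T11:00:00Z"},
        {"user_id": "u2", "event": "session_end", "ts": "2017-03-01T11:05:00Z"},
    ]

    def parse_ts(ts: str) -> datetime:
        """Parse an ISO-8601 timestamp with a trailing 'Z'."""
        return datetime.strptime(ts, "%Y-%m-%dT%H:%M:%SZ").replace(tzinfo=timezone.utc)

    def aggregate_per_user(events):
        """Pre-processing step: collapse the raw event stream into one record per user."""
        per_user = defaultdict(lambda: {"sessions": 0, "events": 0, "time_spent_s": 0.0})
        session_start = {}
        for e in sorted(events, key=lambda e: e["ts"]):
            user = per_user[e["user_id"]]
            user["events"] += 1
            if e["event"] == "session_start":
                user["sessions"] += 1
                session_start[e["user_id"]] = parse_ts(e["ts"])
            elif e["event"] == "session_end" and e["user_id"] in session_start:
                delta = parse_ts(e["ts"]) - session_start.pop(e["user_id"])
                user["time_spent_s"] += delta.total_seconds()
        return dict(per_user)

    if __name__ == "__main__":
        # The aggregated records would then be written to a database supporting
        # both simple look-ups and more complex analytical queries.
        for user_id, stats in aggregate_per_user(raw_events).items():
            print(user_id, stats)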


T2.2 – Learning analytics from usage data (AAU, GIO) [M4-M16]

The task is dedicated to:

  • Accessing the collected data
  • Enriching the data with shallow statistics

Utilizing data points that can be easily collected in virtual lab software, T2.2 will identify methods for deriving learning-based analytics from the data established in T2.1. The analysis will mostly be quantitative, deriving descriptive statistics of learning outcomes and performance. The starting point is to adapt known metrics from the field of game analytics, for example session counts or time spent in the virtual lab. Such simple count statistics can also be computed for groups of users, e.g. by taking the mean, and these shallow statistics already give an overview of which users are more or less engaged in a learning task. More sophisticated game analytics also employ, for example, heatmaps. If such data can be obtained in virtual labs, a spatial analysis of user behavior may also yield interesting insights that can inform the design of future virtual labs.
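As an illustration, the following Python sketch computes such shallow statistics, i.e. per-user session counts and time spent, a group mean, and a coarse spatial heatmap, over assumed user-level aggregates; all field names, values and the grid layout are hypothetical.

    # Illustrative sketch of the "shallow" statistics described above. The input
    # reuses the user-level aggregates sketched in T2.1; all names are assumptions.
    from statistics import mean

    user_aggregates = {
        "u1": {"sessions": 5, "time_spent_s": 5400.0},
        "u2": {"sessions": 1, "time_spent_s": 300.0},
        "u3": {"sessions": 3, "time_spent_s": 2700.0},
    }

    def group_summary(aggregates):
        """Descriptive statistics over a group of learners."""
        return {
            "n_users": len(aggregates),
            "mean_sessions": mean(a["sessions"] for a in aggregates.values()),
            "mean_time_spent_s": mean(a["time_spent_s"] for a in aggregates.values()),
        }

    def engagement_ranking(aggregates):
        """Rank learners by time spent as a crude engagement indicator."""
        return sorted(aggregates, key=lambda u: aggregates[u]["time_spent_s"], reverse=True)

    def position_heatmap(positions, bins=4, extent=10.0):
        """Bin (x, y) positions in [0, extent) x [0, extent) into a coarse grid."""
        grid = [[0] * bins for _ in range(bins)]
        for x, y in positions:
            row = min(int(y / extent * bins), bins - 1)
            col = min(int(x / extent * bins), bins - 1)
            grid[row][col] += 1
        return grid

    if __name__ == "__main__":
        print(group_summary(user_aggregates))
        print(engagement_ranking(user_aggregates))
        print(position_heatmap([(1.0, 1.0), (1.2, 0.8), (9.5, 9.5)]))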


T2.3 – Data visualization for course progress reports (UoM, GIO) [M4-M16]

The purposes of the shallow metrics collected under T2.1 and analyzed as descriptive statistics under T2.2 are threefold: (a) for the management of an educational facility to gauge the successes and failures of past virtual labs and take strategic steps towards improving the virtual learning environment and its use, (b) for the educator to monitor in-progress courses delivered through virtual labs and identify problems in past virtual lab use through historical data, and (c) for the learners to track their own learning progress and compare it with that of the class or their friends. All of these purposes require that data is presented in a concise, easy-to-understand way that nevertheless prompts reflection and the recognition of problematic patterns (e.g. in at-risk learners). Task 2.3 will tackle the issue of presenting the raw data and learning analytics to these different stakeholders (management, educators, learners).

In Task 2.3, current trends in visualizing data as feedback to development teams (e.g. reports of A/B tests) as well as to end users (e.g. public websites which aggregate gameplay data and show current trends) will be transferred to the field of education and, in particular, to the visualization of learning analytics. The task will primarily identify methods used in the game industry that are most appropriate for the stakeholders of educational facilities, especially educators. By integrating the specifications from T1.3 regarding learning outcomes and educators' methodologies, methods for creating reports which are intuitive, responsive (e.g. for real-time monitoring), and show relevant information will be devised to assist the educator both during the course of a virtual lab and when designing new virtual labs. Moreover, visualizations of learners' progress will be designed, developed and tested as a feedback mechanism and a source of intrinsic motivation for learners. Finally, concise reports (with aggregated statistics and visualizations) will be created to give the management of educational facilities a high-level overview of the state of their virtual labs.
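For illustration only, the following Python sketch renders a minimal text-based progress report of the kind an educator or manager might consult; the completion figures, the at-risk threshold and the output format are assumptions, and actual reports would use proper charts within a dashboard.

    # Minimal, text-based sketch of a learner progress report with an at-risk flag.
    # The completion data and the 0-100% scale are illustrative assumptions.
    learner_progress = {
        "Learner A": 0.90,   # fraction of lab assignments completed
        "Learner B": 0.45,
        "Learner C": 0.10,
    }
    AT_RISK_THRESHOLD = 0.25  # assumed cut-off for flagging at-risk learners

    def render_report(progress, width=20):
        """Render one bar per learner, flagging those below the at-risk threshold."""
        lines = []
        for name, frac in sorted(progress.items(), key=lambda kv: kv[1], reverse=True):
            bar = "#" * round(frac * width)
            flag = "  <-- at risk" if frac < AT_RISK_THRESHOLD else ""
            lines.append(f"{name:<10} [{bar:<{width}}] {frac:4.0%}{flag}")
        class_mean = sum(progress.values()) / len(progress)
        lines.append(f"{'Class mean':<10} {class_mean:4.0%}")
        return "\n".join(lines)

    if __name__ == "__main__":
        print(render_report(learner_progress))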


Partners’ roles

GIO leads the activities in this work package and is responsible for collecting and aggregating the data, as well as for identifying the appropriate learning-based metrics in order to extend the current analytics infrastructure towards the learning domain.

AAU will perform shallow analytics on the obtained data, which will allow for profiling and modeling the behavior of the learners.

UoM will be responsible for the effective visualization of the shallow analytics results.