Description of work
T1.1 – Stakeholder analysis and educational scenarios (EA) [M1-M14]
Define the stakeholders involved in learning situations that take place in virtual, online learning environments such as virtual labs, focusing on teachers and learners but also considering other groups relevant to the design/development and learning process. Analyze the requirements of both groups of stakeholders in terms of a) behavioral analytics and b) online authoring environments. The stakeholder analysis will be based on interviews, direct observation and reviews of best practices, drawing on the wide range of such learning environments that EA already employs and on EA's participation in international projects on virtual learning spaces. Based on the stakeholder analysis, provide the educational scenarios that will be implemented as the prototype demonstrators. These scenarios will feed into WP5 by specifying an equal number of virtual labs to be designed and developed using the authoring environment (WP4), and will provide the test bed for evaluating how effectively the technologies developed in WP2-4 address the stakeholder requirements.
T1.2 – Data structure requirements for learning analytics (EA, GIO) [M1-M14]
Based on the learning environments in use at EA (notably Open Discovery Space, Inspiring Science Education and Go-Lab), as well as on current international projects, the behaviors and metrics currently captured through telemetry will be investigated and defined, providing an overview of the kinds of behavior (of both learners and teachers) that can be captured. Both behaviors that can be captured across online virtual labs in general and behaviors specific to certain kinds of labs will be considered. In addition, drawing on experience from game analytics, a structure for event-based tracking will be defined in which the individual events can be fully customized. Jointly, these provide the baseline for the learning analytics package (WP2), and specifically for Task 2.1, which focuses on translating the tracking points into GIO's back end. Guided by best practices in game analytics, it will be investigated which of the identified behaviors, and potentially additional ones, are useful to capture in order to support the four key game analytics technologies to be transferred to learning analytics: profiling, prediction, A/B testing and retention, thus offering the ability to test the design of online learning situations in a virtual lab.
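To illustrate the kind of event-based tracking structure described above, the following sketch shows one possible shape for a fully customizable telemetry event. The field names (event_type, user_id, session_id, payload) are illustrative assumptions for this document, not the project's final schema, which will be defined in this task.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Any, Dict

@dataclass
class TrackingEvent:
    """A single customizable telemetry event, in the style of game analytics."""
    event_type: str   # e.g. "lab_started", "experiment_reset" (hypothetical names)
    user_id: str      # learner or teacher identifier
    session_id: str   # groups events belonging to one lab session
    timestamp: str    # ISO-8601 UTC timestamp
    payload: Dict[str, Any] = field(default_factory=dict)  # event-specific data

def make_event(event_type: str, user_id: str, session_id: str, **payload) -> TrackingEvent:
    """Create an event stamped with the current UTC time; the payload is free-form,
    so each virtual lab can attach its own lab-specific fields."""
    return TrackingEvent(
        event_type=event_type,
        user_id=user_id,
        session_id=session_id,
        timestamp=datetime.now(timezone.utc).isoformat(),
        payload=payload,
    )

# A generic event shared by all labs, plus a lab-specific one:
e1 = make_event("lab_started", user_id="u42", session_id="s1", lab="chemistry")
e2 = make_event("titration_step", user_id="u42", session_id="s1", volume_ml=12.5)
```

Because generic events (shared across labs) and lab-specific events differ only in their payload, the same back-end pipeline can ingest both, which is what makes the structure suitable for profiling, prediction, A/B testing and retention analyses alike.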
In parallel with the specification of the data structure requirements, this task will collect and pre-process a set of behavioral telemetry data from current virtual labs and make it available to the project, so as to allow the immediate initiation of the analytics-related activities in WP2 and WP3.
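A minimal pre-processing step of the kind this task envisages might look as follows: parsing raw log rows, grouping them per user in chronological order, and deriving a simple activity metric usable as a retention input. The row format and event names here are illustrative assumptions, not EA's actual export format.

```python
from collections import defaultdict
from datetime import datetime

# Raw log rows as they might be exported from an existing virtual lab
# (field names and events are hypothetical).
raw_rows = [
    {"user": "u1", "ts": "2017-01-10T09:00:00", "event": "lab_started"},
    {"user": "u1", "ts": "2017-01-10T09:05:00", "event": "experiment_run"},
    {"user": "u2", "ts": "2017-01-10T10:00:00", "event": "lab_started"},
    {"user": "u1", "ts": "2017-01-11T09:00:00", "event": "lab_started"},
]

def preprocess(rows):
    """Parse timestamps and group events per user, sorted chronologically."""
    per_user = defaultdict(list)
    for row in rows:
        per_user[row["user"]].append(
            {"ts": datetime.fromisoformat(row["ts"]), "event": row["event"]}
        )
    for events in per_user.values():
        events.sort(key=lambda e: e["ts"])
    return dict(per_user)

def active_days(per_user):
    """Count distinct active days per user -- one simple retention input."""
    return {u: len({e["ts"].date() for e in events}) for u, events in per_user.items()}

per_user = preprocess(raw_rows)
print(active_days(per_user))  # u1 was active on 2 distinct days, u2 on 1
```

Pre-processed, per-user event streams of this shape are the form in which the collected telemetry could be handed to the analytics activities in WP2 and WP3.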
T1.3 – Functional requirements in virtual learning spaces (scenario authoring environment) (EA, CERTH) [M1-M14]
This task will investigate current authoring tools in games and education in order to develop the requirements and specifications for the authoring tool itself, covering functionality, design and interface. The requirements should enable the production of the types of educational scenarios defined in T1.1, and form the initial basis for the development and implementation of the authoring tool in WP4. Moreover, this task will investigate how visualization is currently used in learning analytics and game analytics, in order to define best practices for presenting insights to the different stakeholders (learning analytics package). This early work will be continued and implemented in T2.3.
EA will lead the activities of this work package, providing the stakeholder analysis along with the educational scenarios. Moreover, EA will be responsible for providing ENVISAGE with behavioral data already logged from existing virtual labs, available from the very first day of the project.
GIO and CERTH will be responsible for investigating the requirements of virtual labs in terms of the data structures and functionalities involved in the process of improving virtual labs.