Project Communication Kit
||D6.1 presents the first version of the project’s publicity material that will be used to disseminate its goals and objectives to the wider public. This material consists of the project website, poster and leaflet, as well as the project’s social media accounts. In this report we present the content that has been generated for this purpose and motivate our design and content choices.
Educational scenarios and stakeholder analysis
||The aim of this document is to present the initial requirements of a group of stakeholders (teachers, teacher trainers and school advisors) and to provide a definition of the types of educational scenarios that the authoring tool should support; an updated version of the educational scenarios will be delivered in M14. More specifically, we have defined the stakeholders involved in a learning situation built around virtual, online learning environments such as virtual labs, focusing on the teachers and learners but also considering other relevant groups in the design/development and learning process (teacher trainers and school advisors). We analyse the requirements of the different groups of stakeholders in terms of a) behavioural analytics and b) online authoring environments. The stakeholder analysis was based on a workshop with 20 participants, followed by interviews and reviews of best practices with a series of online labs (presented in this document) that EA is already employing in the framework of its offered services (lessons, labs or PD activities). Based on the stakeholder analysis, we present a series of educational scenarios that will serve as a pool for the prototype demonstrators. These scenarios are organised at three levels, referring to the complexity of the tasks assigned to the students while using the online labs. They will feed into WP5 for specifying the virtual labs to be designed and developed using the authoring environment (WP4) and will provide the test bed for evaluating how effectively the technologies developed in WP2-4 address the stakeholder requirements.
Data structure requirements for learning analytics
||The current document first presents a general compilation of behavioural data collected from virtual learning labs in the framework of the Go-Lab project, together with their analysis. The comprehensive analysis of the available data yielded several deeper qualitative and quantitative insights into user behaviour. The experience gathered from these analyses is transferred directly to the ENVISAGE project in order to guarantee that more effective, thorough and comprehensive metrics will be implemented to capture the behaviour of each user, teacher or student, during the usage of a virtual lab in science teaching and learning. In this context we discuss and propose a list of the main metrics that can be collected by the analytics service, their definition and the rationale for their use. We also propose a structure for the analytics data, and an aggregation level, that permit analysis and interpretation similar to what is conventionally done in the actual school environment. We further discuss the functional requirements that the virtual lab authoring environment of ENVISAGE should accommodate; these are presented mainly from an end-user perspective, namely that of a science teacher. Finally, we give a short overview of some basic functionalities and use cases of the Go-Lab system, in the framework of which we obtained the raw log data of user actions in its authoring environment. This dataset is now available to the partners of ENVISAGE for study and analysis, and the description of its content is given to support the application of machine learning algorithms and practices for extracting baseline and deeper information on how the environment was utilised.
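As a concrete illustration of the kind of per-event log structure and session-level aggregation discussed above, the following Python sketch shows one possible event record and grouping step. The field and action names are illustrative assumptions, not the actual ENVISAGE or Go-Lab schema.

```python
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class LabEvent:
    """Hypothetical minimal record for one logged user action in a virtual lab."""
    user_id: str      # anonymised student or teacher identifier
    session_id: str   # one lab session
    action: str       # e.g. "start_task", "complete_task", "click"
    timestamp: float  # seconds since session start

def aggregate_by_session(events):
    """Group raw events per session -- the aggregation level at which
    analysis comparable to classroom observation can be carried out."""
    sessions = defaultdict(list)
    for ev in events:
        sessions[ev.session_id].append(ev)
    # keep each session's events in chronological order
    for evs in sessions.values():
        evs.sort(key=lambda e: e.timestamp)
    return dict(sessions)
```

A session-keyed dictionary like this keeps the raw actions available for both shallow metrics and later, deeper analysis.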
Data Management Plan
||This deliverable is the Data Management Plan document, which describes the various types of data in ENVISAGE, the procedures followed to collect them and the measures that will be taken to ensure that no confidential information is leaked. The storage, archiving and preservation plan for the data is also sketched, along with our plan to comply with the Open Data Initiative.
Visualization Strategies for Course Progress Reports
||This deliverable describes the initial visualization strategies developed for the ENVISAGE project and their implementation. Starting from an overview of the state of the art in general Analytics and Game Analytics, it moves on to identify requirements and goals for learning analytics visualization, building on previous deliverables in the project. A number of visualization strategies are presented and their technical implementation is described.
Architecture and Interface Design
||Relying on the functional requirements gathered in T1.2, we define the architecture of the "Virtual labs authoring tool", which will be capable of integrating the functionalities developed in WP2 and WP3 into a web interface component able to communicate remotely with a game engine in order to produce a lab. As regards the integration with WP2 and WP3, the adopted architecture should make provision for effectively injecting the necessary metrics into the virtual lab project so as to assess user behavior during the experience of the virtual lab. The selection of the basic technologies will also be made. Another responsibility of this task is to analyze the functional requirements provided by WP1 in order to deliver an accurate interface design of the authoring tool suitable for the virtual labs. Mockups will be used to simulate the workflow between the supported functionalities and will also serve as a running exercise between the interface designers and the author-users in defining the look and feel of the authoring tool.
Dissemination plan
||The goal of D6.2 is to present ENVISAGE’s dissemination plan by specifying the dissemination objectives, target groups, directions, instruments and impact indicators. This plan elaborates on the draft dissemination plan already included in the DoA, and its goal is to conclude with a concrete list of dissemination actions and impact indicators that will allow us to assess, follow up on and, where necessary, correct the planned activities in terms of the measured impact achieved.
User profiling and behavioral modeling based on shallow analytics
||This deliverable takes the first steps towards determining how to calculate shallow analytics and presents a framework for determining which metrics should be used with a virtual lab or educational game. The framework is applied to the wind energy lab as an example of how metrics could be used in a specific game context. The metrics time-on-task, time-to-completion and travel-path were identified for the lab, and possible calculations of these are presented. The deliverable also defines the concept of learning analytics, investigates the stakeholder needs connected to learning analytics services, and discusses the ethical implications that should be reflected upon when applying them.
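To make the identified metrics concrete, the sketch below shows one possible way to compute time-to-completion and time-on-task from timestamped events. The event names and the idle-gap heuristic are assumptions for illustration, not the deliverable's actual definitions.

```python
def time_to_completion(events):
    """Seconds from the first "task_start" to the first "task_complete".
    `events` is a list of (timestamp, action) pairs for one learner."""
    start = next(t for t, a in events if a == "task_start")
    end = next(t for t, a in events if a == "task_complete")
    return end - start

def time_on_task(events, idle_threshold=30.0):
    """Sum of gaps between consecutive events, discarding gaps longer
    than `idle_threshold` seconds (assumed to be learner inactivity)."""
    times = sorted(t for t, _ in events)
    gaps = (b - a for a, b in zip(times, times[1:]))
    return sum(g for g in gaps if g <= idle_threshold)
```

The distinction between the two metrics matters: a learner may take long in wall-clock terms (time-to-completion) while spending little active time in the lab (time-on-task).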
Pilot execution plan and evaluation
||We present and discuss the pilot execution plan, along with the evaluation methodology and protocol that will be followed to conduct a series of small-scale test implementations of virtual labs, accompanied by the authoring, analytics and visualization tools of ENVISAGE. The collection of quantitative and qualitative feedback from users, both teachers and students, will be based on the methodology and protocol of conduct proposed herein. Complementary methods are to be utilized, including structured questionnaires, interviews, focus group discussions and on-site observations, according to a well-defined procedure of conduct and reporting. A schedule of activities related to piloting and evaluation during the school year 2017-2018 is also discussed.
First Version of the “Virtual Labs” authoring tool
||The “Virtual labs authoring tool” is a plugin for WordPress that allows educators to design experiments through an easy-to-manipulate graphical user interface. Using drag-and-drop functionality, educators can design an experiment in 3D space and allow the learner to play and learn. The authoring tool has an interface in a web browser (web page) and is able to generate virtual labs in a specific game engine, namely Unity3D. These games can be compiled either for web or for desktop use. They are available through a link, or they can be downloaded and installed on another server or desktop. The aforementioned procedure is achieved through game project templates that are split into pieces of code from which a new game is re-designed. These templates have the necessary metrics measurement mechanisms embedded, which monitor a learner’s behavior and communicate with the game and visual analytics components as defined in WP2 and WP3.
Technology Assembly, Integration and Validation for the implementation of the authoring tool
||This is a report on the integration protocol established between the individual components (game and visual analytics, Unity3D, web interface), as well as on the integration tests that have been undertaken to verify the operational capacity of each module. The goal of this task is the technical verification of the products of the project with respect to their expected functionality before reaching the prototype level. This task will define a group of internal beta-testers (teachers from the EA partner) for the “Virtual labs authoring tool”, ensuring that the enabling technology has reached the necessary quality level for supporting the pilots developed in WP5. In order to ensure that the developed solution meets the expected requirements, validation sessions will be put in place. These validation sessions will rely on an Assembly, Integration and Validation (AIV) plan that will be written in parallel with the development tasks. The AIV plan will specify the integration protocol, such as fixing the integration interfaces and practices between the individual technology components in the authoring tool. The goal of this protocol is to allow work on individual components to proceed independently on algorithmic refinements and optimizations, while ensuring their smooth integration with the architecture of the authoring tool. Finally, the validation tests will be put into action by the EA partner with teachers in various sciences. In this way, we will verify that the ENVISAGE technologies meet the expected performance criteria in an environment close to that of the author-user.
Implementation of the educational scenarios and evaluation report
||The aim of this document is to present the implementation of the educational scenarios and an evaluation report for the delivered components within ENVISAGE. The deliverable reports on the results obtained during the execution of the implemented educational scenarios. The evaluation process focused on the three separate elements of the project: 1) the authoring tool for building virtual labs, 2) the analytics and visualization tools for supporting the process of improving virtual labs, and 3) the developed virtual labs as a means for successfully improving the learning process for teachers and students. The same three elements will also be the subject of the second iteration required by the agile framework of the work package, in month 21, for D5.4 (second phase).
Educational scenarios and stakeholder analysis (Update)
||This document constitutes an update of Deliverable D1.1, which presented the initial requirements of a group of stakeholders (teachers, teacher trainers and school advisors) and provided a definition of the types of educational scenarios that the ENVISAGE authoring tool should support. Based on the proposed framework, the project team selected a series of virtual labs and analysed the reactions of the stakeholders during a series of real-life pilots. In total, eleven pilots were conducted for the first iteration of the ENVISAGE services: two each for the authoring and visualization tools, and seven for the virtual labs (including six pilot implementations in school environments with students). According to the findings of D5.2, stakeholders saw great potential in using the system and are hence potential users of the finished product. Improving the UI further would therefore help accommodate the users’ requests and, hopefully, attract them as end-users. Encountering difficulties in an early version of an IT system is to be expected; nevertheless, the participants expressed enjoyment at being able to create a 3D experience for their classroom. Overall, the users evaluated the labs positively. They liked the content, believed the proposed labs would give students a deeper understanding of the subject, and clearly saw the learning goals of the labs. Based on the stakeholders’ feedback, we present a series of updated educational scenarios that will serve as prototype demonstrators for the second pilot phase of the project, emphasising the added value of the ENVISAGE concept for the design of more engaging activities that promote the development of proficiency in problem-solving competence.
These scenarios present three different cases that, in our view, will offer the project team the opportunity to highlight the potential of the ENVISAGE concept: a) the enrichment of an existing, widely used virtual lab with deep analytics, b) the integration of three existing labs in a common environment, and c) the extension of an existing lab used during the first pilot phase to a more complex real-life environment where numerous activities can take place. These scenarios (along with the requirements to be presented in D1.4) will feed into WP5 for specifying the virtual labs to be designed and developed using the authoring environment (WP4) and will provide the test bed for evaluating how effectively the technologies developed in WP2-4 address the stakeholder requirements.
Data structure and functional requirements (Update)
||The current document is an update of D1.2 – Data structure and functional requirements, and as such it has a similar scope and structure. It aims to present an updated compilation of requirements and prioritized recommendations in response to the first evaluations of the authoring tool, the analytics and visualization services, and the 3D virtual labs. First, the updated list of metrics to be incorporated in the shallow and deep analytics services and the related visualization tools is presented. Then, the functional requirements of the various components (authoring tool, analytics and visualization services) are updated in light of the evaluation results from the piloting tests. Finally, the functional requirements for the combination of Chemistry labs, and for an elaborated version of a multi-stage 3D virtual lab/game based on the Wind Energy Lab, are presented.
Preliminary Predictive Analytics and Course Adaptation Methods
||We review the state of the art in game and learning analytics, and compare and integrate these findings with the insights gained from the prior deliverables in the ENVISAGE project. We identify three possible overarching approaches that can be explored in the ENVISAGE project: clustering, prediction, and simulation of students and/or their behavior. Not all of these may be attainable for ENVISAGE, but at the current stage they present themselves as options that should all be pursued and investigated. Methods for each approach are described and outlined in the deliverable, and an initial framework is implemented as an online Deep Analytics Learning Service with which the other components developed in the ENVISAGE project can interface.
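As an illustration of the clustering approach, the following minimal k-means sketch groups students by per-student behavioural feature vectors (for example time-on-task and error count). It is a toy stand-in under assumed features, not the Deep Analytics Learning Service implementation.

```python
import random

def _dist2(p, q):
    """Squared Euclidean distance between two feature vectors."""
    return sum((a - b) ** 2 for a, b in zip(p, q))

def _mean(points):
    """Component-wise mean of a non-empty list of feature vectors."""
    n = len(points)
    return tuple(sum(xs) / n for xs in zip(*points))

def kmeans(points, k, iters=20, seed=0):
    """Minimal k-means: assign each student vector to the nearest
    center, then recompute centers, for a fixed number of iterations."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda j: _dist2(p, centers[j]))
            clusters[i].append(p)
        # keep an old center if its cluster happens to be empty
        centers = [_mean(c) if c else centers[i] for i, c in enumerate(clusters)]
    return centers, clusters
```

Clusters found this way could, for instance, separate fast, confident students from those who linger or accumulate errors, without requiring any labelled data.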
Final version of the "Virtual labs authoring tool"
||Deliverable D4.4 provides an improved version of the Authoring Tool by addressing the evaluation feedback received during Phase I Pilots, integrating the outcomes of the second development cycle, as well as introducing optimizations with respect to the
quality of the generated content and the integrity of the generated labs for each platform (desktop, web).
Updated shallow analytics and visualization strategies
||This deliverable describes the final shallow analytics and visualization strategies developed for the ENVISAGE project and their implementation. Building from an overview of the state of the art in general Analytics and Game Analytics, it moves on to identify the final requirements and goals for learning analytics and their visualization. We describe the steps followed towards determining shallow and visual analytics under the ENVISAGE project. We also present a general framework for determining which metrics and which visuals should be used in conjunction with a virtual lab or educational game. The framework is then directly applied to the wind energy and chemistry labs of the project as the final demonstrators of shallow and visual analytics in those labs.
Updated predictive analytics and course adaptation methods
||The initial work on deep analytics in ENVISAGE was introduced in D3.1 with the focus on unsupervised methods and approaches used in game analytics. D3.2 now presents revised requirements and updated algorithms tailored towards educational settings. We provide an extended overview of “Educational Data Mining” and “AI in Education”, and we explain how existing approaches fit the ENVISAGE project. We proceed by presenting unsupervised and supervised learning algorithms for
deep analytics within the educational context. The work on unsupervised learning extends D3.1 and presents the clustering of students in the 2D Wind Energy Lab as an application. As examples for supervised learning, we introduce the prediction of at-risk students and proficiency levels of students. After identifying at-risk or low-performing students, the next step is to intervene and to help more students to succeed. Here, one approach is to adapt course material to better fit the students’ needs. Therefore, we present approaches for dynamic content adaptation and explain how virtual labs can be adapted to personalize learning. Before presenting our conclusion, we show examples from the ENVISAGE platform and demonstrate the current capabilities of the deep analytics components.
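To illustrate how at-risk prediction could flag students for intervention, the sketch below scores hypothetical per-student features with a logistic model. The feature names and weights are hand-set for illustration only; a trained supervised model, as described above, would learn such coefficients from labelled data.

```python
import math

# Hand-set stand-ins for coefficients a trained model would learn.
WEIGHTS = {"errors": 0.8, "idle_ratio": 1.5, "tasks_done": -1.2}
BIAS = -0.5

def at_risk_probability(features):
    """Logistic score: high error count and idle time raise the risk,
    completed tasks lower it."""
    z = BIAS + sum(WEIGHTS[k] * v for k, v in features.items())
    return 1.0 / (1.0 + math.exp(-z))

def flag_at_risk(students, threshold=0.5):
    """Return ids of students whose predicted risk exceeds the
    threshold -- candidates for intervention or adapted course material."""
    return [sid for sid, feats in students.items()
            if at_risk_probability(feats) > threshold]
```

The flagged list is exactly the hand-over point to the content-adaptation step: students above the threshold would receive adapted course material rather than the default path.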
Implementation of the educational scenarios and evaluation report (second phase)
||The aim of this document is to present the results of the implementation of the educational scenarios, conducted through a series of pilots from M14 until M22. The educational scenarios served as a testbed for the evaluation of the three ENVISAGE components. Like the first phase, the second phase also focused on evaluating: 1) the authoring tool as a means for building virtual labs, 2) the analytics and visualization tool for supporting the process of improving virtual labs, and 3) the developed virtual labs as a means for successfully improving the learning process for teachers and students. For the second-phase pilots, extra resources were allocated to user testing of the virtual labs (Wind and Chemistry labs), as this was not possible during the first phase. In addition, between two and three pilots were run for the authoring tool and the analytics tool, respectively; thus, all components of ENVISAGE were adequately and successfully tested.