Obj.1 – Identify the kinds of labs that need to be designed and implemented, and the learning parameters and services that should be personalized through analysis of the retained data logs.
Activities: The capabilities of game analytics will have to be matched against varied user needs, which call for different levels of access, flexibility, and amount and depth of information. The activities of this objective aim to identify the needs, motivations, constraints, tasks and goals of all stakeholders, which will specify the requirements and the desired services of a virtual lab.
The outcome of this process will be a series of educational scenarios that will be implemented for the developed learning systems. They will have to cover the full range of stakeholders and provide concrete requirements for the design of the virtual labs as well as the parameters of gaming mechanics to be optimized.
Obj.2 – Monitor the activities of users and model their learning behavior by deploying shallow game analytics methods.
Activities: Equip virtual labs with a tracking infrastructure to collect and aggregate data representing students’ activity, which can be used to update their profiles and model their interaction behavior. Establish groups of learners with similar profiles and gain insight into their learning style, pace, preferences, success, etc. To this end, identify the applied game analytics methods best suited to learning and transfer them to virtual labs.
Shallow game analytics have demonstrated high accuracy in games, so it is anticipated that they can be equally effective when transferred to learning. This transfer will be achieved by designating and using a set of learning-oriented metrics for assessing the collected user data and thus profiling learners. The completeness of the generated learner profiles will be examined against a range of learning scenarios occurring within diverse disciplines. Finally, ENVISAGE will examine whether the generated profiles are of sufficient quality to allow accurate prediction of the future behavior of learners (cf. Obj.3).
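As an illustration of how shallow analytics could group learners with similar profiles, the following sketch clusters learner metric vectors with a minimal k-means. The two metrics (completion rate, average session minutes) and all values are hypothetical, chosen only to show the mechanics:

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Minimal k-means over learner metric vectors (pure Python)."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        # Assign each learner to the nearest centroid (squared Euclidean distance).
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centroids[c])))
            clusters[i].append(p)
        # Recompute each centroid as the mean of its cluster.
        for i, cl in enumerate(clusters):
            if cl:
                centroids[i] = tuple(sum(dim) / len(cl) for dim in zip(*cl))
    return centroids, clusters

# Hypothetical learner profiles: (completion rate, avg. session minutes).
learners = [(0.9, 35), (0.85, 40), (0.2, 5), (0.15, 8), (0.88, 30)]
centroids, clusters = kmeans(learners, k=2)
```

In practice, the feature set would be drawn from the learning-oriented metrics designated above, and a library implementation would likely replace this pure-Python sketch; the point is that similar profiles (here, the two low-engagement learners) end up in the same group.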
Obj.3 – Enable the prediction of the future behavior of learners by deploying deep game analytics methods.
Activities: Predict the future trajectories of several key performance indicators (KPIs) measured from the users by exploiting machine learning algorithms (e.g., classification and regression) that have been used in games. This will enable the appropriate adjustment of the learning styles and techniques and facilitate proactive decision-making in a personalized or segment-based manner. For instance, predictions about the future behavior of learners could be communicated to the teacher when designing new labs, so as to avoid labs leading to undesired situations (e.g., frustration of students).
The learning scenarios defined in Obj.1 should clearly demonstrate the predictive power of the developed algorithms and the added value of adopting deep analytics methods from the gaming industry. In contrast to shallow analytics, which typically achieve very high accuracy, a success measure for deep analytics will be robustness to the rapid and unpredictable behavior changes that take place during the formative years of students.
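As a minimal illustration of KPI-trajectory prediction, the sketch below fits an ordinary least-squares trend to a hypothetical per-learner KPI series and extrapolates one step ahead. The KPI (weekly quiz score) and all numbers are illustrative assumptions, not project data; the project's actual algorithms would be more sophisticated regression/classification models:

```python
def fit_line(ys):
    """Ordinary least-squares fit of y = a*t + b over t = 0..n-1."""
    n = len(ys)
    ts = range(n)
    t_mean = sum(ts) / n
    y_mean = sum(ys) / n
    a = (sum((t - t_mean) * (y - y_mean) for t, y in zip(ts, ys))
         / sum((t - t_mean) ** 2 for t in ts))
    b = y_mean - a * t_mean
    return a, b

def predict_next(ys):
    """Extrapolate the fitted trend one time step ahead."""
    a, b = fit_line(ys)
    return a * len(ys) + b

# Hypothetical weekly quiz scores for one learner; the fitted slope is
# negative, so a proactive alert could be raised to the teacher.
scores = [80, 76, 73, 69, 66]
forecast = predict_next(scores)  # ≈ 62.3, continuing the downward trend
```

A falling forecast like this is exactly the kind of signal that could be surfaced to a teacher before frustration sets in.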
Obj.4 – Develop an authoring environment for designing and implementing virtual labs.
Activities: Taking into account the requirements gathered in Obj.1 and relying on recently matured open-source technologies (e.g., WebGL, game engines), develop an authoring environment suitable for designing and implementing virtual labs. In this direction, exploit the outcomes of projects such as DigiArt and RAGE, which develop ecosystems of game design tools, and incorporate the resulting assets into the authoring tool in order to accelerate the development process and commercialize the final product. The resulting authoring environment should integrate the shallow and deep analytics technologies developed in Obj.2 and Obj.3, respectively, in order to allow for designing improved virtual labs that fulfill the stakeholder requirements identified in Obj.1.
The authoring tool must offer a friendly graphical user interface and require minimal programming skills, avoiding low-level commands that would make it inaccessible to teachers/tutors uninitiated in technology. The work can build on the “Story-making engine” for authoring virtual games for cultural sites that CERTH is already developing for the DigiArt project. Moreover, the tool should allow for designing enticing virtual labs that make users feel comfortable, boosting immersion by fading out the difference from real labs and by offering personalized learning services. To this end, the effective integration of the analytics tools into the authoring environment will be a success measure. Finally, compatibility with the RAGE ecosystem will be an important characteristic of the developed authoring tool, facilitating its exploitation.
Obj.5 – Relying on an iterative A/B testing approach, inform teachers through a reporting system on the decision-making process for improving the design of virtual labs.
Activities: Based on both the shallow and deep analytics obtained from Obj.2 and Obj.3, pinpoint and report diverse data-driven metrics logged during learners’ engagement with the system. In this way, teachers (i.e., lab designers) using the authoring environment (Obj.4) are informed about the strong and weak points of a lab, as well as the differences between two versions of a lab, and can make proactive decisions to build new labs that fulfill users’ requirements or to improve existing ones.
The statistical significance of an A/B test will be validated using hypothesis tests along with diverse measures of chance likelihood, e.g., the t-test, the chi-squared test, etc. Since relying solely on quantitative results may prove hazardous in the decision-making process, it is important to also keep an eye on the qualitative aspects of an A/B test, such as the context of a trial (e.g., the surrounding tasks of learners), in order to avoid blind interpretation of the underlying statistics. A decrease in the churn rate, as well as an increase in learners’ engagement with a virtual lab, will be the tangible measure of the effectiveness of an iterative A/B test.
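For illustration, the sketch below computes Welch's two-sample t statistic for a hypothetical A/B trial comparing completion times under two lab versions. The data and the 1.96 large-sample threshold are illustrative assumptions; a real analysis would also compute exact p-values, degrees of freedom, and effect sizes:

```python
from statistics import mean, variance

def welch_t(a, b):
    """Welch's two-sample t statistic (does not assume equal variances)."""
    se = (variance(a) / len(a) + variance(b) / len(b)) ** 0.5
    return (mean(a) - mean(b)) / se

# Hypothetical task-completion times (minutes) under lab versions A and B.
version_a = [12.1, 14.3, 11.8, 13.5, 12.9, 14.0, 13.2, 12.5]
version_b = [10.2, 11.0, 9.8, 10.5, 11.3, 10.1, 10.9, 10.4]

t_stat = welch_t(version_a, version_b)
# Rough large-sample rule: |t| > ~1.96 suggests the difference is unlikely
# to be due to chance at the 5% level.
significant = abs(t_stat) > 1.96
```

Even when such a test fires, the qualitative context noted above (what the learners were doing around the trial) should be checked before acting on the result.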
Obj.6 – Equip virtual labs with tools that perform Dynamic Difficulty Adjustment (DDA) and semi-automatic adaptation of the learning parameters according to personal requirements of the learners.
Activities: Again based on the previous analytics measures (Obj.2 and Obj.3), provide the learning content to each specific student according to his/her personal needs, preferences and performance, with the teacher acting as the curator of the delivered content. For instance, adjust the difficulty of a sequence of questions based on the performance of the student monitored through shallow analytics. Similarly, add or remove parts of the learning content based on the predicted future performance obtained through deep analytics. Moreover, deploy personalization techniques that stimulate students’ engagement with the learning process and prevent churn by providing students with responsive assessment in the form of constructive feedback (e.g., praise, corrections, comments). Such feedback can serve as an extrinsic motivator for learners to stay in the lab, e.g., by observing their progress or comparing it with that of other learners.
The produced system must integrate the results of the analytics tools from Obj.2 and Obj.3 in order to offer personalized feedback and to assess the responses of the users, streamlining the process of designing and implementing virtual labs. Moreover, feedback should not discourage students but at the same time should be challenging enough to keep their engagement high. Since learning during children’s formative years is a very sensitive social activity that shapes key features of their character and mindset, it is important that analytics be restricted to a supportive role, so that the tutor retains the main responsibility for the provided content. For this purpose it is crucial to find the right balance between the contributions of analytics and the teacher in the learning process.
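A minimal sketch of a DDA rule of the kind described above: difficulty moves up or down with the learner's recent success rate, while teacher-set bounds keep the tutor in control of the range. The thresholds, level scale, and outcome encoding (1 = success, 0 = failure) are illustrative assumptions:

```python
def adjust_difficulty(current, recent_outcomes, low=0.4, high=0.8,
                      min_level=1, max_level=5):
    """One DDA step: raise difficulty when the recent success rate is high,
    lower it when low, keep it otherwise. min_level/max_level act as
    teacher-curated bounds on the adjustment."""
    if not recent_outcomes:
        return current
    rate = sum(recent_outcomes) / len(recent_outcomes)
    if rate > high:
        current += 1
    elif rate < low:
        current -= 1
    return max(min_level, min(max_level, current))

level_up   = adjust_difficulty(3, [1, 1, 1, 1, 1])  # all correct -> 4
level_down = adjust_difficulty(3, [0, 0, 1, 0, 0])  # 20% correct -> 2
level_cap  = adjust_difficulty(5, [1, 1, 1, 1, 1])  # clamped at teacher's max -> 5
```

Keeping the bounds (and, in a fuller design, the thresholds) as teacher-set parameters is one concrete way to realize the supportive-role constraint stated above.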
Obj.7 – Evaluate the ENVISAGE technologies and the produced virtual labs through small-scale pilots.
Activities: Conduct a number of small-scale pilots in order to evaluate the ENVISAGE technologies as well as the produced virtual labs, thus facilitating their improvement. More specifically, based on several realistic educational scenarios (Obj.1), employ the authoring tool along with the data and visual analytics tools to build a set of virtual labs. Following an evaluation protocol, provide the produced labs to a number of subjects (i.e., students) to run, and, based on the results of this process, evaluate the labs with respect to their usefulness and effectiveness in satisfying the educational goals of the teachers, as well as their ability to adapt to the personal needs of the students. In parallel, evaluate the usability of the authoring tool in building the above virtual labs, as well as the support offered by the analytics tools towards the improvement of these labs.
Detailed evaluation reports will be the measurable outcome of this objective, assessing all the involved modules and the produced virtual labs in terms of effectiveness, efficiency, friendliness, usability, etc. The evaluation of the project’s end product will be based on the several scenarios defined in Obj.1 and will focus on the benefit to our use-case partner, the extent to which the developed technologies can be adopted by educational organizations external to the project, and the sustainability of the ENVISAGE product after the end of the project.