
Believable avatars for virtual reality experiences in the medical domain

Reference persons FABRIZIO LAMBERTI

External reference persons Prof. Bill Kapralos
Software Informatics Research Centre (SIRC)
University of Ontario Institute of Technology
Oshawa, Ontario, Canada
http://faculty.uoit.ca/kapralos

Research Groups GR-09 - GRAphics and INtelligent Systems - GRAINS

Description Overview

Central to many interactive digital experiences are virtual characters known as avatars: digital representations of the self that enable individuals to interact within a virtual environment. Avatars can represent either the user or interactive and non-interactive non-player characters (NPCs), which form the “inhabitants” of the virtual (simulated) environment. Avatars are becoming more prevalent not only in virtual environments, but also in everyday interactions with devices such as smartphones and smart speakers. Advances in graphics, artificial intelligence, sensing, and speech synthesis have raised the achievable level of fidelity, enabling increasingly natural interactions. A long-term goal of game designers has been to create realistic (human-like) avatars that enable the sharing of emotion within the virtual environment. With respect to simulations and training in the medical field, virtual patient (VP) avatars allow for safe and repeatable exposure to a variety of medical scenarios, giving trainees the opportunity to practice, learn, and make mistakes without real-world consequences. Avatars in medical applications provide a flexible and creative platform for delivering individual and group therapies and peer support, and offer significant potential to engage the large range of patients who require psychological support but may otherwise be unable or unwilling to take part in traditional treatments.

Despite these technological advances, many problems must still be overcome in avatar-based health interventions, particularly those related to realism in displaying and detecting emotions. For example, studies have found that users have difficulty expressing empathy towards VPs, making the simulation less effective than a real patient-therapist relationship. Incorporating affect (that is, the expression of emotions or feelings) into avatars can help overcome these problems and lead to more believable characters, motivate learning, and promote enjoyment. In the production of virtual environments, the affective computing challenge is often to ensure that avatars exhibit adequate human behaviour driven by cognitive intelligence, personality, and emotions, all of which create a sense of awareness and provide an engaging user experience. Believable characters can facilitate flow, enhance haptic interfaces, motivate learning, evoke compelling narrative, provoke competition, and facilitate enjoyment. At its core, perhaps the key aspect of communication is the recognition, understanding, and expression of emotions. Emotions are defined as a subjective response, accompanied by a physiological change, that affects behaviour and decision making; although emotion has been identified as a key design consideration in virtual environments, including games, it has been dealt with in a haphazard and superficial manner rather than as the multilayered and complex psychological phenomenon it is.

Goals

Working within an interdisciplinary team of researchers that includes experts and students from game development, computer science, engineering, and medicine, the student will devise, within the scope of the project, an avatar-based affect system that makes use of an eye tracker. The eye tracker will be used to determine areas of interest while the user performs tasks within a virtual medical environment (e.g., operating room, examination room, etc.). The avatar will be able to detect gaze areas of interest and respond accordingly. A study comparing head-mounted display orientation with eye tracking is expected to be conducted, to further understand the relevance of eye tracking in VR and its importance in the detection of affect (emotions and feelings). A minimal sketch of the intended gaze-to-response loop is given below.
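
To make the intended interaction loop more concrete, the following is a minimal sketch (written in Python for readability) of how gaze samples from an eye tracker could be mapped to areas of interest (AOIs) in the virtual medical scene and used to trigger an avatar reaction after a dwell time. All names, the spherical AOI model, and the dwell threshold are illustrative assumptions, not part of the proposal; the actual system would rely on the eye-tracking SDK and scene geometry of the chosen VR engine (e.g., Unity).

# Minimal sketch of gaze-to-area-of-interest (AOI) detection with a dwell-time
# trigger for an avatar response. All names, thresholds, and the spherical AOI
# model are illustrative assumptions.

from dataclasses import dataclass
import time

@dataclass
class AreaOfInterest:
    name: str
    center: tuple      # (x, y, z) position in the virtual scene
    radius: float      # AOI approximated as a sphere for simplicity

def gaze_hits_aoi(origin, direction, aoi):
    """Return True if the gaze ray (origin + t*direction) passes within
    aoi.radius of aoi.center, using a simple ray-sphere test."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    cx, cy, cz = aoi.center
    # Vector from the ray origin to the sphere center
    fx, fy, fz = cx - ox, cy - oy, cz - oz
    # Project it onto the (assumed normalized) gaze direction
    t = fx * dx + fy * dy + fz * dz
    if t < 0:
        return False  # AOI lies behind the viewer
    # Closest point on the ray to the AOI center
    px, py, pz = ox + t * dx, oy + t * dy, oz + t * dz
    dist_sq = (px - cx) ** 2 + (py - cy) ** 2 + (pz - cz) ** 2
    return dist_sq <= aoi.radius ** 2

class GazeDwellDetector:
    """Reports an AOI once the user's gaze has stayed on it for longer
    than `dwell_seconds` (hypothetical threshold)."""

    def __init__(self, aois, dwell_seconds=1.0):
        self.aois = aois
        self.dwell_seconds = dwell_seconds
        self._current = None
        self._since = None

    def update(self, gaze_origin, gaze_direction, now=None):
        now = now if now is not None else time.time()
        hit = next((a for a in self.aois
                    if gaze_hits_aoi(gaze_origin, gaze_direction, a)), None)
        if hit is not self._current:
            self._current, self._since = hit, now
            return None
        if hit and now - self._since >= self.dwell_seconds:
            self._since = now  # restart the dwell so the reaction is not re-fired every frame
            return hit         # the caller decides how the avatar should respond
        return None

if __name__ == "__main__":
    # Hypothetical operating-room AOIs
    aois = [AreaOfInterest("patient_face", (0.0, 1.6, 2.0), 0.3),
            AreaOfInterest("vital_signs_monitor", (1.5, 1.4, 2.5), 0.4)]
    detector = GazeDwellDetector(aois, dwell_seconds=1.0)

    # Simulated gaze samples: origin at the user's eyes, looking towards the patient
    for t in range(3):
        hit = detector.update((0.0, 1.6, 0.0), (0.0, 0.0, 1.0), now=float(t))
        if hit:
            print(f"Avatar reacts: user has been looking at '{hit.name}'")

The same loop could be driven by the head-mounted display's forward vector instead of the eye-tracker gaze ray, which is exactly the comparison the planned study would investigate.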

See also  http://grains.polito.it/work.php

Required skills Good programming skills are required. Knowledge of 3D and VR tools (e.g., Blender, Unity) is recommended, but not mandatory. Since more than one position is available, part of the work could be on the backend and concern activities not related to computer graphics.

Notes This proposal is part of a long-standing collaboration between the GRAINS group and the University of Ontario Institute of Technology (UOIT), Canada, and refers to activities that could be carried out either at Politecnico di Torino or at UOIT (in the latter case, the page on the Portale della Didattica regarding proposals for theses abroad should be checked). More than one position is therefore available.


Deadline 28/08/2020



