Face analysis in Videogames
Short Description: Analysis of the face of people involved in videogames aimed at identifying the elements that could be used as cues for interest and engagement.
Coordinator: AndreaBonarini (andrea.bonarini@polimi.it)
Tutor: AndreaBonarini (andrea.bonarini@polimi.it)
Collaborator:
Students: DavideTosetti (davide.tosetti@mail.polimi.it)
Research Area: Affective Computing
Research Topic: Affective Computing And BioSignals
Start: 2012/04/15
End: 2012/09/30
Status: Active
Level: Bs
Type: Thesis

This project, belonging to the Affective VideoGames research line, is aimed at building a model relating the facial expressions and head movements of people playing the videogame TORCS to their preferences among different game settings. The final aim is to detect, from images taken by a camera, whether people are enjoying the game experience.
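
The project page does not specify tools or methods, but a typical starting point for this kind of analysis is locating the player's face in webcam frames before any expression or head-movement features are extracted. The following is a minimal sketch under the assumption that OpenCV and its bundled Haar cascades are used; the cascade file and camera index are illustrative choices, not part of the original project.

 # Minimal sketch (assumption: OpenCV with its bundled Haar cascades).
 # Grabs webcam frames and locates the player's face, the first step before
 # any expression or head-movement analysis described above.
 import cv2
 
 # Frontal-face Haar cascade shipped with OpenCV.
 cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
 face_cascade = cv2.CascadeClassifier(cascade_path)
 
 capture = cv2.VideoCapture(0)  # default webcam; index 0 is an assumption
 
 while True:
     ok, frame = capture.read()
     if not ok:
         break
     gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
     # The bounding box position over time gives a rough cue for head movement;
     # the cropped face region could feed an expression classifier.
     faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
     for (x, y, w, h) in faces:
         cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
     cv2.imshow("player face", frame)
     if cv2.waitKey(1) & 0xFF == ord("q"):
         break
 
 capture.release()
 cv2.destroyAllWindows()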