From AIRWiki
Revision as of 09:11, 20 September 2010
Gestures in Videogames
Short Description: Analysis of gestures and facial expressions of people involved in playing a videogame (TORCS)
Coordinator: AndreaBonarini (andrea.bonarini@polimi.it)
Tutor: MatteoMatteucci (matteo.matteucci@polimi.it), MaurizioGarbarino (garbarino@elet.polimi.it)
Collaborator:
Students: GiorgioPrini (giorgio.prini@mail.polimi.it)
Research Area: Affective Computing
Research Topic: Affective Computing And BioSignals
Start: 2010/09/10
End: 2011/03/30
Status: Active
Level: Ms
Type: Thesis
This project, part of the Affective VideoGames research line, aims to build a model relating the facial expressions, gestures, and movements of people playing the videogame TORCS to their preferences among different game settings. The final goal is to detect, from images taken by a camera, whether players are enjoying the game experience.