Gestures in Videogames
| Short Description: | Analysis of gestures and facial expressions of people involved in playing a videogame (TORCS) |
| Coordinator: | AndreaBonarini (andrea.bonarini@polimi.it) |
| Tutor: | SimoneTognetti (tognetti@elet.polimi.it), MaurizioGarbarino (garbarino@elet.polimi.it) |
| Collaborator: | |
| Students: | GiorgioPrini (giorgio.prini@mail.polimi.it) |
| Research Area: | Affective Computing |
| Research Topic: | Affective Computing And BioSignals |
| Start: | 2010/09/10 |
| End: | 2011/03/30 |
| Status: | Active |
| Level: | Ms |
| Type: | Thesis |
This project, belonging to the Affective VideoGames research line, is aimed at building a model relating the facial expressions, gestures, and movements of people playing the videogame TORCS to their preferences among different game settings. The final aim is to detect, from images taken by a camera, whether players are enjoying the game experience.
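As a purely illustrative starting point (not part of the project specification), the sketch below shows how camera frames might be captured and face regions located with OpenCV in Python; the cascade file, parameters, and overall pipeline are assumptions, and any expression or gesture analysis would build on top of the cropped face regions.

import cv2

# Haar cascade bundled with OpenCV for frontal faces (an assumed, illustrative choice)
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

capture = cv2.VideoCapture(0)  # default webcam
while True:
    ok, frame = capture.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Detect candidate face regions; these crops would feed an expression classifier
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("Detected face regions", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press 'q' to quit
        break

capture.release()
cv2.destroyAllWindows()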