Gestures in Videogames
Short Description: Analysis of gestures and facial expressions of people involved in playing a videogame (TORCS)
Coordinator: AndreaBonarini (andrea.bonarini@polimi.it)
Tutor: MatteoMatteucci (matteo.matteucci@polimi.it), MaurizioGarbarino (garbarino@elet.polimi.it)
Students: GiorgioPrini (giorgio.prini@mail.polimi.it)
Research Area: Affective Computing
Research Topic: Affective Computing And BioSignals
Start: 2010/09/10
End: 2011/03/30
Status: Closed
Level: Ms
Type: Thesis
This project, belonging to the Affective VideoGames research line, aims to build a model relating the facial expressions, gestures, and movements of people playing the videogame TORCS to their preferences among different game settings. The final aim is to detect, from images taken by a camera, whether players are enjoying the game experience.
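The page does not describe how the camera images are processed. Purely as an illustrative sketch (an assumption, not the project's actual method), a first stage of such a pipeline could localize the player's face in each webcam frame, for example with OpenCV's pre-trained Haar cascade, before passing the cropped region to an expression or gesture classifier:

# Hypothetical sketch, not the project's code: locate the player's face in
# webcam frames with OpenCV as a first step before any expression/gesture
# analysis. Requires the opencv-python package.
import cv2

# Pre-trained frontal-face Haar cascade shipped with OpenCV.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)  # default webcam pointed at the player
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Each detection is an (x, y, w, h) bounding box that a later
    # expression/gesture classifier could crop and analyse.
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("player", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
        break

cap.release()
cv2.destroyAllWindows()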