Multimodal GUI for driving an autonomous wheelchair

From AIRWiki
Revision as of 01:18, 16 October 2009 by MatteoMatteucci

Title: Multimodal GUI for driving an autonomous wheelchair
[Image: LURCH_wheelchair.jpg]

Description: This project brings together several AIRLab projects with the aim of driving an autonomous wheelchair (LURCH - The autonomous wheelchair) through a multimodal interface (speech recognition, brain-computer interface, etc.), by developing the key software modules. The work will be validated in live experiments.
Tutor: MatteoMatteucci (matteo.matteucci@polimi.it), BernardoDalSeno (bernardo.dalseno@polimi.it), SimoneCeriani (ceriani@elet.polimi.it), DavideMigliore (d.migliore@evidence.eu.com)
Start: 2009/10/01
Students: 1 - 2
CFU: 5 - 10
Research Area: BioSignal Analysis
Research Topic: Brain-Computer Interface
Level: Bs, Ms
Type: Course
Status: Active
Tools and instruments
C++, C, BCI2000, Matlab
Linux
EEG system
LURCH wheelchair
Bibliography
R. Blatt et al., "Brain Control of a Smart Wheelchair" [1]