Multimodal GUI for driving an autonomous wheelchair

From AIRWiki
Latest revision as of 16:45, 28 April 2011

Title: Multimodal GUI for driving an autonomous wheelchair
Image: LURCH_wheelchair.jpg

Description: This project brings together several Airlab projects with the aim of driving an autonomous wheelchair (LURCH - The autonomous wheelchair) through a multimodal interface (Speech Recognition, Brain-Computer Interface, etc.), by developing the key software modules. The work will be validated with live experiments.
Tutor: MatteoMatteucci (matteo.matteucci@polimi.it), SimoneCeriani (ceriani@elet.polimi.it), DavideMigliore (d.migliore@evidence.eu.com)
Start: 2009/10/01
Students: 1 - 2
CFU: 5 - 10
Research Area: BioSignal Analysis
Research Topic: Brain-Computer Interface
Level: Bs, Ms
Type: Course
Status: Closed
Tools and instruments
C++, C, BCI2000, Matlab
Linux
EEG system
Lurch wheelchair
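The multimodal interface described above has to merge commands coming from heterogeneous channels (speech recognizer, BCI classifier, etc.) into a single wheelchair command. A minimal sketch of such an arbitration step is shown below in C++, one of the project's listed tools. All names here (Command, Candidate, arbitrate) are illustrative assumptions, not part of the actual LURCH code base.

```cpp
#include <vector>

// Each input channel produces a candidate wheelchair command together
// with a confidence score; the arbiter forwards the highest-confidence
// candidate above a threshold, and otherwise falls back to Stop.
enum class Command { Stop, Forward, Backward, Left, Right };

struct Candidate {
    Command cmd;
    double confidence;  // 0.0 .. 1.0, as reported by the channel
};

Command arbitrate(const std::vector<Candidate>& inputs,
                  double threshold = 0.6) {
    Command best = Command::Stop;  // fail-safe default: stop the wheelchair
    double bestConf = threshold;   // candidates must beat the threshold
    for (const auto& c : inputs) {
        if (c.confidence > bestConf) {
            bestConf = c.confidence;
            best = c.cmd;
        }
    }
    return best;
}
```

The fail-safe default matters for a wheelchair: when no channel is confident enough (for instance, a noisy BCI classification), the safe behavior is to stop rather than to act on a low-confidence guess.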
Bibliography
R. Blatt et al., "Brain Control of a Smart Wheelchair" [1]