Multimodal GUI for driving an autonomous wheelchair
Title: Multimodal GUI for driving an autonomous wheelchair
Image: LURCH_wheelchair.jpg
Description: This project brings together several AIRLab projects with the aim of driving an autonomous wheelchair (LURCH, the autonomous wheelchair) through a multimodal interface (speech recognition, brain-computer interface, etc.), by developing the key software modules required. The work will be validated in live experiments.
Tutor: MatteoMatteucci (matteo.matteucci@polimi.it), BernardoDalSeno (bernardo.dalseno@polimi.it), SimoneCeriani (ceriani@elet.polimi.it), DavideMigliore (d.migliore@evidence.eu.com)
Start: 2009/10/01
Students: 1 - 2
CFU: 5 - 10
Research Area: BioSignal Analysis
Research Topic: Brain-Computer Interface
Level: Bs, Ms
Type: Course
Status: Active
Tools and instruments:
- C++, C, BCI2000, Matlab (a minimal C++ interface sketch follows this list)
- Linux
- EEG system
- LURCH wheelchair
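
The "key software modules" mentioned in the description essentially have to arbitrate commands coming from several input modalities and forward them to the wheelchair. The C++ sketch below shows one possible shape for such a layer. It is only an illustration under assumptions: the class names (InputSource, SpeechSource, BciSource, MultimodalGui), the command set, and the first-source-wins arbitration policy are hypothetical and are not taken from the LURCH code base or from BCI2000.

// Hypothetical sketch of a multimodal command layer; names and structure are
// assumptions for illustration, not the actual LURCH interfaces.
#include <iostream>
#include <memory>
#include <string>
#include <vector>

// A high-level driving command shared by all input modalities.
enum class Command { None, Forward, Backward, TurnLeft, TurnRight, Stop };

// Common interface implemented by each input modality (speech, BCI, joystick, ...).
class InputSource {
public:
    virtual ~InputSource() = default;
    virtual std::string name() const = 0;
    // Poll the modality for the latest recognized command, if any.
    virtual Command poll() = 0;
};

// Stub speech-recognition source; a real module would wrap an ASR engine.
class SpeechSource : public InputSource {
public:
    std::string name() const override { return "speech"; }
    Command poll() override { return Command::None; }  // no utterance recognized in this stub
};

// Stub BCI source; a real module would read classified EEG events (e.g. produced by BCI2000).
class BciSource : public InputSource {
public:
    std::string name() const override { return "bci"; }
    Command poll() override { return Command::Forward; }  // pretend a target was selected
};

// Simple arbiter: the first source that reports a command wins (priority order).
class MultimodalGui {
public:
    void addSource(std::unique_ptr<InputSource> src) { sources_.push_back(std::move(src)); }

    Command nextCommand() {
        for (auto& src : sources_) {
            Command c = src->poll();
            if (c != Command::None) {
                std::cout << "command from " << src->name() << "\n";
                return c;
            }
        }
        return Command::None;
    }

private:
    std::vector<std::unique_ptr<InputSource>> sources_;
};

int main() {
    MultimodalGui gui;
    gui.addSource(std::make_unique<SpeechSource>());
    gui.addSource(std::make_unique<BciSource>());

    // In the real system this loop would forward commands to the wheelchair's
    // navigation/motion module instead of printing them.
    Command c = gui.nextCommand();
    std::cout << "selected command id: " << static_cast<int>(c) << "\n";
    return 0;
}

In the actual project, the BCI source would wrap the event stream coming from the EEG system through BCI2000, and the selected command would be passed to the wheelchair's navigation module rather than printed; the arbitration policy itself is one of the design questions the project should address.
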
Bibliography:
- R. Blatt et al., Brain Control of a Smart Wheelchair [1]