PrjCFUMax
|
20 +
|
PrjCFUMin
|
10 +
|
PrjDescription
|
Simultaneous Localization and Mapping (SLAM) is one of the basic functionalities required of an autonomous robot. In the past we have developed a framework for building SLAM algorithms based on the Extended Kalman Filter and vision sensors. A recently available vision sensor with tremendous potential for autonomous robots is the Microsoft Kinect RGB-D sensor. The thesis aims at integrating the Kinect sensor into this framework in order to develop a point-cloud-based SLAM system (a minimal acquisition sketch is given after the lists below).
'''Material:'''
*Kinect sensor and libraries
*A framework for multisensor SLAM
*PCL 2.0 library for dealing with point clouds
'''Expected outcome:'''
*Algorithm able to build 3D point cloud representation of the observed scene
*Point cloud processing could also be used to improve the accuracy of the filter
'''Required skills or skills to be acquired:'''
*Basic background in computer vision
*Basic background in Kalman filtering
*C++ programming under Linux
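As a rough starting point, the sketch below shows how a live stream of point clouds could be grabbed from the Kinect through PCL's OpenNI grabber interface. The class and callback names follow the standard PCL grabber tutorials and are an assumption here, not the actual entry point of the multisensor SLAM framework; in the thesis the callback would hand the clouds to the filter instead of printing their size.
<source lang="cpp">
#include <pcl/io/openni_grabber.h>
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <boost/bind.hpp>
#include <boost/thread/thread.hpp>
#include <iostream>

// Minimal Kinect acquisition sketch (assumed setup, not the framework's code):
// print the size of each organized RGB-D cloud delivered by the sensor.
class KinectGrabberSketch
{
public:
  void cloud_cb (const pcl::PointCloud<pcl::PointXYZRGBA>::ConstPtr &cloud)
  {
    // In the actual thesis work the cloud would be passed to the SLAM filter here.
    std::cout << "Received cloud with " << cloud->points.size () << " points" << std::endl;
  }

  void run ()
  {
    // OpenNIGrabber wraps the Kinect driver and streams point clouds via callbacks.
    pcl::Grabber *interface = new pcl::OpenNIGrabber ();

    boost::function<void (const pcl::PointCloud<pcl::PointXYZRGBA>::ConstPtr&)> f =
      boost::bind (&KinectGrabberSketch::cloud_cb, this, _1);

    interface->registerCallback (f);
    interface->start ();

    // Keep acquiring until the program is interrupted.
    while (true)
      boost::this_thread::sleep (boost::posix_time::seconds (1));

    interface->stop ();
  }
};

int main ()
{
  KinectGrabberSketch grabber;
  grabber.run ();
  return 0;
}
</source>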
|
PrjImage
|
Image:PointCloudKinect.jpg +
|
PrjLevel
|
Master of Science +
|
PrjResArea
|
Computer Vision and Image Analysis +
|
PrjResTopic
|
None +
|
PrjStarts
|
1 January 2015 +
|
PrjStatus
|
Active +
|
PrjStudMax
|
2 +
|
PrjStudMin
|
1 +
|
PrjTitle
|
Point cloud SLAM with Microsoft Kinect +
|
PrjTutor
|
User:MatteoMatteucci +
|
PrjType
|
Thesis +
|
Categories |
ProjectProposal +
|
Modification date
|
31 December 2014 15:32:32 +
|