PetDrone
Short Description: The goal of the project is to develop a drone able to interact with people like a pet in some respects.
Coordinator: AndreaBonarini (andrea.bonarini@polimi.it)
Tutor: AndreaBonarini (andrea.bonarini@polimi.it)
Collaborator:
Students: ZhizhongLi (zhizhong.li@mail.polimi.it)
Research Area: Robotics
Research Topic: Robot development
Start: 4/04/2013
End: 14/06/2013
Status: Closed
Level: Bs

Goal

The goal of this project is to develop a robotic pet, i.e., to make the drone behave like a pet in some respects.

Motivation

Create an interactive robot based on the AR.Drone.

Useful readings

ROS

  • ardrone_autonomy driver: http://ros.org/wiki/ardrone_autonomy
  • GUI controller, drone controller, and drone state estimation (tum_ardrone): http://www.ros.org/wiki/tum_ardrone
  • Interfacing ROS with OpenCV (vision_opencv): http://www.ros.org/wiki/vision_opencv (see the camera sketch below)
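
A minimal sketch of the ROS/OpenCV glue, assuming the ardrone_autonomy driver is running and publishing the front camera on /ardrone/front/image_raw; the node name and callback are illustrative, and cv_bridge's imgmsg_to_cv2 is used for the conversion.

#!/usr/bin/env python
# Sketch: read the AR.Drone front camera in ROS and convert it to an OpenCV image.
# Assumes ardrone_autonomy publishes sensor_msgs/Image on /ardrone/front/image_raw.
import rospy
import cv2
from sensor_msgs.msg import Image
from cv_bridge import CvBridge

bridge = CvBridge()

def image_callback(msg):
    # Convert the ROS image message to a BGR OpenCV image.
    frame = bridge.imgmsg_to_cv2(msg, desired_encoding="bgr8")
    # ... object detection (feature matching or TLD) would run on 'frame' here ...
    cv2.imshow("ardrone front camera", frame)
    cv2.waitKey(1)

if __name__ == "__main__":
    rospy.init_node("petdrone_camera_viewer")  # illustrative node name
    rospy.Subscriber("/ardrone/front/image_raw", Image, image_callback)
    rospy.spin()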

Object Detection

  • OpenCV feature detection (SIFT/SURF): http://opencv.willowgarage.com/documentation/cpp/feature_detection.html (a matching sketch follows this list)
  • OpenCV object detection tutorials: http://docs.opencv.org/doc/tutorials/objdetect/table_of_content_objdetect/table_of_content_objdetect.html#table-of-content-objdetect
  • Predator algorithm, TLD (Tracking-Learning-Detection): http://personal.ee.surrey.ac.uk/Personal/Z.Kalal/tld.html
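
To give a concrete flavor of the feature-based approach above, here is a minimal OpenCV matching sketch. It uses ORB as a freely available stand-in for SIFT/SURF (which sit in OpenCV's non-free module); the image file names and thresholds are placeholders, not values from the project.

# Sketch: match a learned object image against a camera frame with OpenCV features.
# ORB stands in for SIFT/SURF; file names and thresholds are placeholders.
import cv2

object_img = cv2.imread("shoe_template.png", cv2.IMREAD_GRAYSCALE)  # learned object
frame = cv2.imread("camera_frame.png", cv2.IMREAD_GRAYSCALE)        # current camera frame

orb = cv2.ORB_create()
kp_obj, des_obj = orb.detectAndCompute(object_img, None)
kp_frame, des_frame = orb.detectAndCompute(frame, None)

# Brute-force Hamming matcher suits ORB's binary descriptors.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des_obj, des_frame), key=lambda m: m.distance)

# Crude decision rule: enough close matches means "object seen".
good = [m for m in matches if m.distance < 50]
print("object detected" if len(good) > 15 else "object not detected")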

Final result

Based on OpenCV, ROS, and the TLD algorithm, the AR.Drone can detect multiple objects and respond to each with a corresponding behavior, like a pet. In detail, the drone has learned three objects (a shoe, the AIRLab trademark, and a pizza box) and reacts as follows (a rough wiring sketch is given after the list):

  • When the drone sees the shoe, it takes off.
  • When the drone sees the AIRLab trademark, it dances.
  • When the drone sees the pizza box, it lands.
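
The following sketch shows how such an object-to-behavior mapping can be wired up with ardrone_autonomy, whose standard topics are /ardrone/takeoff and /ardrone/land (std_msgs/Empty) and cmd_vel (geometry_msgs/Twist); detect_object() is a placeholder for the TLD/OpenCV detector, and the labels and node name are illustrative.

#!/usr/bin/env python
# Sketch: map detected objects to drone behaviors via ardrone_autonomy's topics.
import rospy
from std_msgs.msg import Empty
from geometry_msgs.msg import Twist

def detect_object():
    # Placeholder: hook in the TLD/OpenCV detector here and return
    # "shoe", "airlab_trademark", "pizza_box", or None.
    return None

def act_on(label, takeoff_pub, land_pub, vel_pub):
    if label == "shoe":
        takeoff_pub.publish(Empty())  # shoe -> take off
    elif label == "airlab_trademark":
        spin = Twist()
        spin.angular.z = 1.0          # trademark -> "dance" (spin in place)
        vel_pub.publish(spin)
    elif label == "pizza_box":
        land_pub.publish(Empty())     # pizza box -> land

if __name__ == "__main__":
    rospy.init_node("petdrone_behaviors")  # illustrative node name
    takeoff_pub = rospy.Publisher("/ardrone/takeoff", Empty, queue_size=1)
    land_pub = rospy.Publisher("/ardrone/land", Empty, queue_size=1)
    vel_pub = rospy.Publisher("cmd_vel", Twist, queue_size=1)
    rate = rospy.Rate(10)
    while not rospy.is_shutdown():
        act_on(detect_object(), takeoff_pub, land_pub, vel_pub)
        rate.sleep()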

A video is available on YouTube: http://youtu.be/u5opWIsHY-Q