Latest revision as of 13:40, 10 July 2013
PetDrone
| |
Short Description: | The goal of the project is to develop a drone able to interact with people like a pet in some respects. |
Coordinator: | AndreaBonarini (andrea.bonarini@polimi.it) |
Tutor: | AndreaBonarini (andrea.bonarini@polimi.it) |
Collaborator: | |
Students: | ZhizhongLi (zhizhong.li@mail.polimi.it) |
Research Area: | Robotics |
Research Topic: | Robot development |
Start: | 4/04/2013 |
End: | 14/06/2013 |
Status: | Closed |
Level: | Bs |
Goal
The goal of this project is to develop a robotic pet, i.e., to make the drone behave like a pet in some respects.
Motivation
Create an interactive robot based on the AR.Drone.
Useful readings
ROS
- ardrone driver [1]
- gui controller, drone controller, and drone state estimation [2]
- Interfacing ROS with OpenCV [3]
Object Detection
- OpenCV, feature detection (SIFT/SURF) [4]
- OpenCV, object detection [5]
- Predator algorithm, TLD (Tracking-Learning-Detection) [6]
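The TLD-based detection referenced above reduces, per camera frame, to deciding which learned object (if any) is visible with enough confidence. A minimal sketch of that selection step, assuming each TLD tracker reports a confidence score in [0, 1] (the function name and threshold value are illustrative, not taken from the project):

```python
def best_detection(scores, threshold=0.6):
    """Pick the learned object with the highest confidence,
    or return None if no tracker is confident enough.

    scores: dict mapping object name -> confidence in [0, 1],
    one entry per TLD tracker running on the current frame.
    """
    if not scores:
        return None
    name, confidence = max(scores.items(), key=lambda kv: kv[1])
    return name if confidence >= threshold else None
```

In the real system this decision would be fed by the per-object TLD trackers running on the AR.Drone's camera stream; only the selection logic is shown here.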
Final result
Based on OpenCV, ROS, and the TLD algorithm, the AR.Drone can detect multiple objects and respond with the corresponding behaviors, like a pet. A detailed description follows.
Having learned three objects (a shoe, the AIRLab trademark, and a pizza box):
When the drone sees the shoe, it will take off.
When the drone sees the AIRLab trademark, it will dance.
When the drone sees the pizza box, it will land.
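The object-to-behavior mapping above can be sketched as a small dispatcher. The `/ardrone/takeoff` and `/ardrone/land` topic names follow the standard ardrone_autonomy ROS driver convention and are an assumption about this project's implementation; the "dance" behavior is left abstract, since the source does not specify how it was realized:

```python
# Hypothetical mapping from a detected object to a drone action.
# In a real ROS node, the topic-named actions would be published
# as std_msgs/Empty messages to the ardrone_autonomy driver.
BEHAVIORS = {
    "shoe": "/ardrone/takeoff",       # seeing the shoe -> take off
    "airlab_trademark": "dance",      # seeing the trademark -> dance routine
    "pizza_box": "/ardrone/land",     # seeing the pizza box -> land
}

def react(detected_object):
    """Return the action for a detected object, or None to do nothing."""
    return BEHAVIORS.get(detected_object)
```

Keeping the mapping in a plain dictionary makes it easy to teach the drone new object/behavior pairs without touching the detection code.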
A video is available on YouTube: http://youtu.be/u5opWIsHY-Q