PTAM on AR.Drone

Implementation of the PTAM (Parallel Tracking and Mapping) algorithm on a commercial AR.Drone quadcopter for orientation and autonomous navigation in an enclosed space.

Abstract

In this project we implemented the PTAM (Parallel Tracking and Mapping) algorithm on a commercial AR.Drone quadcopter for orientation and autonomous navigation in an enclosed space. The strength of the algorithm is that it performs environment mapping and tracking simultaneously and in real time. Both tracking and mapping use the quadcopter's own built-in camera, which avoids major issues that arise when tracking with an external camera, in particular with tracking precision and pose estimation. Running the algorithm gave us a good estimate of the quadcopter's pose in the room. We then combined it, using a Kalman filter, with data from the quadcopter's IMU (inertial measurement unit) to refine the pose estimate further. In the final stage of the project we defined routes in the room for the quadcopter to follow, and with a PID controller the quadcopter followed these routes precisely. Obtaining good results required fine-tuning the parameters of the algorithm, the filter, and the PID controller.
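The fusion step described above can be pictured with a minimal linear Kalman filter: a prediction step that integrates an IMU-derived acceleration under a constant-velocity model, and an update step that corrects the position with the PTAM estimate. This is only a per-axis sketch; the state layout, noise values, and update rates below are assumptions for illustration, not the values tuned in the project.

```python
import numpy as np

class PoseKalman1D:
    """Minimal constant-velocity Kalman filter for one axis (illustrative sketch,
    not the project's actual filter)."""

    def __init__(self, q=0.05, r=0.02):
        self.x = np.zeros(2)          # state [position, velocity]
        self.P = np.eye(2)            # state covariance
        self.q = q                    # process-noise scale (assumed)
        self.R = np.array([[r]])      # PTAM measurement noise (assumed)

    def predict(self, accel, dt):
        """Propagate the state with an IMU-derived acceleration."""
        F = np.array([[1.0, dt],
                      [0.0, 1.0]])                       # constant-velocity model
        B = np.array([0.5 * dt * dt, dt])                # acceleration input
        self.x = F @ self.x + B * accel
        Q = self.q * np.array([[dt**4 / 4, dt**3 / 2],
                               [dt**3 / 2, dt**2]])      # discretized process noise
        self.P = F @ self.P @ F.T + Q

    def update(self, ptam_pos):
        """Correct the state with a PTAM position measurement."""
        H = np.array([[1.0, 0.0]])                       # PTAM observes position only
        y = np.array([ptam_pos]) - H @ self.x            # innovation
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)              # Kalman gain
        self.x = self.x + (K @ y).ravel()
        self.P = (np.eye(2) - K @ H) @ self.P


# Example: one IMU prediction step followed by one PTAM correction.
kf = PoseKalman1D()
kf.predict(accel=0.1, dt=0.02)    # assumed 50 Hz IMU step
kf.update(ptam_pos=0.01)          # PTAM position fix for this axis
print(kf.x)                       # fused [position, velocity] estimate
```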

 

PTAM – Parallel Tracking And Mapping

  • Tracking: for every camera frame, the tracking thread estimates the camera pose in real time by matching map points projected into the image (a thread-level sketch follows this list).

[Figure 1]

  • Mapping: in parallel, the mapping thread adds selected keyframes, triangulates new map points, and refines the map in the background using bundle adjustment.

[Figure 2]
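What makes PTAM "parallel" is that these two tasks run as separate threads sharing one map: the tracker estimates a pose for every incoming frame, while the mapper integrates selected keyframes and refines the map in the background. The sketch below only illustrates this two-thread structure; the pose, keyframe, and triangulation functions are stand-ins, not the actual PTAM data structures or mathematics.

```python
import queue
import threading

shared_map = {"keyframes": [], "points": []}   # placeholder for the 3D point map
map_lock = threading.Lock()
keyframe_queue = queue.Queue()                 # tracker -> mapper hand-off


def estimate_pose(frame):
    return frame                    # stand-in for reprojection-based pose refinement

def is_good_keyframe(frame, pose):
    return frame % 10 == 0          # stand-in: e.g. keep every 10th frame

def triangulate(frame, pose):
    return [frame]                  # stand-in for newly triangulated map points


def tracking_thread(frames):
    """Per frame: estimate the camera pose against the current map."""
    for frame in frames:
        pose = estimate_pose(frame)
        if is_good_keyframe(frame, pose):
            keyframe_queue.put((frame, pose))  # hand the keyframe to the mapper
    keyframe_queue.put(None)                   # signal shutdown


def mapping_thread():
    """In the background: add keyframes and refine the shared map."""
    while True:
        item = keyframe_queue.get()
        if item is None:
            break
        frame, pose = item
        with map_lock:
            shared_map["keyframes"].append(pose)
            shared_map["points"].extend(triangulate(frame, pose))


tracker = threading.Thread(target=tracking_thread, args=(range(100),))
mapper = threading.Thread(target=mapping_thread)
tracker.start(); mapper.start()
tracker.join(); mapper.join()
print(len(shared_map["keyframes"]), "keyframes,", len(shared_map["points"]), "map points")
```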

 

Results

Square Route

[Figure 3]
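To make the route-following stage concrete, the sketch below defines a square route as four waypoints and drives a simulated position toward each of them with one PID controller per horizontal axis. The gains, waypoint tolerance, control rate, and the crude position response are all assumptions for illustration, not the tuned values or the real AR.Drone dynamics.

```python
import numpy as np

class PID:
    """Single-axis PID controller (assumed gains, not the project's tuned values)."""
    def __init__(self, kp=0.8, ki=0.02, kd=0.3):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, error, dt):
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


# Square route: corners of a 2 m x 2 m square, visited in order.
waypoints = [np.array(p, dtype=float) for p in [(0, 0), (2, 0), (2, 2), (0, 2)]]

pos = np.array([0.0, 0.0])      # fused pose estimate (simulated here)
dt, tolerance = 0.05, 0.05      # assumed 20 Hz control loop, 5 cm waypoint tolerance

for target in waypoints:
    pid_x, pid_y = PID(), PID()          # fresh controllers for each leg
    for _ in range(2000):
        error = target - pos
        if np.linalg.norm(error) < tolerance:
            break
        # Velocity command per axis from the PID controllers, with a speed limit.
        cmd = np.array([pid_x.step(error[0], dt), pid_y.step(error[1], dt)])
        cmd = np.clip(cmd, -0.5, 0.5)
        pos = pos + cmd * dt             # crude stand-in for the drone's response
    print("waypoint", target, "-> position", np.round(pos, 3))
```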