COVERAGE‑RECON

Coordinated Multi‑Drone Image Sampling with Online Map Feedback

Muhammad Hanif1, Reiji Terunuma1, Takumi Sumino1, Kelvin Cheng2, Takeshi Hatanaka1
1School of Engineering, Institute of Science Tokyo, Tokyo, JAPAN
2Rakuten Institute of Technology, Rakuten Group, Inc., Tokyo, JAPAN

Manuscript for IEEE Transactions on Control Systems Technology (TCST)

Coverage-Recon Animation
Coverage‑Recon enables autonomous multi-drone reconstruction through a closed feedback loop between image sampling and mapping. Drones use angle-aware coverage control with a QP-based controller to capture diverse viewpoints. The captured images are then streamed to and processed by NeuralRecon, which generates an evolving 3D mesh that guides the drones to focus on under-reconstructed regions.
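The paper's QP-based controller enforces safety via control barrier function (CBF) constraints. As an illustration only (not the paper's actual controller), a single affine CBF constraint admits the closed-form projection below; the obstacle-distance barrier, gain `alpha`, and function names are assumptions for the sketch.

```python
import numpy as np

# Hypothetical sketch of a CBF-filtered controller: minimally modify a nominal
# coverage velocity u_nom so that a safety condition h(x) >= 0 is preserved.
# With one affine constraint, the QP
#   min ||u - u_nom||^2   s.t.   grad_h @ u >= -alpha * h(x)
# has the closed-form projection below.

def cbf_qp_filter(u_nom, grad_h, h_val, alpha=1.0):
    """Return the safe input closest to u_nom (closed-form single-constraint QP)."""
    slack = grad_h @ u_nom + alpha * h_val
    if slack >= 0:
        return u_nom                    # nominal input is already safe
    # Otherwise project u_nom onto the constraint boundary
    return u_nom - slack * grad_h / (grad_h @ grad_h)

# Example barrier (an assumption): keep distance r_min from a point obstacle,
#   h(x) = ||x - x_obs||^2 - r_min^2
def barrier(x, x_obs, r_min):
    d = x - x_obs
    return d @ d - r_min**2, 2.0 * d    # returns h(x) and its gradient
```

When the nominal coverage input already satisfies the constraint it passes through unchanged, so safety filtering only intervenes near the boundary of the safe set.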

Abstract

We address collaborative 3D map reconstruction with multiple drones. High-quality reconstruction requires capturing images of scene keypoints from diverse viewpoints; coverage control offers a scalable way to coordinate this process. Recent learning-based pipelines enable online 3D reconstruction, allowing map updates to guide motion during flight. We present Coverage-Recon, a coordinated image-sampling algorithm that closes the loop between sampling and mapping. Drones follow a QP-based, angle-aware coverage controller to ensure multi-view capture and safety. Images are fused online with NeuralRecon to produce an evolving mesh; localized mesh changes are interpreted as reconstruction uncertainty and fed back to adapt the coverage importance index. Unity–ROS2 simulations and real-world experiments show that Coverage-Recon achieves more complete and accurate reconstructions than methods without online feedback.

Key Concepts

Angle-Aware Coverage Control

Drones actively control both position and camera orientation (yaw and pitch), forming a 5-D control input that scans a 5-D virtual field of viewpoints. Each location in this field carries an importance index: initially high (red) when unobserved, and progressively decreasing toward low (blue) as the drone’s camera covers it.
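The importance-decay mechanism above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the visibility score `camera_performance`, decay rate `delta`, and field-of-view threshold are assumptions.

```python
import numpy as np

# Hypothetical sketch: each viewpoint q = (x, y, z, yaw, pitch) in the 5-D
# virtual field carries an importance value phi(q) that decays while some
# drone's camera covers it. All thresholds below are illustrative assumptions.

def camera_performance(q, drone_pose, fov_cos=0.8, max_range=5.0):
    """Crude visibility score: 1 if q lies near the optical axis and in range."""
    pos, yaw, pitch = drone_pose[:3], drone_pose[3], drone_pose[4]
    d = q[:3] - pos
    dist = np.linalg.norm(d)
    if dist < 1e-6 or dist > max_range:
        return 0.0
    # Camera optical axis from yaw/pitch
    axis = np.array([np.cos(pitch) * np.cos(yaw),
                     np.cos(pitch) * np.sin(yaw),
                     -np.sin(pitch)])
    return 1.0 if d @ axis / dist > fov_cos else 0.0

def update_importance(phi, grid, drone_poses, delta=2.0, dt=0.1):
    """Decay importance wherever at least one drone currently observes a viewpoint."""
    for i, q in enumerate(grid):
        h = max(camera_performance(q, p) for p in drone_poses)
        phi[i] = max(0.0, phi[i] - delta * h * dt)
    return phi
```

Viewpoints in front of a drone's camera lose importance (red toward blue) each control step, while unobserved ones keep their value, which is what drives the team toward uncovered regions.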

Online Map Feedback (NeuralRecon)

Using the online 3D mesh generated by NeuralRecon, we extract mesh changes over time to identify under-reconstructed regions. Locations with large mesh updates are treated as high-importance areas, and the importance index is updated accordingly. The QP-based controller then guides drones to revisit these regions while ensuring performance and safety via control barrier functions (CBFs).
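A minimal sketch of this 3D-grid feedback step, under assumed grid resolution, mesh representation (vertex clouds), and gain `k_fb`, none of which are specified here:

```python
import numpy as np

# Hypothetical sketch of the map-feedback step: voxelize two successive meshes
# onto a shared 3-D grid, flag voxels whose occupancy changed between mesh
# updates, and raise the importance of viewpoints that target those voxels.

def voxel_occupancy(vertices, origin, res, shape):
    """Binary occupancy of mesh vertices on a regular 3-D grid."""
    occ = np.zeros(shape, dtype=bool)
    idx = np.floor((vertices - origin) / res).astype(int)
    valid = np.all((idx >= 0) & (idx < np.array(shape)), axis=1)
    occ[tuple(idx[valid].T)] = True
    return occ

def mesh_change_feedback(phi, grid_xyz, prev_verts, curr_verts,
                         origin, res=0.2, shape=(32, 32, 16), k_fb=0.5):
    """Boost importance of viewpoints whose target voxel changed between meshes."""
    changed = voxel_occupancy(prev_verts, origin, res, shape) ^ \
              voxel_occupancy(curr_verts, origin, res, shape)
    idx = np.floor((grid_xyz - origin) / res).astype(int)
    valid = np.all((idx >= 0) & (idx < np.array(shape)), axis=1)
    boost = np.zeros(len(grid_xyz))
    boost[valid] = changed[tuple(idx[valid].T)]
    return np.minimum(1.0, phi + k_fb * boost)
```

Regions where the mesh is still changing between NeuralRecon updates regain importance, so the coverage controller naturally steers drones back to revisit them.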

Simulation

Simulation of Coverage-Recon using map feedback (3D grid method) with 1 drone. The video shows the scene, evolving importance index, mesh changes, and reconstructed map.

Simulation of Coverage-Recon using map feedback (3D grid method) with 4 drones. The video shows coordinated coverage of the scene, shared importance index updates, mesh changes, and the evolving 3D map.

Objective J Comparison
Comparison of the global objective J over time for different team sizes (1, 2, and 4 drones) using map feedback (3D Grid). Using more drones accelerates convergence of J, demonstrating faster mission completion.

Experiment

Real-world indoor experiment of Coverage-Recon using map feedback (3D grid method) with 1 drone. The video demonstrates the scene, evolving importance index, and reconstructed 3D map in practice.

3D Map Comparison

Scene (Photo)

Ours (No Map Feedback)

Ours (With Map Feedback)

BibTeX

@article{hanif2025coveragerecon,
  title   = {Coverage-Recon: Coordinated Multi-Drone Image Sampling with Online Map Feedback},
  author  = {Hanif, Muhammad and Terunuma, Reiji and Sumino, Takumi and Cheng, Kelvin and Hatanaka, Takeshi},
  journal = {IEEE Transactions on Control Systems Technology},
  year    = {2025},
  note    = {Manuscript}
}