In this project, we explore how a team of diverse robots can collaboratively monitor complex environments, such as bustling seaports or major city events. Equipped with varied sensors such as cameras and microphones, each robot gathers data from its unique perspective. While some progress has been made on robots reaching consensus about basic features, integrating high-level reasoning with diverse sensing remains a challenge. This integration is essential when deploying robot teams for autonomous navigation in dynamic environments, in applications such as search and rescue, multi-view videography, and inspection.
One contribution of this project is an interaction- and obstacle-aware trajectory prediction model that, combined with a model predictive controller (MPC), achieves multi-robot motion planning. A neural network is trained on a dataset of robot trajectories generated in simulation to predict the planning behavior of other robots, providing trajectory predictions in multi-robot scenarios. These predictions are fed to an MPC framework that serves as the local motion planner. Experiments with a team of quadrotors flying in a space shared with human obstacles showed that the quadrotors kept a safe distance from each human obstacle while following the proposed trajectories.
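The sketch below illustrates the predictor-plus-MPC loop in simplified form; it is not the project's implementation. A constant-velocity roll-out stands in for the learned interaction-aware predictor, and a random-shooting MPC with a clearance penalty stands in for the optimization-based local planner. All names and parameters (HORIZON, SAFE_DIST, mpc_step, etc.) are illustrative assumptions.

```python
import numpy as np

HORIZON = 10       # planning steps
DT = 0.1           # step length [s]
SAFE_DIST = 1.0    # minimum clearance to each predicted obstacle [m]
N_SAMPLES = 256    # candidate control sequences per MPC solve

def predict_obstacles(obs_pos, obs_vel):
    """Stand-in for the learned predictor: roll each obstacle forward
    with constant velocity over the horizon. Returns shape (H, n_obs, 2)."""
    steps = np.arange(1, HORIZON + 1)[:, None, None] * DT
    return obs_pos[None] + steps * obs_vel[None]

def mpc_step(pos, vel, ref_traj, obs_pred, rng):
    """Sampling-based MPC: pick the acceleration sequence that tracks the
    reference while keeping SAFE_DIST from every predicted obstacle."""
    acc = rng.uniform(-2.0, 2.0, size=(N_SAMPLES, HORIZON, 2))
    # Roll out double-integrator dynamics for all samples at once.
    vels = vel[None] + np.cumsum(acc * DT, axis=1)          # (S, H, 2)
    poss = pos[None] + np.cumsum(vels * DT, axis=1)         # (S, H, 2)
    track_cost = np.sum((poss - ref_traj[None]) ** 2, axis=(1, 2))
    dists = np.linalg.norm(poss[:, :, None, :] - obs_pred[None], axis=-1)
    collision_cost = np.sum(np.maximum(SAFE_DIST - dists, 0.0) ** 2, axis=(1, 2))
    best = np.argmin(track_cost + 100.0 * collision_cost)
    return acc[best, 0]  # apply only the first control, then re-plan

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pos, vel = np.zeros(2), np.zeros(2)
    obs_pos = np.array([[3.0, 0.2]])      # one human obstacle
    obs_vel = np.array([[-0.5, 0.0]])
    goal_vel = np.array([0.4, 0.0])       # desired forward motion
    for _ in range(50):
        ref = pos + np.arange(1, HORIZON + 1)[:, None] * DT * goal_vel
        obs_pred = predict_obstacles(obs_pos, obs_vel)
        a = mpc_step(pos, vel, ref, obs_pred, rng)
        vel += a * DT
        pos += vel * DT
        obs_pos += obs_vel * DT
    print("final robot position:", pos)
```

In the actual system the constant-velocity roll-out would be replaced by the trained neural predictor and the random shooting by a proper MPC solver; the receding-horizon structure (predict, optimize, apply the first control, repeat) is the same.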
The project also addresses the problem of videography drone teams that must autonomously capture desired shots of a dynamic target in a complex environment. A two-stage planning pipeline is proposed: a high-level planner uses a visibility heuristic to decide when each drone should capture which shot, producing a reference trajectory that is then tracked by an online model predictive control (MPC) algorithm with a cost function over viewpoint parameters. The approach is demonstrated in a videography scenario in which a pair of drones is assigned to capture shots of a remote-controlled car in the presence of obstacles.
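Below is a minimal sketch of the high-level stage only, under simplifying assumptions (not the paper's code): a greedy planner scores each free drone for each desired shot using a line-of-sight visibility heuristic plus a travel-distance penalty, and outputs a reference viewpoint per drone. In the full pipeline these references would be handed to the MPC stage with its viewpoint cost. The obstacle model, shot set, and scoring weights are illustrative.

```python
import numpy as np

OBSTACLES = [np.array([2.0, 1.0])]          # cylinder centers (radius 0.5) that can block views
SHOTS = {"front": 0.0, "side": np.pi / 2}   # desired viewing angles around the target

def visibility(drone_pos, target_pos):
    """Heuristic: penalize a drone whose line of sight to the target passes near an obstacle."""
    score = 1.0
    for c in OBSTACLES:
        seg = target_pos - drone_pos
        t = np.clip(np.dot(c - drone_pos, seg) / np.dot(seg, seg), 0.0, 1.0)
        closest = drone_pos + t * seg        # closest point on the sight line to the obstacle
        if np.linalg.norm(c - closest) < 0.5:
            score *= 0.1                     # view is (partially) blocked
    return score

def assign_shots(drone_positions, target_pos, radius=2.0):
    """Greedy high-level planner: each shot goes to the unassigned drone with the
    best combination of target visibility and proximity to the shot's viewpoint."""
    assignment, free = {}, set(range(len(drone_positions)))
    for shot, angle in SHOTS.items():
        viewpoint = target_pos + radius * np.array([np.cos(angle), np.sin(angle)])
        best = max(free, key=lambda i: visibility(drone_positions[i], target_pos)
                                       - 0.1 * np.linalg.norm(drone_positions[i] - viewpoint))
        assignment[shot] = (best, viewpoint)
        free.remove(best)
    return assignment

if __name__ == "__main__":
    drones = [np.array([0.0, 0.0]), np.array([0.0, 3.0])]
    target = np.array([4.0, 1.0])
    for shot, (drone, ref) in assign_shots(drones, target).items():
        print(f"{shot} shot -> drone {drone}, reference viewpoint {ref}")
```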
This project is funded by the Office of Naval Research Global (ONRG).
Distributed multi-target tracking and active perception with mobile camera networks
In Computer Vision and Image Understanding (S.I. on Collaborative Mobile Smart Cameras),
2024.
Learning scalable and efficient communication policies for multi-robot collision avoidance
In Autonomous Robots (AURO), vol. 47, pp. 1275-1297,
2023.
Active Classification of Moving Targets with Learned Control Policies
In IEEE Robotics and Automation Letters (RA-L),
2023.
RAST: Risk-Aware Spatio-Temporal Safety Corridors for MAV Navigation in Dynamic Uncertain Environments
In IEEE Robotics and Automation Letters (RA-L),
2023.
A Framework for Fast Prototyping of Photo-realistic Environments with Multiple Pedestrians
In Proc. IEEE Int. Conf. on Robotics and Automation (ICRA),
2023.
Wi-Closure: Reliable and Efficient Search of Inter-Robot Loop Closures Using Wireless Sensing
In Proc. IEEE Int. Conf. on Robotics and Automation (ICRA),
2023.
Decentralized Probabilistic Multi-Robot Collision Avoidance Using Buffered Uncertainty-Aware Voronoi Cells
In Autonomous Robots (AURO),
2022.
Learning Interaction-Aware Trajectory Predictions for Decentralized Multi-Robot Motion Planning in Dynamic Environments
In IEEE Robotics and Automation Letters (RA-L),
2021.
Multi-robot Task Assignment for Aerial Tracking with Viewpoint Constraints
In Proc. IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS),
2021.
Online Informative Path Planning for Active Information Gathering of a 3D Surface
In Proc. IEEE Int. Conf. on Robotics and Automation (ICRA),
2021.
With Whom to Communicate: Learning Efficient Communication for Multi-Robot Collision Avoidance
In Proc. IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS),
2020.
Robust Vision-based Obstacle Avoidance for Micro Aerial Vehicles in Dynamic Environments
In Proc. IEEE Int. Conf. on Robotics and Automation (ICRA),
2020.
B-UAVC: Buffered Uncertainty-Aware Voronoi Cells for Probabilistic Multi-Robot Collision Avoidance
In Proc. 2nd IEEE Int. Symp. on Multi-Robot and Multi-Agent Systems (MRS'19),
2019.