Simulator for Autonomous Drone Flight and Data Collection in Agricultural Environments

PROJECT DESCRIPTION

With the development of smart farming, autonomous aerial robots have emerged as helpful tools in agricultural production processes. To gather high-quality data in an indoor environment, an aerial robot must operate close to the plants and traverse the cluttered growing space to monitor plant growth. This increases the risk of collision, which often results in loss of the robot. Developing algorithms that generate safe motions in such complicated environments purely through physical experiments, however, is complex, expensive, and time-intensive: crashes that damage hardware are common and can severely impede development. For this reason, we propose a simulation tool built around a high-fidelity virtual replica of the target agricultural environment, coupled with accurate physics and sensor models. The simulator allows ideas and flight algorithms to be tested and evaluated rapidly before the aerial robot is deployed for experimental flights as final validation. Furthermore, by taking advantage of the high-fidelity virtual environment, the simulator can autonomously generate synthetic visual agricultural data for training the neural networks used for plant monitoring and yield prediction.
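To give a sense of the kind of physics-in-the-loop testing described above, the sketch below shows a deliberately simplified point-mass quadrotor model stepped with a toy PD hover controller. This is not the project's simulator: the class and function names (SimpleQuadrotorSim, hover_controller), the gains, and the point-mass dynamics are all illustrative assumptions standing in for the high-fidelity physics and sensor models the actual tool uses.

```python
# Minimal sketch (hypothetical, not the project's simulator): a point-mass
# quadrotor model with explicit Euler integration, illustrating how a flight
# algorithm can be exercised in simulation before any real-world flight.
import numpy as np

GRAVITY = np.array([0.0, 0.0, -9.81])  # m/s^2, world frame with z up


class SimpleQuadrotorSim:
    """Point-mass vehicle driven by a commanded world-frame thrust vector."""

    def __init__(self, mass=0.5, dt=0.002):
        self.mass = mass          # vehicle mass, kg (assumed value)
        self.dt = dt              # integration time step, s
        self.pos = np.zeros(3)    # position, m
        self.vel = np.zeros(3)    # velocity, m/s

    def step(self, thrust_cmd):
        """Advance the state by one time step given a thrust command in N."""
        accel = GRAVITY + np.asarray(thrust_cmd) / self.mass
        self.vel += accel * self.dt
        self.pos += self.vel * self.dt
        return self.pos.copy(), self.vel.copy()


def hover_controller(pos, vel, target, mass, kp=4.0, kd=3.0):
    """Toy PD position controller returning a thrust vector (illustrative only)."""
    accel_des = kp * (target - pos) - kd * vel - GRAVITY
    return mass * accel_des


if __name__ == "__main__":
    sim = SimpleQuadrotorSim()
    target = np.array([1.0, 0.0, 1.5])   # e.g. a waypoint beside a plant row
    for _ in range(5000):                 # 10 s of simulated flight at 500 Hz
        thrust = hover_controller(sim.pos, sim.vel, target, sim.mass)
        sim.step(thrust)
    print("final position:", np.round(sim.pos, 3))
```

In a full simulator of the kind proposed here, this loop would additionally render camera and other sensor observations from the virtual crop environment at each step, which is also what enables the synthetic-data generation mentioned above.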

RESEARCHERS

PI: Mark Mueller, GSR: Teaya Yang

This work is supported by the USDA/NSF AI Institute for Next Generation Food Systems (AIFS) through the AFRI Competitive Grant no. 2020-67021-32855/project accession no. 1024262 from the USDA National Institute of Food and Agriculture.