Autonomous Vehicular Robot for Aerial Monitoring Support
Poster Number
11
Introduction/Abstract
Aerial robots can support a wide range of activities from wetlands mapping to algae bloom tracking. However, flight time is limited by the combined trade-offs of robot weight, battery capacity, and power needs. This often limits an autonomous aerial robot with a sensing payload to a 15-20 minute flight time. Expanding the functionality of aerial robots then requires either a redesign of the robots (into potentially less safe sizes), expanded use of people to transport and recharge the robots, or creative use of co-robots to automate the system further.
Purpose
Our system uses a mobile ground robot to support the aerial robots, providing both transportation to the region of interest and recharging. Our prior work addresses inductive recharging of robots and sensors; this work focuses on the transportation challenges. Given our motivation of environmental monitoring, the ground robot must traverse rough terrain, avoiding obstacles and maintaining stability so as not to dislodge its aerial robot passengers. The problem then has two aspects: (1) learning how to traverse the terrain and (2) automating a large ground robot that can self-stabilize.

The first aspect begins with the assumption that the robot will drive on existing dirt tracks; this keeps the robot from damaging the surrounding ecosystem, since the ground under the tracks is already disturbed. The robot must detect and follow the road while avoiding obstacles that may come into its path. To accomplish this, our system needs to perform three high-level functions: road detection, obstacle avoidance, and free-space mapping. Our solution fuses data from multiple sensors, such as LiDAR, cameras, and rangefinders, while leveraging image processing techniques such as line detection, tracking, and background subtraction.

The second aspect involves automating a golf cart. A golf cart is an ideal mobile base station because it is large enough to carry multiple aerial robots as well as sufficient batteries to recharge them without significantly limiting its own range. Deploying the aerial robots without user interaction requires automating the cart. At a high level, the golf cart requires a GPS, for positioning itself throughout its operational area, and a close-range mapping system, such as LiDAR, for choosing the best path through rough terrain. To interface with the golf cart's normal controls, two main techniques are used. To actuate the steering and brakes, linear actuators are mounted on the steerable tow bar and brake pedal. To control the throttle, a simple DAC simulates the output of the factory potentiometer.
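The throttle technique can be sketched in a few lines: the controller chooses a throttle fraction, and the DAC is driven to the voltage the factory potentiometer would have produced at that pedal position. The 0.5-4.5 V signal range, 5 V DAC reference, and 8-bit resolution below are illustrative assumptions, not measured values from our cart.

```python
# Hedged sketch: mimic the factory throttle potentiometer with a DAC.
# All electrical constants here are assumptions for illustration only.

V_MIN, V_MAX = 0.5, 4.5   # assumed potentiometer output range (volts)
V_REF = 5.0               # assumed DAC reference voltage
DAC_BITS = 8              # assumed DAC resolution

def throttle_to_dac(throttle: float) -> int:
    """Convert a throttle command in [0, 1] to a DAC code."""
    throttle = max(0.0, min(1.0, throttle))      # clamp for safety
    volts = V_MIN + throttle * (V_MAX - V_MIN)   # linear pedal model
    return round(volts / V_REF * (2 ** DAC_BITS - 1))
```

Clamping the command before conversion means a buggy upstream value can never drive the DAC outside the range the cart's controller expects from the real pedal.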
Method
Evaluating our two aspects involves separate testing for now; fully combining the system will be next year's work. We verified the terrain traversal portion of the project, including the algorithmic and image processing components, by running carefully selected sample input datasets and analyzing the output produced. This lets us verify that our algorithms work as intended without introducing the complexity of actually driving the robot around. In the case of our line detection algorithms, we collected representative sample images and compared how well each algorithm performed on the set. For automating our golf cart, we used an incremental development process, automating each subsystem before integrating it across the whole system. Our efforts focused on, in order of complexity, the throttle, brake system, and steering system.
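The offline comparison described above can be sketched as a small scoring harness: each detector is run over hand-labeled sample images and ranked by its mean error against the ground-truth road line. The detector interface, the (angle, offset) line parameterization, and the error weighting below are illustrative assumptions, not our actual labeled dataset or metric.

```python
# Hedged sketch of an offline line-detector comparison harness.
# Lines are represented as (angle_radians, offset_pixels); the
# representation and weights are assumptions for illustration.
import math

def line_error(detected, truth, angle_weight=1.0, offset_weight=0.1):
    """Weighted error between two lines given as (angle, offset)."""
    d_angle = abs(detected[0] - truth[0])
    d_angle = min(d_angle, math.pi - d_angle)   # lines wrap at 180 degrees
    d_offset = abs(detected[1] - truth[1])
    return angle_weight * d_angle + offset_weight * d_offset

def score_detector(detector, labeled_samples):
    """Mean error of a detector over (image, truth_line) pairs."""
    errors = [line_error(detector(img), truth)
              for img, truth in labeled_samples]
    return sum(errors) / len(errors)
```

A lower score means the detector tracks the labeled road line more closely; running every candidate algorithm through the same harness yields the kind of side-by-side comparison reported in our results.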
Results
For the algorithmic portion, our results compare different line detection algorithms, presented in a way that can be used to evaluate their usefulness in navigating the golf cart. For the robotic automation portion, the result is a golf cart that can be controlled through a well-designed, intuitive software interface.
Significance
Automated monitoring of the environment supports and enables further scientific studies. In the context of wetlands monitoring, this allows scientists to better model the water flow through the environment and better understand the filtration effects of the wetlands on agricultural run-off. Providing this monitoring is an obvious task for aerial robots, but these robots face challenges in terms of flight time. By creating an automated solution to support aerial robots in monitoring tasks, our system reduces the scientists’ workload and increases the feasibility of such monitoring.
Location
DUC Ballroom A&B
Format
Poster Presentation
Poster Session
Afternoon