In the first part of IARC Mission 7, Mission 7a, participants must develop a fully autonomous UAV that tracks randomly moving objects and physically interacts with them in order to guide them across the field to a designated location, all while avoiding dynamic obstacles. The indoor arena is a 20 m × 20 m field with grid lines every meter. The field is populated with 14 Roombas: 4 are obstacles carrying vertical poles up to 2 meters tall, while the remaining 10 move in a semi-random pattern and have paddles on top that allow interaction.
The problem presents three key challenges: localizing the aircraft, detecting and tracking obstacles and targets, and optimizing planning and control to complete the mission within the time constraint. Because of these clear problem definitions, we split our solution into three main areas. The first, Self-Estimation, localizes the aircraft's absolute position relative to the grid. The second, Perception, identifies and tracks objects and determines their positions relative to the aircraft. The third, Planning, generates high-level trajectory commands that optimize herding.
The airframe chosen for Mission 7a was a 640-class quadcopter. Because our efforts were focused on software, we chose a commercial airframe that includes electronic speed controllers, motors, and propellers designed for light-duty cinematography. This ensures a lightweight, well-developed package designed for long flight times with additional payload.
Propulsion is achieved with four Hobbywing XRotor 40A ESCs powering T-Motor AntiGravity MN4006 380 kV motors paired with 15-inch Tarot 1555 foldable propellers. This motor and propeller combination can produce a maximum total thrust of 8.9 kilograms while consuming 360 Watts, far exceeding the aircraft's weight. These motors are also extremely efficient at low RPM, consuming only 75 Watts at 50% throttle.
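As a back-of-the-envelope check, the thrust margin implied by these figures can be computed directly. The thrust number comes from the text; the 2.5 kg all-up weight is an assumption (the text only quotes 1.83 kg without the sensor package), so the result is a rough sketch rather than a measured value.

```python
# Rough thrust-to-weight estimate from the quoted propulsion figures.
MAX_THRUST_KG = 8.9      # total thrust of four MN4006/15" combos (from text)
LOADED_WEIGHT_KG = 2.5   # ASSUMED all-up weight with full sensor payload

thrust_to_weight = MAX_THRUST_KG / LOADED_WEIGHT_KG
print(f"Thrust-to-weight ratio: {thrust_to_weight:.1f}")  # prints "Thrust-to-weight ratio: 3.6"
```

A ratio well above 2 leaves comfortable control authority for aggressive maneuvers and payload growth.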
The aircraft is powered by a 6-cell, 22.2 V, 4500 mAh lithium-polymer (LiPo) battery. At 99.9 Wh, this was the largest watt-hour pack we were allowed to bring on an airline. Without the sensor package, the vehicle weighs 1.83 kg and achieves a 35-minute flight time; with the full payload, the operational flight time is an estimated 20 minutes. Two voltage regulators are on the aircraft. One supplies 5 V to the companion computer, USB camera, and high-power WiFi dongle. The second supplies 5 V to the flight controller along with propulsion current-draw and battery-voltage measurements.
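The quoted endurance figures can be sanity-checked against the battery's energy. This is a minimal sketch using only the numbers stated in the text; the implied average power draws are derived, not measured.

```python
# Sanity check of quoted flight times against the 99.9 Wh battery.
BATTERY_WH = 99.9  # 6S 4500 mAh LiPo: 22.2 V * 4.5 Ah = 99.9 Wh

def flight_time_min(avg_power_w):
    """Flight time in minutes for a given average power draw."""
    return BATTERY_WH / avg_power_w * 60

# Average draw implied by the quoted 35 min unloaded endurance
implied_power_w = BATTERY_WH / (35 / 60)  # ~171 W

# A ~300 W average draw with full payload would give the quoted ~20 min
loaded_time_min = flight_time_min(300)   # ~20 min
```

These derived draws assume the full battery capacity is usable; in practice a LiPo reserve margin would reduce both figures somewhat.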
The ArduPilot flight control stack was originally built around GPS for holding a position. Since we cannot use GPS for positioning, holding a position in a hover is achieved with a PX4Flow optical-flow camera, which is used solely for the low-level flight control of the ArduCopter flight stack and serves as a safety redundancy in the event of a loss of communication. The Pixhawk flight controller includes one barometer, two 3-axis gyroscopes, two 3-axis accelerometers, and a 3-axis magnetometer. Due to the built-in barometer's poor accuracy, we opted for lidar for altitude: a LIDAR-Lite v3, which has a range of 0-40 m and an accuracy up to 2.5 cm. A downward-facing camera with a wide-angle lens is mounted to the bottom of the aircraft for vision needs.
There are two primary communication links from the aircraft to the ground station. The first is primarily intended for triggering the emergency safety switch but also serves as a method of switching the aircraft from manual mode to autonomous mode. This link is a standard 2.4 GHz spread-spectrum radio controller and receiver. The second link supports all communications between the aircraft and the ground station computer, including video streaming, telemetry, sensor data, and aircraft commands. This link is an 802.11b/g/n WiFi link using a USB WiFi module with a maximum bandwidth of 150 Mbps.
The software component of the aircraft is built on a Raspberry Pi 3 running Raspbian, with a ROS and MAVROS software stack to enable remote communication and live streaming to a nearby ground station. The ArduPilot flight control stack handles low-level flight control of the vehicle, while the majority of the computation is handled offboard by a nearby ground station running Ubuntu 14.04 with MAVROS.
The Roombas have very distinct colored features, namely red and green paddles, so a threshold over a range in the HSV color space is performed to identify an agent of interest. This is preceded by a Gaussian blur in order to remove small perturbations within the frame. The agent's pixel coordinates are then calculated by locating the center of its contour, and a conversion to a position relative to the aircraft is performed.
Grid detection proceeds in a few steps. First, we apply a white threshold to highlight the grid lines. Then we apply a Canny edge detector to identify the edges of the lines. Next, a Hough line transform is performed to find the location and angle of each grid line. With the locations and angles of the grid lines available in Hough space, we can localize the aircraft relative to the grid by counting how many grid lines we have traversed, both vertically and horizontally.
Gazebo allows numerous customizations, providing great flexibility in robot simulation. It also supports generating measurements from a multitude of sensors, with customizable additive noise. Gazebo provides extensive documentation and examples of how to implement and simulate various situations, and exposes direct access to its API through plugins, allowing development of control software for robots, sensors, and even the environment.
Our simulation is an amalgamation of various packages working seamlessly to deliver an aesthetically pleasing, highly functional, and easily customizable simulation environment for ROS software development. It integrates the ArduPilot SITL plugin, MAVROS, MAVLink/MAVProxy, ROS/Gazebo, and many other packages, including our own software stack.