IEEE at UC San Diego

IARC Quadcopter

About the Competition


Unmanned Aerial Vehicles (UAVs) were originally designed for military applications but have since expanded into many others, including commercial, recreational, and surveillance uses. UAVs operate with varying degrees of autonomy, ranging from being piloted remotely by a human to flying autonomously using on-board computing. While robust methods for object detection and avoidance are critical to autonomous flight, tracking and physically interacting with moving objects has not yet been thoroughly demonstrated. Autonomy without external aids such as the Global Positioning System (GPS) is becoming ever more important for indoor and military applications where GPS may be limited or unavailable. Mission 7 of the International Aerial Robotics Competition (IARC) explores robust methods of interacting with moving obstacles while building a foundation for indoor localization.


In the first part of Mission 7, Mission 7a, participants must develop a fully autonomous UAV that tracks randomly moving ground robots and physically interacts with them to guide them across the field to a designated location while avoiding dynamic obstacles. The indoor arena is a 20 x 20 meter field with grid lines every meter. The field is populated with 14 Roombas: 4 are obstacles carrying vertical poles up to 2 meters tall, while the remaining 10 move in a semi-random pattern and carry top paddles that allow interaction.

The problem presents three key challenges: localizing the aircraft, detecting and tracking obstacles and targets, and planning and controlling the aircraft efficiently enough to complete the mission within the time constraint. Because these challenges are well defined, we split our solution into three main areas. The first, Self-Estimation, localizes the aircraft's absolute position relative to the grid. The second, Perception, identifies and tracks objects and determines each object's position relative to the aircraft. The third, Planning, generates high-level trajectory commands that optimize herding.


SYSTEM OVERVIEW


AIRFRAME

The airframe chosen for Mission 7a is a 640-class quadcopter. Because our efforts were focused on software, we chose a commercial airframe that includes electronic speed controllers, motors, and propellers designed for light-duty cinematography. This ensures a lightweight, well-developed package designed for long flight times with additional payload.

PROPULSION

Propulsion is provided by four Hobbywing XRotor 40A ESCs driving T-Motor AntiGravity MN4006 380KV motors paired with 15-inch Tarot 1555 foldable propellers. This motor and propeller combination can produce a maximum total thrust of 8.9 kilograms, far exceeding the aircraft's weight, while consuming 360 Watts. These motors are also extremely efficient at low RPM, consuming only 75 Watts at 50% throttle.

POWER DISTRIBUTION

The aircraft is powered by a 6-cell, 22.2 V, 4500 mAh Lithium Polymer (LiPo) battery. At 99.9 Wh, this was the largest-capacity pack we were allowed to bring on an airline. Without the sensor package, the vehicle weighs 1.83 kg and flies for 35 minutes; with the full payload, the operational flight time is an estimated 20 minutes. Two voltage regulators are on the aircraft: one supplies 5 V to the companion computer, USB camera, and high-power WiFi dongle, while the second supplies 5 V to the flight controller and reports battery voltage and propulsion current draw.
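
As a rough sanity check (our own back-of-the-envelope estimate, ignoring battery derating and conversion losses), these flight times imply the following average electrical draws:

    # Average power implied by the 99.9 Wh pack and the quoted flight times.
    battery_wh = 22.2 * 4.5                        # 6S nominal voltage x 4.5 Ah ~= 99.9 Wh
    draw_no_payload_w = battery_wh / (35 / 60)     # ~171 W average over a 35 minute flight
    draw_full_payload_w = battery_wh / (20 / 60)   # ~300 W average over a 20 minute flight
    print(round(draw_no_payload_w), round(draw_full_payload_w))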



SENSORS

The ArduPilot flight control stack was originally built around GPS for position hold. Since we cannot use GPS for positioning, position hold in a hover is achieved with a PX4Flow optical flow camera, which is used solely for the low-level flight control of the ArduCopter flight stack and serves as a safety redundancy in the event of a loss of communication. The Pixhawk flight controller includes one barometer, two 3-axis gyroscopes, two 3-axis accelerometers, and a 3-axis magnetometer. Because of the built-in barometer's poor accuracy, we opted for lidar for altitude measurement: a LIDAR-Lite v3, which has a range of 0-40 m and an accuracy of 2.5 cm. A downward-facing camera with a wide-angle lens is mounted to the bottom of the aircraft for vision.

COMMUNICATION

There are two primary communication links between the aircraft and the ground station. The first is primarily intended for triggering the emergency safety switch but also serves to switch the aircraft between manual and autonomous modes; it uses a standard 2.4 GHz spread-spectrum radio controller and receiver. The second link carries all communication between the aircraft and the ground station computer, including video streaming, telemetry, sensor data, and aircraft commands; it is an 802.11b/g/n WiFi link using a USB WiFi module with a maximum bandwidth of 150 Mbps.

SOFTWARE

The onboard software runs on a Raspberry Pi 3 running Raspbian, with a ROS and MAVROS software stack that enables remote communication and live video streaming to a nearby ground station. The ArduPilot flight control stack handles low-level flight control of the vehicle, while the majority of the computation is handled offboard by a ground station running Ubuntu 14.04 with MAVROS.
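
As an illustration of how this stack fits together, the sketch below shows a ground-station node streaming velocity setpoints to the aircraft through MAVROS. Topic names follow standard MAVROS conventions; the node name and velocity values are placeholders rather than our actual mission code.

    # Hypothetical ground-station node: stream velocity setpoints to the
    # aircraft over the WiFi link via MAVROS (rospy).
    import rospy
    from geometry_msgs.msg import TwistStamped

    rospy.init_node("iarc_groundstation_demo")
    vel_pub = rospy.Publisher("/mavros/setpoint_velocity/cmd_vel",
                              TwistStamped, queue_size=1)

    rate = rospy.Rate(20)                 # stream setpoints at 20 Hz
    while not rospy.is_shutdown():
        cmd = TwistStamped()
        cmd.header.stamp = rospy.Time.now()
        cmd.twist.linear.x = 0.5          # example: creep forward at 0.5 m/s
        vel_pub.publish(cmd)
        rate.sleep()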

PERCEPTION


We perceive the environment entirely through a downward-facing camera with a large field of view. Perception solves the detection and tracking of objects of interest within the arena: the computer vision suite processes one frame at a time from the camera feed to accomplish these tasks. Additionally, sensor data from the Pixhawk gives our tracking algorithm extra information with which to develop more precise estimates for any particular object. The information computed from each frame is then forwarded to our path planning and state estimation suites, which use it for a variety of decision-making tasks.


The roombas have very distinct colored features, red and green paddles, so a threshold over a range in the HSV color space is used to identify an agent of interest. This is preceded by a Gaussian blur to remove small perturbations within the frame. The agent's pixel coordinates are then calculated by locating the center of its contour, and a conversion to aircraft-relative coordinates is performed.
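
A minimal sketch of this detection step is shown below; the HSV bounds and blur kernel are illustrative placeholders, not the values tuned for competition lighting.

    # Sketch of paddle detection: Gaussian blur, HSV threshold, contour centers.
    import cv2
    import numpy as np

    def detect_paddles(frame_bgr, hsv_lo=(40, 80, 80), hsv_hi=(80, 255, 255)):
        """Return pixel centers of blobs whose color lies in the given HSV range."""
        blurred = cv2.GaussianBlur(frame_bgr, (5, 5), 0)       # remove small perturbations
        hsv = cv2.cvtColor(blurred, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, np.array(hsv_lo), np.array(hsv_hi))
        contours = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                    cv2.CHAIN_APPROX_SIMPLE)[-2]  # OpenCV 3.x/4.x compatible
        centers = []
        for c in contours:
            m = cv2.moments(c)
            if m["m00"] > 0:                                    # skip degenerate contours
                centers.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
        return centers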

PLANNING


Once our planning algorithm chooses a roomba to "score," the aircraft moves toward the roomba and follows it from above. This following algorithm uses two Proportional-Integral-Derivative (PID) controllers, one for roll and one for pitch. Using the roomba position output by the perception stack as the error signal, we compute the controller output for each axis. By tuning the PID gains, we adjust how the aircraft responds to a change in the roomba's position, resulting in smooth and accurate following.
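
A minimal sketch of this follow loop is given below, assuming a camera-derived position error in meters; the gains shown are placeholders, not our tuned flight values.

    # Two independent PID loops: x error drives roll, y error drives pitch.
    class PID:
        def __init__(self, kp, ki, kd):
            self.kp, self.ki, self.kd = kp, ki, kd
            self.integral = 0.0
            self.prev_error = None

        def update(self, error, dt):
            self.integral += error * dt
            derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
            self.prev_error = error
            return self.kp * error + self.ki * self.integral + self.kd * derivative

    roll_pid = PID(kp=0.4, ki=0.01, kd=0.1)    # placeholder gains
    pitch_pid = PID(kp=0.4, ki=0.01, kd=0.1)

    def follow_step(roomba_offset_m, dt):
        """roomba_offset_m: (x, y) roomba position relative to the aircraft, in meters."""
        ex, ey = roomba_offset_m
        return roll_pid.update(ex, dt), pitch_pid.update(ey, dt)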


The landing algorithm immediately follows the "Following" algorithm: once we believe we have an accurate follow of the roomba, the quadcopter slowly descends onto it, pressing its top button. We then climb until we can detect the roomba's state and heading, and from there either land on it again or return to following, so that the roomba is steered toward the goal line.
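
One way to express this land-and-reacquire cycle is as a small state machine; the altitude setpoints and stability check below are hypothetical placeholders rather than our flight parameters.

    # Hypothetical land/re-acquire cycle: follow, descend to tap the paddle,
    # climb back up, then decide whether to resume following or tap again.
    from enum import Enum

    class LandState(Enum):
        FOLLOW = 1
        DESCEND = 2
        ASCEND = 3

    def landing_step(state, follow_is_stable, roomba_visible, altitude_m):
        """One decision step; returns (next_state, altitude_setpoint_m)."""
        if state == LandState.FOLLOW:
            # Descend only once the follow is accurate enough to hit the paddle.
            return (LandState.DESCEND, 0.0) if follow_is_stable else (state, 2.5)
        if state == LandState.DESCEND:
            # Near touchdown the paddle is pressed; climb to re-detect heading.
            return (LandState.ASCEND, 2.5) if altitude_m < 0.1 else (state, 0.0)
        # ASCEND: once the roomba and its heading are visible again, resume following.
        return (LandState.FOLLOW, 2.5) if roomba_visible else (state, 2.5)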

SELF-ESTIMATION


The UAV depends on information from its surroundings, such as its visual location and relative orientation. We can exploit the uniformity of the 20 m by 20 m grid, composed of smaller 1 m by 1 m squares, to localize the UAV.


Grid detection proceeds in a few steps, sketched below. First, we apply a white threshold to highlight the grid lines. Then we apply a Canny edge detector to identify the edges of the lines. Next, a Hough line transform finds the location and angle of each grid line. With the lines represented in Hough space, we can localize ourselves on the grid by counting how many grid lines we have traversed, both vertically and horizontally.
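
A minimal sketch of this pipeline follows; the threshold values are illustrative placeholders.

    # White threshold -> Canny edges -> Hough transform to recover grid lines.
    import cv2
    import numpy as np

    def detect_grid_lines(frame_bgr):
        """Return (rho, theta) pairs for candidate grid lines in the image."""
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        white = cv2.inRange(hsv, (0, 0, 200), (180, 40, 255))   # bright, unsaturated pixels
        edges = cv2.Canny(white, 50, 150)
        lines = cv2.HoughLines(edges, 1, np.pi / 180, 120)
        if lines is None:
            return []
        # Lines cluster into two families roughly 90 degrees apart; counting how many
        # lines of each family pass beneath the aircraft gives the grid-cell offset.
        return [(rho, theta) for rho, theta in lines[:, 0]]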

SIMULATION


In the absence of a finished physical aircraft, and for safety reasons, initial testing needed to be done in simulation. We developed a software-in-the-loop (SITL) simulation in which to develop our software package. The aircraft's software stack lives within the ROS framework, which integrates directly with the 3D simulation software Gazebo.


Gazebo offers numerous customizations, allowing great flexibility in robot simulation. It also supports generating measurements from a multitude of sensors, including configurable additive noise. Gazebo provides extensive documentation and examples of how to implement and simulate various situations, and exposes its API through plugins, allowing control software to be developed for robots, sensors, and even the environment.

Our simulation is an amalgamation of various packages working seamlessly to deliver an aesthetically pleasing, highly functional, easily customizable simulation environment for ROS software development. It integrates the ArduPilot SITL plugin, MAVROS, MAVLink/MAVProxy, ROS/Gazebo, and many other packages, including our own software stack.



TEAM MEMBERS


GEORGE LACHOW

Team Lead

ALEX KHOURY

Simulation, Planning

JASON QUACH

Perception

ERIC HO

Self-Estimation

ALI ZOLFAGHARI

Hardware

IAN SCHROEDER

Software Team Lead, Perception

JAMES ECKDAHL

Hardware

JEFFERY HERBERT

Hardware

JUNRU REN

Self-Estimation

CONTACT

Contact us and we'll get back to you within 24 hours.

La Jolla, CA

+1 760 828 8894

georgelachow@gmail.com