Following our decision to enter the UAV Outback Challenge (OBC) in 2014, our first consideration was the choice of autopilot system. Autopilot failures appear to be frequent in the OBC, and given the challenge inherent in controlling relatively small aircraft in potentially high winds we considered this the most critical component of the system.
A number of COTS offerings are available, with varying degrees of automation and integration with specific airframes. Since the OBC rules disallow use of a complete off-the-shelf airframe + autopilot package, we also had to consider the difficulty of tuning the autopilot for our specific airframe.
The most interesting option was the PX4 Autopilot, developed by the Eidgenössische Technische Hochschule Zürich (ETHZ) and commercialised by 3D Robotics (manufacturers of the ArduPilot boards). The PX4 is based on an ARM Cortex M4 CPU running at 168MHz, and has a full set of consumer-grade sensors on the board. However, the software for this device was unproven, and while ETHZ have an excellent reputation, the community was not especially large at the time.
The ArduPilot APM2.5+ module was a more proven system, with excellent support and some OBC pedigree (although it should be noted that the most successful team included an ArduPilot core developer; there’s no guarantee mere mortals can get the same results). The fact that it works so well is quite frankly amazing to me, given it runs on a 16MHz 8-bit MCU with 8KB RAM. The sensors are essentially the same as the PX4, although the much less capable MCU would limit maximum sample rates.
However, the ArduPilot’s position/attitude update rate and output capability appear to be extremely limited; we calculated that the time between updates could easily introduce georeferencing errors on the order of a couple of metres, even before considering latency and jitter effects. Also, we intended to use a smaller airframe than CanberraUAV had, and it wasn’t clear that ArduPilot would be capable of dealing with the increased impact of wind and the payload drop on flight dynamics.
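The magnitude of that georeferencing error can be sketched with a small-angle estimate: a stale attitude shifts the ground projection of an image point by roughly altitude × attitude error. The altitude, rotation rate, and update rate below are illustrative assumptions, not figures from our actual analysis:

```python
# Back-of-envelope estimate of georeferencing error caused by a slow
# attitude update rate. All constants are illustrative assumptions.

ALTITUDE_M = 100.0         # assumed survey altitude
ROTATION_RATE_RAD_S = 1.0  # assumed peak rotation rate in gusts (~57 deg/s)
UPDATE_INTERVAL_S = 0.02   # assumed 50 Hz attitude update rate

# If the camera fires just before the next update, the attitude used for
# georeferencing is stale by up to one full update interval. An attitude
# error of theta radians shifts the ground projection by ~altitude * theta.
attitude_error_rad = ROTATION_RATE_RAD_S * UPDATE_INTERVAL_S
ground_error_m = ALTITUDE_M * attitude_error_rad

print(f"worst-case ground error: {ground_error_m:.1f} m")  # prints 2.0 m
```

With these assumed numbers the staleness alone accounts for the "couple of metres" figure, before any latency or jitter is added on top.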
Ultimately we decided it would be considerably more fun to develop our own system, and none of the main off-the-shelf alternatives appeared to have an insurmountable advantage in technology or support.
The first step was to divide the hardware engineering effort into individual modules, each of which could be replaced with an off-the-shelf solution if we weren’t able to make ours work in time, minimising the overall risk of failure. So, we decided on the following systems:
As a result of performance testing, we later decided the Exynos4412 (quad-core 1.7GHz ARM) was not going to be powerful enough to run our AHRS and control software; the CPU board requirements were therefore revised to include a Texas Instruments TMS320C6657 dual-core 1.25GHz DSP, with best-case performance of 40GFLOPS (10GFLOPS double-precision). This would have the added advantages of not requiring a real-time GNU/Linux installation on the Exynos, and enabling much faster flight control system restarts.
So, the AHRS and flight control systems would run on the DSP, receiving dual 1000Hz sensor data feeds from the I/O boards, and a lower-bandwidth link would run between the DSP and the Exynos module for high-level mission control (new waypoints, configuration updates) and data logging. The DSP would be responsible for triggering the main (downward-facing) camera, while the Exynos module would read the image data and match the image up with the relevant position and attitude data (timing accurate to within two milliseconds, or about 20cm at our expected altitude and maximum rotation rates).
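The image-to-attitude matching on the Exynos module can be sketched as a lookup into the 1000Hz attitude log followed by linear interpolation at the camera trigger time. This is a minimal sketch under assumed data layouts; the function name, field choices, and interpolation scheme are illustrative, not the actual implementation:

```python
import bisect

def interpolate_attitude(timestamps_ms, angles, t_ms):
    """Linearly interpolate an attitude angle at camera trigger time t_ms.

    timestamps_ms: sorted 1 kHz sample times in milliseconds (assumed layout);
    angles: the corresponding attitude angle (e.g. roll) in degrees.
    """
    i = bisect.bisect_left(timestamps_ms, t_ms)
    if i == 0:
        return angles[0]            # trigger before first sample: clamp
    if i == len(timestamps_ms):
        return angles[-1]           # trigger after last sample: clamp
    t0, t1 = timestamps_ms[i - 1], timestamps_ms[i]
    frac = (t_ms - t0) / (t1 - t0)
    return angles[i - 1] + frac * (angles[i] - angles[i - 1])

# 1 kHz log: one sample per millisecond, with a synthetic roll ramp.
ts = [0, 1, 2, 3, 4]                # ms
roll = [0.0, 0.1, 0.2, 0.3, 0.4]    # degrees

print(f"{interpolate_attitude(ts, roll, 2.5):.2f}")  # prints 0.25
```

With samples only a millisecond apart, even nearest-sample matching already meets the two-millisecond budget; interpolation simply tightens the attitude estimate further between samples.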
For the other systems, we decided to use proven COTS parts: