Pulse-coupled neural network performance for real-time identification of vegetation during forced landing
High Performance Computing and Research Support, Queensland University of Technology, Queensland 4001, Australia; School of Electrical Engineering and Computer Science, Queensland University of Technology, Queensland 4001, Australia; Australian Research Center for Aerospace Automation, Queensland University of Technology, Queensland 4001, Australia
ANZIAM J. 55 (EMAC2013) pp.C1–C16, 2014
@article{warne2014pulse,
title={Pulse-coupled neural network performance for real-time identification of vegetation during forced landing},
author={Warne, David James and Hayward, Ross and Kelson, Neil and Banks, Jasmine and Mejias, Luis},
journal={ANZIAM Journal},
volume={55},
pages={C1--C16},
year={2014}
}
Safety concerns in the operation of autonomous aerial systems require that safe-landing protocols be followed in situations where the mission must be aborted due to mechanical or other failure. This article presents a pulse-coupled neural network (PCNN) to assist with vegetation classification in a vision-based landing-site detection system for an unmanned aircraft. We propose a heterogeneous computing architecture and an OpenCL implementation of a PCNN feature generator. Its performance is compared across OpenCL kernels designed for CPU, GPU, and FPGA platforms. The comparison examines the compute times required for network convergence on a variety of images to determine the feasibility of real-time feature detection.
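For background, the sketch below illustrates one iteration of the simplified PCNN model commonly used for image feature generation (feeding, linking, internal activity, and dynamic threshold updates). It is a plain, single-threaded C illustration, not the paper's OpenCL kernels; the parameter names (beta, aF, aL, aT, vF, vL, vT) and the 3x3 linking neighbourhood are generic textbook choices and should be treated as assumptions.

#include <stdlib.h>
#include <math.h>

/* One iteration of a simplified pulse-coupled neural network (PCNN).
 * S: input stimulus (normalised image); F/L/U/T/Y: network state arrays,
 * all of size w*h. Returns the number of neurons that fired this step.
 * Illustrative textbook formulation only. */
static int pcnn_step(const float *S, float *F, float *L, float *U,
                     float *T, float *Y, int w, int h,
                     float beta, float aF, float aL, float aT,
                     float vF, float vL, float vT)
{
    int fired = 0;
    float *Yprev = malloc((size_t)w * h * sizeof *Yprev);
    for (int i = 0; i < w * h; ++i) Yprev[i] = Y[i];

    for (int y = 0; y < h; ++y) {
        for (int x = 0; x < w; ++x) {
            /* Sum of previous pulses in the 3x3 neighbourhood. */
            float nsum = 0.0f;
            for (int dy = -1; dy <= 1; ++dy)
                for (int dx = -1; dx <= 1; ++dx) {
                    int nx = x + dx, ny = y + dy;
                    if (nx >= 0 && nx < w && ny >= 0 && ny < h)
                        nsum += Yprev[ny * w + nx];
                }

            int i = y * w + x;
            F[i] = expf(-aF) * F[i] + vF * nsum + S[i];   /* feeding        */
            L[i] = expf(-aL) * L[i] + vL * nsum;          /* linking        */
            U[i] = F[i] * (1.0f + beta * L[i]);           /* internal act.  */
            Y[i] = (U[i] > T[i]) ? 1.0f : 0.0f;           /* pulse output   */
            T[i] = expf(-aT) * T[i] + vT * Y[i];          /* threshold      */
            fired += (int)Y[i];
        }
    }
    free(Yprev);
    return fired;
}

Iterating pcnn_step until the firing pattern stabilises gives the per-iteration firing counts that form the familiar PCNN "time signature", one way such a network can serve as a feature generator; in an OpenCL port, each neuron's update maps naturally to one work-item, which is what makes the CPU/GPU/FPGA comparison in the paper meaningful.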