Contributed talk at Hybrid Life 3, July 31, 2019, 11:30 a.m., room NUBS 2.03
Insect-Inspired Visual Navigation On-Board an Autonomous Robot: Real-World Routes Encoded in a Single Layer Network
James Knight, Daniil Sakhapov, Norbert Domcsek, Alex Dewar, Paul Graham, Thomas Nowotny, Andrew Philippides
Inspired by the behaviour of ants, models of visually guided navigation that operate by scanning for previously experienced views of the world have been shown to be capable of robust route navigation. These algorithms use views experienced along a training route to train an artificial neural network (ANN) to output a measure of the familiarity of any new view; hence we refer to them as familiarity-based navigation algorithms. In this paper we show that an ANN with an Infomax learning rule is capable of delivering reliable direction information, even when scenes contain few local landmarks and high levels of noise (from changing lighting conditions, uneven terrain, etc.). Indeed, routes can be precisely recapitulated, with our robot straying an average of only 10 cm from the 6 m training path. Additionally, we show that the required computation does not increase with the number of training views, so the ANN provides a compact representation of the knowledge needed to traverse a route. Finally, rather than this compact representation necessarily losing information, there are instances where the use of an ANN ameliorates the problems of sub-optimal paths caused by very tortuous training routes. Our results therefore suggest the feasibility of familiarity-based navigation for long-range autonomous visual homing.
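The core idea above can be illustrated with a minimal sketch. The network and parameter choices here are assumptions for illustration (panorama size, learning rate, and the helper names `InfomaxNetwork` and `best_heading` are hypothetical, not the authors' code): a single-layer network is trained with an Infomax-style rule on views from the training route, so that trained views evoke a low (familiar) response; at test time, candidate headings are scanned by rotating the panoramic view and picking the least novel rotation.

```python
import numpy as np

class InfomaxNetwork:
    """Single-layer familiarity network (illustrative sketch)."""

    def __init__(self, n_inputs, n_novelty, lr=1e-3, seed=0):
        rng = np.random.default_rng(seed)
        # random weights, zero-mean and unit-norm per novelty unit
        W = rng.standard_normal((n_novelty, n_inputs))
        W -= W.mean(axis=1, keepdims=True)
        self.W = W / np.linalg.norm(W, axis=1, keepdims=True)
        self.lr = lr
        self.n_inputs = n_inputs

    def train(self, x):
        """One Infomax-style update on a training view x."""
        h = self.W @ x
        y = np.tanh(h)
        # anti-Hebbian Infomax update: shrinks the response to trained views
        self.W += (self.lr / self.n_inputs) * (self.W - np.outer(y + h, h) @ self.W)

    def novelty(self, x):
        """Summed absolute unit response: low means familiar."""
        return float(np.abs(self.W @ x).sum())

def best_heading(net, panorama):
    """Scan headings by rotating a 1-D panoramic view; return the
    pixel shift whose view the network finds most familiar."""
    scores = [net.novelty(np.roll(panorama, s)) for s in range(len(panorama))]
    return int(np.argmin(scores))
```

Because the network stores the route in a fixed weight matrix, the cost of a familiarity query does not grow with the number of training views, which is the compactness property noted in the abstract.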