Huatian Wang publishes paper in Neural Networks

Huatian Wang enrolled as a PhD scholar at the University of Lincoln in January 2017. During his PhD, he carried out a 12-month secondment as an Early-Stage Researcher on the European Union's Horizon 2020 STEP2DYNA (691154) project at Tsinghua University from 2017 to 2018. Following this, he carried out further secondments under the European Union's Horizon 2020 ULTRACEPT (778062) project from 2019 to 2020: one month at Guangzhou University (GZHU), followed by 11 months at Xi'an Jiaotong University (XJTU). His research areas include image processing, insect vision and motion detection.

University of Lincoln researcher Huatian Wang recently published a paper titled "A bioinspired angular velocity decoding neural network model for visually guided flights" in Neural Networks. Neural Networks is the archival journal of the world's three oldest neural modeling societies: the International Neural Network Society (INNS), the European Neural Network Society (ENNS), and the Japanese Neural Network Society (JNNS). It is highly influential in neuroscience, especially cognitive neuroscience.

About the Paper

Efficient and robust motion perception systems are important prerequisites for achieving visually guided flight in future micro air vehicles. As a source of inspiration, the visual neural networks of flying insects such as the honeybee and Drosophila provide ideal examples on which to base artificial motion perception models. In our paper "A bioinspired angular velocity decoding neural network model for visually guided flights", we use this approach to develop a novel method that addresses the fundamental problem of estimating angular velocity for visually guided flight.

Compared with previous models, our elementary motion detector (EMD) based model uses a separate texture estimation pathway to effectively decode angular velocity, and demonstrates considerable independence from the spatial frequency and contrast of the gratings.
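As background, EMD-based models of this kind build on the classic Hassenstein–Reichardt correlator, which multiplies the delayed signal from one photoreceptor with the undelayed signal from its neighbour and subtracts the mirror-symmetric arm to obtain a direction-opponent output. A minimal sketch of that basic detector (the function name, discrete delay, and zero-padding are illustrative assumptions, not the paper's implementation, which adds a separate texture estimation pathway on top):

```python
import numpy as np

def hassenstein_reichardt_emd(left, right, delay=1):
    """Minimal Hassenstein-Reichardt elementary motion detector.

    left, right: 1-D luminance signals from two adjacent photoreceptors,
                 sampled over time.
    delay:       temporal delay (in samples) applied to the cross arm.

    Returns the EMD output over time: on average, positive values indicate
    left-to-right motion and negative values right-to-left motion.
    """
    left = np.asarray(left, dtype=float)
    right = np.asarray(right, dtype=float)
    # Delay each input by shifting it forward in time (zero-padded at the start).
    left_delayed = np.concatenate([np.zeros(delay), left[:-delay]])
    right_delayed = np.concatenate([np.zeros(delay), right[:-delay]])
    # Correlate the delayed signal of one arm with the undelayed signal of the
    # other, then subtract the mirror arm for a direction-opponent response.
    return left_delayed * right - right_delayed * left
```

For a drifting sinusoidal grating, the time-averaged output of this detector depends on both the spatial frequency and the contrast of the grating, which is exactly the dependence the paper's texture estimation pathway is designed to remove.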

Using the Unity development platform, the model is further tested in tunnel centering and terrain following paradigms in order to reproduce the visually guided flight behaviors of honeybees. In a series of controlled trials, the virtual bee uses the proposed angular velocity control schemes to navigate accurately through a patterned tunnel and to maintain a suitable distance from undulating textured terrain. The results are consistent with both neuron spike recordings and behavioral path recordings of real honeybees, demonstrating the model's potential for implementation in micro air vehicles that carry only visual sensors.
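The two paradigms can be summarised as follows: tunnel centering balances the angular velocities seen by the two eyes, while terrain following holds the ventral angular velocity near a set-point. A minimal sketch of proportional control laws in this spirit (the gains, sign conventions, and function names are illustrative assumptions, not the control schemes as implemented in the paper):

```python
def tunnel_centering_command(omega_left, omega_right, gain=0.5):
    """Steering command for tunnel centering.

    The nearer wall produces the higher angular velocity, so steering away
    from it balances the two sides. Convention (assumed): positive output
    means turn toward the left wall.
    """
    return gain * (omega_right - omega_left)

def terrain_following_command(omega_ventral, omega_setpoint, gain=0.5):
    """Climb-rate command for terrain following.

    If the ventral angular velocity exceeds the set-point, the ground is
    too close, so the command is positive (climb); below the set-point,
    it is negative (descend).
    """
    return gain * (omega_ventral - omega_setpoint)
```

With these conventions, a bee flying closer to the left wall (higher left-side angular velocity) receives a rightward steering command, and a bee skimming too low over the terrain receives a climb command.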

About the Research Experience

Huatian shares his recent research experience which contributed to this publication. 

2020 was a difficult year for all of us. After a one-year secondment in China funded by the EU Horizon 2020 project ULTRACEPT, I had to stay in China due to the travel restrictions. Thanks to the university's policy, I could apply to work remotely from home to continue my research. My supervisor, Prof Shigang Yue, organized an online group meeting every week so that we could talk with each other freely. This benefited my study a lot, and I was able to make progress every week and update my research regularly.

Publication in Neural Networks is an encouragement for me to continue my research on modeling the visual systems of insects. I am grateful for the support I received from the ULTRACEPT project, and for the kind support of my supervisor Prof Shigang Yue and my research colleagues.

A Group Meeting Photo

This paper is available as open access:

Huatian Wang, Qinbing Fu, Hongxing Wang, Paul Baxter, Jigen Peng, Shigang Yue, A bioinspired angular velocity decoding neural network model for visually guided flights, Neural Networks, 2020, ISSN 0893-6080.
