All posts by comitchell

Neuromatch Conference March 2020

Based on the successful mind-matching session at the Cognitive Computational Neuroscience (CCN) conference, a free web-based unconference for neuroscientists called “neuromatch” was created.

The neuromatch 1.0 conference was held on 30th and 31st March, 2020. The conference agenda included a significant number of international speakers.

Our ULTRACEPT researcher Xuelong Sun presented his work on insect navigation at the conference. Considering the current travel restrictions caused by Covid-19, this was an excellent opportunity to continue to promote the ULTRACEPT project work in an innovative, safe and effective way.


Xuelong presented his work ‘A Decentralised Neural Model Explaining Optimal Integration Of Navigational Strategies in Insects’. Xuelong is carrying out this work with Dr Michael Mangan and Prof Shigang Yue.

A copy of Xuelong’s presentation can be accessed here.

To learn more about this research, please refer to the paper Modelling the Insect Navigation Toolkit: How the Mushroom Bodies and Central Complex Coordinate Guidance Strategies, https://doi.org/10.1101/856153.

Neuromatch conference agenda, March 2020
Xuelong Sun’s presentation at the Neuromatch conference, March 2020

Development of an Angular Velocity Decoding Model Accounting for Honeybees’ Visually Guided Flights

Huatian Wang received his BSc and MSc degrees in Applied Mathematics from Xi’an Jiaotong University in 2014 and 2017, respectively. He was awarded a Marie Curie Fellowship to work on the EU FP7 project LIVCODE (295151) as a Research Assistant in 2016.

Huatian enrolled as a PhD scholar at the University of Lincoln in January 2017. During his PhD, he carried out a 12-month secondment as an Early-Stage Researcher on the European Union’s Horizon 2020 STEP2DYNA (691154) project at Tsinghua University from 2017 to 2018. Following this, Huatian carried out further secondments under the European Union’s Horizon 2020 ULTRACEPT (778062) project from 2019 to 2020: 1 month at Guangzhou University (GZHU), then 11 months at Xi’an Jiaotong University (XJTU). His research areas include image processing, insect vision and motion detection.

Huatian Wang

I was mainly involved in ULTRACEPT Work Package 1. My research focuses on modelling the visual processing systems of flying insects such as Drosophila and honeybees. Their extraordinary ability to navigate cluttered environments provides perfect inspiration for designing artificial neural networks, which can be used to guide the visual flight of micro air vehicles.

Although insects like flies and honeybees have tiny brains, they can deal with very complex visual flight tasks. Research has been undertaken for decades to understand how they detect visual motion. However, the neural mechanisms that explain the variety of their behaviours, including patterned tunnel centring and terrain following, are still not clear. Behavioural experiments with honeybees indicate that the key to their excellent flight control is the estimation and regulation of angular velocity.

To address the fundamental problem of angular velocity estimation, we proposed a novel angular velocity decoding model that explains the honeybee’s tunnel centring and terrain following behaviours, and that reproduces the observed independence from the spatial frequency and contrast of gratings in honeybees’ visually guided flight. The model combines temporal and texture information to decode the angular velocity, and its estimates are little affected by spatial frequency and contrast in synthetic grating experiments. The model was also tested behaviourally in Unity with the tunnel centring and terrain following paradigms. A demo video can be found on YouTube here; the simulated bee flies over a textured terrain using only ventral visual information to avoid collision.
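
As a rough illustration of the decoding idea, the sketch below is an assumption-laden toy, not the published model: a first-order low-pass filter plays the role of the neural delay, a Hassenstein-Reichardt-style delay-and-correlate stage extracts the temporal motion information, and dividing by a local texture (contrast) estimate is what reduces the dependence on spatial frequency and contrast:

```python
import numpy as np

def decode_angular_velocity(frames, dt=0.01, tau=0.05):
    """Sketch of an angular-velocity decoder for a 1-D photoreceptor row.
    `frames` is a (T, N) array: T time steps of N photoreceptor signals."""
    frames = np.asarray(frames, dtype=float)

    # First-order low-pass filter plays the role of the neural delay.
    alpha = dt / (tau + dt)
    delayed = np.zeros_like(frames)
    for t in range(1, len(frames)):
        delayed[t] = delayed[t - 1] + alpha * (frames[t] - delayed[t - 1])

    # Delay-and-correlate between neighbouring photoreceptors
    # (Hassenstein-Reichardt-style opponent motion signal).
    motion = delayed[:, :-1] * frames[:, 1:] - frames[:, :-1] * delayed[:, 1:]

    # Texture estimate: mean local spatial contrast at each time step.
    contrast = np.abs(np.diff(frames, axis=1)).mean(axis=1) + 1e-6

    # Normalising the correlation response by the texture estimate is
    # what reduces the dependence on spatial frequency and contrast.
    return motion.mean(axis=1) / contrast
```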

During my secondment, I presented a poster on our work at the IJCNN 2019 conference in Budapest, which you can read about here. This gave me the opportunity to share my research with the scientific community at the conference. The picture below shows me discussing the work with other researchers during the poster session.

Huatian Wang attending the International Joint Conference on Neural Networks (IJCNN) July 2019

I also attended and presented my work at the ULTRACEPT mid-term meeting in February 2020 which you can read about here. Due to Covid-19 travel restrictions, I was not able to attend the event in person. Instead, I attended and presented via video conference.

Huatian Wang presenting at the ULTRACEPT mid-term meeting Feb 2020

These secondments have provided me with the opportunity to work with leading academics in this field of research. For example, I was able to discuss the mathematical model of elementary motion detection and signal simulation using sinusoidal gratings with Prof. Jigen Peng at GZHU, as well as the sparse reconstruction method in compressed sensing theory with Dr. Angang Cui at XJTU.

I also worked alongside fellow researchers. For example, I helped Dr. Qinbing Fu build up a database for collision detection in various automotive scenes. We collected videos using a dashboard camera and made suitable cuts using video editing software.

I also attended numerous seminars and guest lectures. For example, I attended a seminar on solving sparse linear systems using smooth approximation methods. These experiences helped me to develop my skills and knowledge and to further my research.

During the final two months of my secondment I had to work from my home in China, since the university closed due to Covid-19. However, I was able to use this time for video conference discussions with my supervisors in both Xi’an and Lincoln. I also used my desktop computer to run simulation experiments and spent time preparing academic research papers.

Thanks to the support of the ULTRACEPT project, I was able to introduce our work to other groups and attract their attention to this research field, which is helpful for improving the impact of our research.

During my one-year secondment in China, I established friendships with Prof. Peng and other colleagues at Guangzhou University and Xi’an Jiaotong University. The cooperation with colleagues at these institutions boosted the development of the neural modelling for visual navigation, and I was able to introduce the ULTRACEPT project to other researchers at GZHU and XJTU. My mathematical analysis skills improved significantly through the cooperation with Prof. Peng, and my programming skills also improved with my colleagues’ help.

Guangdong ‘Zhongchuang Cup’ Entrepreneurship and Innovation Competition

The 2019 Guangdong “Zhongchuang Cup” Entrepreneurship and Innovation Competition was held from 18th to 19th September 2019 in Jiangmen, China.

The aim of this post-doctoral innovation competition is to transform potential scientific research into business by developing doctoral talent. There were close to 400 attendees at the event, including representatives of the Ministry of Human Resources and Social Security and the Guangdong Provincial Department of Human Resources and Social Security, relevant leaders of Jiangmen City, expert judges, and postdoctoral talents.

Cheng Hu from the University of Lincoln, who is currently on secondment at Guangzhou University, competed at the event where he showcased the Colias robot platform.

Cheng Hu at the Guangdong ‘Zhongchuang Cup’ Entrepreneurship and Innovation Competition

The postdoctoral innovation competition attracted 476 projects from six strategic emerging industries, including biomedicine and general health, electronic information, new energy, energy conservation and environmental protection, new materials, the Internet and mobile Internet, and advanced manufacturing.

Out of the initial 476 projects, 328 entered the preliminary round. Of these, 60 outstanding projects advanced to the semi-finals, which included Cheng’s. As a result of advancing to the semi-finals, Cheng was invited to participate in innovation counselling training delivered by the National Postdoctoral Innovation (Jiangmen) Demonstration Center and Guangzhou Leading Human Resources Development Co. Ltd. The training included “Business Plan Writing” and “Business Project Roadshow Training”.

Award winners at the Guangdong ‘Zhongchuang Cup’ Entrepreneurship and Innovation Competition

From these 60 semi-finalists, 24 elite projects reached the finals. Cheng achieved an outstanding result and received the “winning award”, ranking 12th of 30 in the semi-final and 9th of 12 in the final.

Cheng Hu receiving an award at the Guangdong ‘Zhongchuang Cup’ Entrepreneurship and Innovation Competition

Guangdong (Foshan) Doctoral and Postdoctoral Talent Exchange and Technology Project Matchmaking Conference

Whilst on secondment at ULTRACEPT partner Guangzhou University, Xuelong Sun and Dr Cheng Hu from the University of Lincoln attended the Guangdong (Foshan) Doctoral and Postdoctoral Talent Exchange and Technology Project Matchmaking Conference.

Each city of Guangdong province is provided a space in which to display its best innovation and entrepreneurship examples of 2019. As one of the city-governed universities, Guangzhou University was asked to select its most attractive and novel research to showcase at the event. Xuelong and Cheng were recommended by Guangzhou University to give a demonstration of their ColCOSΦ system and Colias robot platform.

The event was hosted by the Guangdong Provincial Department of Human Resources and Social Security and the Foshan Municipal People’s Government. It was held on November 18, 2019 at the Tanzhou International Convention and Exhibition Center in Foshan, China, in the Guangdong-Hong Kong-Macao Greater Bay Area. Nearly 1,200 people from all over the world gathered in Foshan to participate. The opening ceremony was chaired by Qiao Yu, deputy mayor of the Foshan Municipal Government.

Xuelong Sun and Cheng Hu at the Guangdong Doctoral and Postdoctoral Talent Exchange and Technology Project Matchmaking Conference

The conference lasted two days, with the theme of “bringing talents from all over the world and creating a talent bay area” and the key aim of “promoting exchange and cooperation of post-doctoral talents and serving the transformation of post-doctoral scientific and technological achievements”.

At the event, Xuelong and Cheng gave a demonstration of their ColCOSΦ system and the Colias robot platform to the delegates and to the wider robotics and AI community. They also exchanged knowledge, ideas and interests concerning robotics and AI with researchers from related fields. Xuelong added: “It was a great chance to let more people know our platforms and research supported by the EU H2020 project”.


International Symposium on Crossmodal Learning in Humans and Robots November 2019

The International Symposium on Crossmodal Learning in Humans and Robots was held at Universität Hamburg in Hamburg, Germany, on 27-29 November 2019. You can access the symposium agenda here.

The symposium included invited talks, short updates and research highlights from the CML research projects, lab visits at the Computer Science campus, and a poster presentation with summaries from the first funding period (2016-2019). It also presented the research outlook for the second funding period (2020-2023), recently approved by the DFG.

This event included invited talks from our ULTRACEPT beneficiaries:

Wednesday, November 27, 2019

16:30-17:30 Dealing with Motion in the Dynamic World — from Insects’ Vision to Neuromorphic Sensors

  • Shigang Yue, University of Lincoln

Thursday, November 28, 2019

09:00-09:15 Transregional Collaboration Research on Crossmodal Learning in Artificial and Natural Cognitive Systems

  • Jianwei Zhang, Universität Hamburg

Friday, November 29, 2019

15:25-15:55 Torque and Visual Controlled Robot Dexterous Manipulations

  • Zhaopeng Chen, DLR/Agile Robots


IEEE 4th International Conference on Advanced Robotics and Mechatronics (ICARM) July 2019

The IEEE 4th International Conference on Advanced Robotics and Mechatronics (ICARM) was held at the Toyonaka Campus of Osaka University, Osaka, Japan, on 3rd to 5th July 2019. You can access the conference website here.

The 2019 conference was collaboratively organized by robotics researchers from Osaka University, The University of Tokyo, the Nara Institute of Science and Technology, and Ritsumeikan University, Japan. The conference provided an international forum for researchers, educators and engineers in the general areas of mechatronics, robotics, automation and sensors to disseminate their latest research results and exchange views on the future research directions of these fields.

Tian Liu presenting at the IEEE 4th International Conference on Advanced Robotics and Mechatronics (ICARM) July 2019

This conference was attended by ULTRACEPT researcher Tian Liu from the University of Lincoln. Tian presented the following research:

ColCOSΦ: A Multiple Pheromone Communication System for Swarm Robotics and Social Insects Research
Xuelong Sun, Tian Liu, Cheng Hu, Qinbing Fu and Shigang Yue (University of Lincoln)

Abstract: In the last few decades we have witnessed how the pheromones of social insects have become a rich source of inspiration for swarm robotics. By utilising virtual pheromones in physical swarm robot systems to coordinate individuals and realise direct/indirect inter-robot communication like social insects, stigmergic behaviour has emerged. However, many studies take only a single pheromone into account when solving swarm problems, which is not the case in real insects. In the real social insect world, diverse behaviours, complex collective performances and flexible transitions from one state to another are guided by different kinds of pheromones and their interactions. Therefore, whether a multiple-pheromone-based strategy can inspire swarm robotics research, and inversely how the performance of swarm robots controlled by multiple pheromones brings inspiration to explain social insects’ behaviours, becomes an interesting question. Thus, to provide a reliable system for undertaking multiple-pheromone studies, in this paper we specifically proposed and realised a multiple pheromone communication system called ColCOSΦ. This system consists of a virtual pheromone sub-system, wherein the multiple pheromones are represented by a colour image displayed on a screen, and a micro-robot platform designed for swarm robotics applications. Two case studies are undertaken to verify the effectiveness of this system: one on multiple-pheromone-based ant foraging, and another on the interactions of aggregation and alarm pheromones. The experimental results demonstrate the feasibility of ColCOSΦ and its great potential in directing swarm robotics and social insect research.
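
To give a feel for the virtual pheromone idea, here is a minimal sketch, not the published implementation: the grid size, decay and diffusion rates, and channel assignments below are all assumptions. Each pheromone type occupies one colour channel of the image shown on the arena screen; the field is updated by deposition, diffusion and evaporation while robots read the local colour under them:

```python
import numpy as np

# Hypothetical parameters: one colour channel per pheromone type.
H, W = 480, 640               # arena image resolution (assumed)
EVAPORATION = 0.995           # per-step decay factor (assumed)
DIFFUSION = 0.1               # fraction spread to neighbours (assumed)

field = np.zeros((H, W, 3))   # R = alarm, G = food trail, B = home trail

def deposit(x, y, channel, amount=1.0):
    """A robot (or simulated ant) drops pheromone at its position."""
    field[y, x, channel] += amount

def step():
    """Advance the pheromone field by one time step."""
    global field
    # Simple 4-neighbour diffusion via shifted copies of the field.
    neighbours = (np.roll(field, 1, 0) + np.roll(field, -1, 0) +
                  np.roll(field, 1, 1) + np.roll(field, -1, 1)) / 4.0
    field = (1 - DIFFUSION) * field + DIFFUSION * neighbours
    field *= EVAPORATION      # evaporation: old trails fade out

def sense(x, y, channel):
    """A robot's ground-facing sensor reads the local intensity."""
    return field[y, x, channel]
```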

Tian Liu presenting at the IEEE 4th International Conference on Advanced Robotics and Mechatronics (ICARM) July 2019

Dr Shyamala Doraisamy Featured in the EURAXESS ASEAN Newsletter

Shyamala Doraisamy is an Associate Professor at the Department of Multimedia, Faculty of Computer Science and Information Technology, Universiti Putra Malaysia (UPM). UPM is a partner in the ULTRACEPT project and Dr. Doraisamy is UPM’s Partner Lead.

Dr. Doraisamy was featured in the final EURAXESS ASEAN Newsletter for 2019. The article explained how UPM became involved in the ULTRACEPT project and what its role has been in the consortium.

Dr. Doraisamy received her PhD from Imperial College London in 2004, specializing in the field of Music Information Retrieval, and won an award for her music and computing innovation at the Invention and New Product Exposition (INPEX), Pittsburgh, USA in 2007. Her research interests include multimedia information processing, with a particular focus on sound analysis, and she has completed several projects on music and health applications. She has been an invited speaker at various conferences and research meetings internationally.

Dr. Doraisamy is an active member of the Malaysian Information Retrieval and Knowledge Management Society and was the Chair of the 2018 IEEE International Conference on Information Retrieval and Knowledge Management (CAMP’18).

During 2019, Dr. Doraisamy has been on secondment at the University of Lincoln (UoL) along with Early Stage Researcher (ESR) Siavash Bahrami and Experienced Researcher (ER) Azreen Azman.

Shyamala with UPM researchers Azreen Azman (L) and Siavash Bahrami (R)

UPM has been working on the project theme of road safety. The tasks assigned to UPM were mainly based on work packages WP2, WP3 and WP4. The team has focused in particular on contributions to task 2.3 in WP2, ‘To develop long range hazard perception methods coping with low light conditions’.

Dr. Doraisamy’s secondment has included initial meetings with partners and the completion of proposal discussions for collaborative PhD research with Siavash Bahrami. The tasks completed have been based on this collaborative PhD research, which is co-supervised by UPM and UoL. The research investigates the use of sound data for estimating road wetness levels, to support the development of long range hazard perception methods coping with low light conditions. You can read more about Siavash’s research here.

The team will continue to utilise audiovisual technologies towards the development of Brain-inspired vision systems for long-range hazard perception (WP2).

The article in the EURAXESS ASEAN Newsletter also highlights how participation in an MSCA-RISE can be beneficial for Malaysian research groups and Dr. Doraisamy also provides advice on getting involved in future RISE consortia.


International Joint Conference on Neural Networks (IJCNN) July 2019

The 2019 International Joint Conference on Neural Networks (IJCNN) was held at the InterContinental Budapest Hotel in Budapest, Hungary, on 14-19 July 2019. The full program with abstracts can be found here.

This conference was attended by ULTRACEPT researchers Huatian Wang and Hongxin Wang from the University of Lincoln.

Neural Models of Perception, Cognition and Action

Tuesday, July 16, 5:30PM-7:30PM

Hongxin Wang presented the following:

Visual Cue Integration for Small Target Motion Detection in Natural Cluttered Backgrounds [#19188]

Hongxin Wang, Jigen Peng, Qinbing Fu, Huatian Wang and Shigang Yue, University of Lincoln, United Kingdom; Guangzhou University, China.  

The robust detection of small targets against cluttered backgrounds is important for future artificial visual systems in searching and tracking applications. Insects’ visual systems have demonstrated an excellent ability to avoid predators, find prey or identify conspecifics, which always appear as small dim speckles in the visual field. Building a computational model of the insects’ visual pathways could provide effective solutions for detecting small moving targets. Although a few visual system models have been proposed, they make use only of small-field visual features for motion detection, and their detection results often contain a number of false positives. To address this issue, we develop a new visual system model for small target motion detection against cluttered moving backgrounds. In contrast to the existing models, small-field and wide-field visual features are separately extracted by two motion-sensitive neurons to detect small target motion and background motion. These two types of motion information are then integrated to filter out false positives. Extensive experiments showed that the proposed model can outperform the existing models in terms of detection rates.
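
As a toy illustration of the integration step (a sketch only; the published model's neurons and filtering are far richer, and the names, patch size and threshold below are invented for illustration), background motion estimated over the whole field can veto the small-field responses and thereby suppress false positives:

```python
import numpy as np

def detect_small_targets(prev, curr, patch=5, veto=0.5):
    """Toy sketch: small-field motion responses are kept only when
    wide-field (background) motion is weak, filtering false positives.
    `prev` and `curr` are consecutive grayscale frames (2-D arrays)."""
    diff = np.abs(curr.astype(float) - prev.astype(float))

    # Wide-field neuron: overall background motion level.
    wide_field = diff.mean()

    # Small-field neuron: local motion energy in small patches.
    h, w = diff.shape
    small_field = np.zeros_like(diff)
    for y in range(0, h - patch + 1, patch):
        for x in range(0, w - patch + 1, patch):
            small_field[y:y + patch, x:x + patch] = \
                diff[y:y + patch, x:x + patch].mean()

    # Integration: strong background motion vetoes the small-field output.
    return small_field if wide_field < veto else np.zeros_like(small_field)
```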

Hongxin Wang presenting ‘Visual Cue Integration for Small Target Motion Detection in Natural Cluttered Backgrounds’ at the International Joint Conference on Neural Networks (IJCNN) July 2019
Plenary Poster Session POS2: Poster Session 2

Thursday, July 18, 10:00AM-11:40AM

Huatian Wang presented the following:

P333 Angular Velocity Estimation of Image Motion Mimicking the Honeybee Tunnel Centring Behaviour [#19326]

Huatian Wang, Qinbing Fu, Hongxin Wang, Jigen Peng, Paul Baxter, Cheng Hu and Shigang Yue, University of Lincoln, United Kingdom; Guangzhou University, China

Insects use visual information to estimate the angular velocity of retinal image motion, which determines a variety of flight behaviours including speed regulation, tunnel centring and visual navigation. For angular velocity estimation, honeybees show large independence from the spatial structure of visual stimuli, an ability that previous models have not fulfilled. To address this issue, we propose a bio-plausible model for estimating the image motion velocity based on behavioural experiments of honeybees flying through patterned tunnels. The proposed model contains three main parts: the texture estimation layer for spatial information extraction, the delay-and-correlate layer for temporal information extraction, and the decoding layer for angular velocity estimation. This model produces responses that are largely independent of the spatial frequency in grating experiments, and it has been implemented in a virtual bee for tunnel centring simulations. The results coincide with both electro-physiological neuron spike and behavioural path recordings, which indicates that our proposed method provides a better explanation of the honeybee’s image motion detection mechanism guiding the tunnel centring behaviour.

Huatian Wang presenting his poster ‘Angular Velocity Estimation of Image Motion Mimicking the Honeybee Tunnel Centring Behaviour’ at the International Joint Conference on Neural Networks (IJCNN) July 2019

UK Neural Computation July 2019

The 2019 UK Neural Computation event was held at the University of Nottingham, United Kingdom, on 2nd-3rd July 2019. The full programme can be viewed here.

This event was attended by ULTRACEPT researchers from the University of Lincoln: Hongxin Wang, Jiannan Zhao, Fang Lei, Hao Luan and Xuelong Sun.

As well as attending presentations at the event, the researchers attended a tutorial for early-stage PhD students, where they learnt a great deal about doing research.

Hongxin states: “I attended the tutorial, communicated with researchers who work in relevant fields such as computational neuroscience, and acquired new ideas for further improving the robustness of the STMD models and how to simulate feedback mechanisms in the STMD neural pathways.”

Jiannan advised that he participated in the meeting discussions, covering interesting topics in the neural computing field.

Modelling the optimal integration of navigational strategies in the insect brain

Xuelong Sun presented a poster at this event. You can view Xuelong’s poster here.

Sun X, Mangan M, Yue S 

Insects are expert navigators capable of searching out sparse food resources over large ranges in complex habitats before relocating their often hidden nesting sites. These feats are all the more impressive given the limited sensing and processing available to individual animals. Recently, significant advances have been made in identifying the brain areas driving specific navigational behaviours, and their functioning, but an overarching computational model remains elusive. In this study, we present the first biologically constrained, computational model that integrates visual homing, visual compass and path integration behaviours. Specifically, we demonstrate the challenges faced when attempting to replicate visual navigation behaviours (visual compass and visual homing) using the known mushroom body (MB) anatomy, and instead propose that the central complex (CX) neuropil may compute the visual compass. We propose that the role of the MB is to modulate the weighting of the path integration and visual guidance systems depending on the current context (e.g. in a familiar or unfamiliar visual surrounding). Finally, we demonstrate that optimal integration of directional cues can be achieved using a biologically realistic ring attractor network.
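
For intuition about the final claim, here is a schematic of optimal directional cue integration, not the paper's network: each cue is treated as a vector whose length encodes its reliability, and the integrated heading is the direction of their sum, which is the state a ring attractor receiving both inputs settles into. The reliability values below are made up for the example:

```python
import numpy as np

def integrate_cues(theta1, kappa1, theta2, kappa2):
    """Sketch of optimal directional cue integration: each cue is a unit
    vector scaled by its reliability; the combined heading is the angle
    of the vector sum (what a ring attractor fed both cues settles on)."""
    v = (kappa1 * np.array([np.cos(theta1), np.sin(theta1)]) +
         kappa2 * np.array([np.cos(theta2), np.sin(theta2)]))
    return np.arctan2(v[1], v[0])

# Example: path integration says 0 rad (reliability 2.0), visual homing
# says pi/4 rad (reliability 1.0); the integrated heading lies between
# the two, closer to the more reliable cue.
print(integrate_cues(0.0, 2.0, np.pi / 4, 1.0))
```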

Xuelong Sun at the UK Neural Computation 2019 in Nottingham ‘Modelling the optimal integration of navigational strategies in the insect brain’


15th International Conference on Artificial Intelligence Applications and Innovations (AIAI) May 2019

The 15th International Conference on Artificial Intelligence Applications and Innovations (AIAI) was held on 24th-26th May 2019 in Crete, Greece, at the Knossos Royal Beach Resort. The detailed program can be found here.

The conference was attended by the ULTRACEPT researchers Hongxin Wang and Jiannan Zhao from the University of Lincoln and Xingzao Ma from Lingnan Normal University.

Jiannan Zhao presenting ‘An LGMD Based Competitive Collision Avoidance Strategy for UAV’ at the 15th International Conference on Artificial Intelligence Applications and Innovations (AIAI) May 2019
AIAI Session 2: (AUV-LE) Autonomous Vehicles-Learning

Jiannan Zhao presented at this conference.

Room B: 10:30-11:45

An LGMD Based Competitive Collision Avoidance Strategy for UAV

Jiannan Zhao, Xingzao Ma, Qinbing Fu, Cheng Hu and Shigang Yue

Abstract. Building a reliable and efficient collision avoidance system for unmanned aerial vehicles (UAVs) is still a challenging problem. This research takes inspiration from locusts, which can fly in dense swarms for hundreds of miles without collision. In the locust’s brain, a visual pathway of LGMD-DCMD (lobula giant movement detector and descending contra-lateral motion detector) has been identified as a collision perception system guiding fast collision avoidance for locusts, which is ideal for designing artificial vision systems. However, there are very few works investigating its potential in real-world UAV applications. In this paper, we present an LGMD based competitive collision avoidance method for UAV indoor navigation. Compared to previous works, we divided the UAV’s field of view into four subfields, each handled by an LGMD neuron. Therefore, four individual competitive LGMDs (C-LGMD) compete to guide the directional collision avoidance of the UAV. With more degrees of freedom compared to ground robots and vehicles, the UAV can escape from collision along four cardinal directions (e.g. an object approaching from the left side triggers a rightward shift of the UAV). Our proposed method has been validated by both simulations and real-time quadcopter arena experiments.
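
A schematic of the competitive scheme described above (a toy sketch, not the paper's LGMD model: simple frame-difference energy stands in for the real neuron dynamics, and the subfield layout and threshold are assumptions):

```python
import numpy as np

def competitive_lgmd_escape(prev, curr, threshold=0.2):
    """Toy competitive collision avoidance: frame-difference energy in
    each of four subfields stands in for an LGMD neuron's excitation;
    the most excited subfield wins and dictates the escape direction."""
    diff = np.abs(curr.astype(float) - prev.astype(float))
    h, w = diff.shape
    # One LGMD-like unit per subfield (layout assumed for this sketch).
    excitation = {
        "left":  diff[:, : w // 2].mean(),
        "right": diff[:, w // 2 :].mean(),
        "up":    diff[: h // 2, :].mean(),
        "down":  diff[h // 2 :, :].mean(),
    }
    winner = max(excitation, key=excitation.get)
    if excitation[winner] < threshold:
        return "keep course"              # nothing looming strongly enough
    # Escape opposite the most excited subfield, e.g. a threat on the
    # left triggers a rightward shift, as in the paper's example.
    return {"left": "shift right", "right": "shift left",
            "up": "shift down", "down": "shift up"}[winner]
```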

Jiannan Zhao presenting ‘An LGMD Based Competitive Collision Avoidance Strategy for UAV’ at the 15th International Conference on Artificial Intelligence Applications and Innovations (AIAI) May 2019
AIAI Session 10: (AG-MV) Agents-Machine Vision

Hongxin Wang presented at this conference.

Room C: 12:00-13:15

Constant Angular Velocity Regulation for Visually Guided Terrain Following

Huatian Wang, Qinbing Fu, Hongxin Wang, Jigen Peng, Shigang Yue

Insects use visual cues to control their flight behaviours. By estimating the angular velocity of the visual stimuli and regulating it to a constant value, honeybees can perform a terrain following task, keeping a certain height above the undulating ground. To mimic this behaviour in a bio-plausible computational structure, this paper presents a new angular velocity decoding model based on the honeybee’s behavioural experiments. The model consists of three parts: the texture estimation layer for spatial information extraction, the motion detection layer for temporal information extraction, and the decoding layer combining information from the previous layers to estimate the angular velocity. Compared to previous methods in this field, the proposed model produces responses largely independent of the spatial frequency and contrast in grating experiments. An angular velocity based control scheme is proposed to implement the model in a bee simulated in the Unity game engine. The perfect terrain following above patterned ground and successful flight over irregularly textured terrain show its potential for terrain following by micro unmanned aerial vehicles.
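
To make the control scheme concrete, here is a minimal sketch, with the gain, set-point and decoded-velocity stand-in all assumed rather than taken from the paper: the bee's altitude is adjusted so that the decoded ventral angular velocity stays at a constant set-point, which automatically tracks the terrain, since ground image motion scales roughly as forward speed divided by height:

```python
def terrain_following_step(height, decoded_av, target_av=1.0, gain=0.1):
    """One control step of constant angular-velocity regulation.
    If the ground texture streams past faster than the set-point, the
    ground is too close, so climb; if slower, descend. Units arbitrary."""
    error = decoded_av - target_av   # positive: too low, negative: too high
    return height + gain * error

# Example: the ventral angular velocity of ground image motion scales
# roughly as forward_speed / height, so regulating it to a constant
# drives the height towards forward_speed / target_av.
height, forward_speed = 2.0, 5.0
for _ in range(500):
    decoded_av = forward_speed / height   # stand-in for the model's estimate
    height = terrain_following_step(height, decoded_av)
print(round(height, 2))                   # approaches 5.0
```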