Category Archives: DISSEMINATION

International Joint Conference on Neural Networks (IJCNN) July 2019

The 2019 International Joint Conference on Neural Networks (IJCNN) was held at the InterContinental Budapest Hotel in Budapest, Hungary, on 14-19 July 2019. The full program with abstracts can be found here.

This conference was attended by ULTRACEPT researchers from the University of Lincoln, Huatian Wang and Hongxin Wang.

Neural Models of Perception, Cognition and Action

Tuesday, July 16, 5:30PM-7:30PM

Hongxin Wang presented the following:

Visual Cue Integration for Small Target Motion Detection in Natural Cluttered Backgrounds [#19188]

Hongxin Wang, Jigen Peng, Qinbing Fu, Huatian Wang and Shigang Yue, University of Lincoln, United Kingdom; Guangzhou University, China.  

The robust detection of small targets against cluttered backgrounds is important for future artificial visual systems in search and tracking applications. Insects’ visual systems have demonstrated an excellent ability to avoid predators, find prey or identify conspecifics – which always appear as small dim speckles in the visual field. Building a computational model of the insects’ visual pathways could provide effective solutions for detecting small moving targets. Although a few visual system models have been proposed, they make use only of small-field visual features for motion detection, and their detection results often contain a number of false positives. To address this issue, we develop a new visual system model for small target motion detection against cluttered moving backgrounds. In contrast to existing models, small-field and wide-field visual features are separately extracted by two motion-sensitive neurons to detect small target motion and background motion. These two types of motion information are further integrated to filter out false positives. Extensive experiments showed that the proposed model outperforms existing models in terms of detection rates.
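The cue integration step described in the abstract can be sketched as follows. This is an illustrative simplification rather than the authors' implementation; the function name, the subtraction-based suppression and the parameter values are all assumptions:

```python
import numpy as np

def integrate_motion_cues(small_field, wide_field, suppression=1.0, threshold=0.5):
    """Combine small-field and wide-field motion responses.

    small_field : 2-D array of small-target motion responses (STMD-like).
    wide_field  : 2-D array of background (wide-field) motion responses.
    Responses at locations where background motion is strong are
    suppressed, filtering out false positives caused by the moving
    cluttered background.
    """
    combined = small_field - suppression * wide_field  # inhibit where background moves
    combined[combined < 0] = 0.0                       # half-wave rectification
    return combined > threshold                        # boolean detection map

# toy example: a small-target response at (2, 2), background motion elsewhere
sf = np.zeros((5, 5)); sf[2, 2] = 1.0
wf = np.full((5, 5), 0.8); wf[2, 2] = 0.1
detections = integrate_motion_cues(sf, wf)
```

With the background response suppressed, only the location at (2, 2) survives the threshold.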

Hongxin Wang presenting ‘Visual Cue Integration for Small Target Motion Detection in Natural Cluttered Backgrounds’ at the Conference on Neural Networks (IJCNN) July 2019
Plenary Poster Session POS2: Poster Session 2

Thursday, July 18, 10:00AM-11:40AM

Huatian Wang presented the following:

P333 Angular Velocity Estimation of Image Motion Mimicking the Honeybee Tunnel Centring Behaviour [#19326]

Huatian Wang, Qinbing Fu, Hongxin Wang, Jigen Peng, Paul Baxter, Cheng Hu and Shigang Yue, University of Lincoln, United Kingdom; Guangzhou University, China

Insects use visual information to estimate the angular velocity of retinal image motion, which determines a variety of flight behaviours including speed regulation, tunnel centring and visual navigation. For angular velocity estimation, honeybees show large spatial independence from the visual stimuli, whereas previous models have not achieved such an ability. To address this issue, we propose a bio-plausible model for estimating image motion velocity based on behavioural experiments of honeybees flying through patterned tunnels. The proposed model consists of three main parts: the texture estimation layer for spatial information extraction, the delay-and-correlate layer for temporal information extraction, and the decoding layer for angular velocity estimation. This model produces responses that are largely independent of spatial frequency in grating experiments, and it has been implemented in a virtual bee for tunnel centring simulations. The results coincide with both electrophysiological neuron spike and behavioural path recordings, which indicates that our proposed method provides a better explanation of the honeybee’s image motion detection mechanism guiding tunnel centring behaviour.
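The delay-and-correlate layer mentioned in the abstract is, in spirit, a Hassenstein-Reichardt-style correlator. A minimal sketch, assuming a simple two-frame, one-pixel-shift correlator (not the paper's actual model; all names here are hypothetical):

```python
import numpy as np

def delay_and_correlate(frame_prev, frame_curr, shift=1):
    """Hassenstein-Reichardt-style correlator along the horizontal axis.

    Correlates the delayed (previous-frame) signal at one photoreceptor
    with the current signal at a neighbour `shift` pixels away; the
    difference of the two mirror-symmetric arms yields a
    direction-selective motion response (positive for rightward motion).
    """
    right = frame_prev[:, :-shift] * frame_curr[:, shift:]  # delayed left x current right
    left = frame_curr[:, :-shift] * frame_prev[:, shift:]   # current left x delayed right
    return right - left

# a bright vertical bar moving one pixel to the right between frames
prev = np.zeros((3, 6)); prev[:, 2] = 1.0
curr = np.zeros((3, 6)); curr[:, 3] = 1.0
response = delay_and_correlate(prev, curr)
```

The rightward-motion arm correlates perfectly while the leftward arm stays silent, so the summed response is positive.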

Huatian Wang attending the International Joint Conference on Neural Networks (IJCNN) July 2019
Huatian Wang presenting his poster ‘Angular Velocity Estimation of Image Motion Mimicking the Honeybee Tunnel Centring Behaviour’ at the Conference on Neural Networks (IJCNN) July 2019

UK Neural Computation July 2019

The 2019 UK Neural Computation event was held at the University of Nottingham, United Kingdom, on 2-3 July 2019. The full programme can be viewed here.

This event was attended by ULTRACEPT researchers from the University of Lincoln: Hongxin Wang, Jiannan Zhao, Fang Lei, Hao Luan and Xuelong Sun.

As well as attending presentations at the event, the researchers attended a tutorial for early-stage PhD students, where they learnt a great deal about conducting research.

Hongxin states “I attended the tutorial, communicated with researchers who work in relevant fields such as computational neuroscience, and acquired new ideas for further improving the robustness of the STMD models and for simulating the feedback mechanism in the STMD neural pathways.”

Jiannan advised that he participated in the meeting discussions, covering interesting topics in the neural computing field.

Modelling the optimal integration of navigational strategies in the insect brain

Xuelong Sun presented a poster at this event. You can view Xuelong’s poster here.

Sun X, Mangan M, Yue S 

Insects are expert navigators, capable of searching out sparse food resources over large ranges in complex habitats before relocating their often hidden nesting sites. These feats are all the more impressive given the limited sensing and processing available to individual animals. Recently, significant advances have been made in identifying the brain areas driving specific navigational behaviours, and their functioning, but an overarching computational model remains elusive. In this study, we present the first biologically constrained computational model that integrates visual homing, visual compass and path integration behaviours. Specifically, we demonstrate the challenges faced when attempting to replicate visual navigation behaviours (visual compass and visual homing) using the known mushroom body (MB) anatomy, and instead propose that the central complex (CX) neuropil may compute the visual compass. We propose that the role of the mushroom body (MB) is to modulate the weighting of the path integration and visual guidance systems depending on the current context (e.g. in a familiar or unfamiliar visual surrounding). Finally, we demonstrate that optimal integration of directional cues can be achieved using a biologically realistic ring attractor network.
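The weighted integration of directional cues on a ring attractor can be illustrated with a small sketch. In steady state, a ring attractor driven by cosine-shaped cue bumps settles near the population vector of the summed input, so the sketch below decodes that vector directly rather than simulating the attractor dynamics; the function name, neuron count and cue parameters are assumptions:

```python
import numpy as np

def ring_attractor_integrate(angle_pi, w_pi, angle_vis, w_vis, n=8):
    """Integrate two directional cues on a ring of n cosine-tuned neurons.

    Each cue (path integration and visual) injects a cosine bump centred
    on its indicated direction, scaled by its weight (reliability). The
    population vector of the summed activity approximates optimal
    weighted vector-sum cue integration.
    """
    prefs = np.linspace(0, 2 * np.pi, n, endpoint=False)   # preferred directions
    activity = (w_pi * np.cos(prefs - angle_pi)
                + w_vis * np.cos(prefs - angle_vis))       # summed input bumps
    # decode the population vector of the ring
    x = np.sum(activity * np.cos(prefs))
    y = np.sum(activity * np.sin(prefs))
    return np.arctan2(y, x) % (2 * np.pi)

# path integration indicates 0 rad, vision indicates pi/2, equal weights
heading = ring_attractor_integrate(0.0, 1.0, np.pi / 2, 1.0)
```

With equal weights the decoded heading bisects the two cues; increasing one weight pulls the heading toward that cue, which is the weighting role attributed to the MB in the poster.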

Xuelong Sun at the UK Neural Computation 2019 in Nottingham ‘Modelling the optimal integration of navigational strategies in the insect brain’

ULTRACEPT Researchers Present at 15th International Conference on Artificial Intelligence Applications and Innovations (AIAI) May 2019

The 2019 15th International Conference on Artificial Intelligence Applications and Innovations (AIAI) was held on 24-26 May 2019 in Crete, Greece, at the Knossos Royal Beach Resort. The detailed program can be found here.

The conference was attended by the ULTRACEPT researchers Hongxin Wang and Jiannan Zhao from the University of Lincoln and Xingzao Ma from Lingnan Normal University.

Jiannan Zhao presenting ‘An LGMD Based Competitive Collision Avoidance Strategy for UAV’ at the 15th International Conference on Artificial Intelligence Applications and Innovations (AIAI) May 2019
AIAI Session 2: (AUV-LE) Autonomous Vehicles-Learning

Jiannan Zhao presented at this conference.

Room B: 10:30- 11:45

Jiannan Zhao, Xingzao Ma, Qinbing Fu, Cheng Hu, Shigang Yue

An LGMD Based Competitive Collision Avoidance Strategy for UAV

Abstract. Building a reliable and efficient collision avoidance system for unmanned aerial vehicles (UAVs) is still a challenging problem. This research takes inspiration from locusts, which can fly in dense swarms for hundreds of miles without collision. In the locust’s brain, the visual pathway of LGMD-DCMD (lobula giant movement detector and descending contra-lateral motion detector) has been identified as a collision perception system guiding fast collision avoidance, which makes it ideal for designing artificial vision systems. However, there are very few works investigating its potential in real-world UAV applications. In this paper, we present an LGMD based competitive collision avoidance method for UAV indoor navigation. Compared to previous works, we divide the UAV’s field of view into four subfields, each handled by an LGMD neuron. Four individual competitive LGMDs (C-LGMD) thus compete to guide the directional collision avoidance of the UAV. With more degrees of freedom than ground robots and vehicles, the UAV can escape from collision along four cardinal directions (e.g. an object approaching from the left side triggers a rightward shift of the UAV). Our proposed method has been validated by both simulations and real-time quadcopter arena experiments.
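The competitive scheme described in the abstract can be sketched roughly as follows, assuming a simple half-field split and a winner-take-all competition; the exact subfield layout and LGMD dynamics in the paper will differ, and all names here are hypothetical:

```python
import numpy as np

# escape direction is opposite to the subfield reporting the threat
ESCAPE = {"left": "right", "right": "left", "up": "down", "down": "up"}

def competitive_lgmd(excitation_map):
    """Pick an escape direction from four competing subfield LGMDs.

    `excitation_map` is a 2-D array of motion excitation (e.g. frame
    difference magnitudes). The field of view is split into four
    overlapping half-fields; the one with the strongest excitation wins
    the competition, and the UAV shifts in the opposite direction.
    """
    h, w = excitation_map.shape
    fields = {
        "left": excitation_map[:, : w // 2].sum(),
        "right": excitation_map[:, w // 2:].sum(),
        "up": excitation_map[: h // 2, :].sum(),
        "down": excitation_map[h // 2:, :].sum(),
    }
    winner = max(fields, key=fields.get)   # most threatened subfield
    return ESCAPE[winner]

# an object looming on the left side of the view triggers a rightward shift
frame = np.zeros((8, 8)); frame[3:5, 0:2] = 1.0
command = competitive_lgmd(frame)
```

Winner-take-all selection keeps the avoidance command unambiguous even when several subfields report some excitation.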

Jiannan Zhao presenting ‘An LGMD Based Competitive Collision Avoidance Strategy for UAV’ at the 15th International Conference on Artificial Intelligence Applications and Innovations (AIAI) May 2019
AIAI Session 10: (AG-MV) Agents-Machine Vision

Hongxin Wang presented at this conference.

Room C: 12:00-13:15

Constant Angular Velocity Regulation for Visually Guided Terrain Following

Huatian Wang, Qinbing Fu, Hongxin Wang, Jigen Peng, Shigang Yue

Insects use visual cues to control their flight behaviours. By estimating the angular velocity of the visual stimuli and regulating it to a constant value, honeybees can perform a terrain following task, keeping a certain height above the undulating ground. To mimic this behaviour in a bio-plausible computational structure, this paper presents a new angular velocity decoding model based on the honeybee’s behavioural experiments. The model consists of three parts: the texture estimation layer for spatial information extraction, the motion detection layer for temporal information extraction, and the decoding layer, which combines information from the previous layers to estimate the angular velocity. Compared to previous methods in this field, the proposed model produces responses largely independent of spatial frequency and contrast in grating experiments. An angular velocity based control scheme is proposed to implement the model in a bee simulated with the game engine Unity. Successful terrain following above patterned ground and flight over irregularly textured terrain show the model’s potential for terrain following by micro unmanned aerial vehicles.
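The idea of constant angular velocity regulation can be illustrated with a toy controller. Over level ground the ventral angular velocity is roughly forward speed divided by height, so holding it at a set-point makes the height track a multiple of the speed. The proportional gain and the idealized omega model below are assumptions, not the paper's controller:

```python
def regulate_height(omega_est, omega_set, height, gain=0.1):
    """One proportional step of constant angular-velocity terrain following.

    An estimated angular velocity above the set-point means the ground is
    too close, so the agent climbs; below the set-point, it descends.
    """
    return height + gain * (omega_est - omega_set)

# with forward speed 1.0 and set-point 0.5 rad/s, height settles at 2.0
h = 1.0
for _ in range(300):
    omega = 1.0 / h              # idealized ventral angular velocity: speed / height
    h = regulate_height(omega, 0.5, h)
```

Because the regulated quantity is speed/height, the same loop makes the agent descend over rising terrain and climb over falling terrain, which is the terrain following behaviour described above.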

ULTRACEPT Project Coordinator Shigang Yue invited to present at the International VDI Conference, Munich, February 2019

International VDI conference – Automotive Sensor Systems – Munich, Germany – 13-14 February 2019

Project co-ordinator, Professor Shigang Yue, was invited to speak at the International VDI (Association of German Engineers) Conference on Automotive Sensor Systems, which covered the growing importance of sensing technologies in the context of Automated and Autonomous Driving. The conference focused on innovative sensor technologies for automated and autonomous driving.

The International VDI Conference – Automotive Sensor Systems informed attendees about the future requirements of key sensor technologies. Industry experts presented solutions for testing, simulating and validating ADAS sensor systems. The event provided insights into the role of sensors in functional safety and cyber security. In addition, the conference gave an overview of the most important trends and legal issues in the ADAS sensor market.

Main topics of the conference included:

  • Challenges & Trends in ADAS Sensing Technologies (Radar, Lidar, Camera, Ultrasonic)
  • Innovations and Future Requirements for Sensor Systems
  • Functional Safety and Automotive Security Challenges
  • The Role of AI in Processing and Interpreting Sensor Data
  • The Regulatory Framework & Development of Standards
  • Excursus: Driver Monitoring
Professor Yue’s talk, as described on the conference agenda

“I was delighted to be invited to speak at the Automotive Sensor Systems Conference about the ULTRACEPT project, which aims to provide robust collision detection methods for road safety. I spoke of the research highlights of my group in the area of insect inspired algorithms and the steps required to bring it to the point of Industry application. It was a pleasure to meet key industry personnel and talk about our research outcomes supported by the ULTRACEPT project.” Shigang Yue, February 2019.