
Xuelong Sun presents at Neuromatch Conference March 2020

Building on the successful mind-matching session at the Cognitive Computational Neuroscience (CCN) conference, a free web-based unconference for neuroscientists called "neuromatch" was created.

The neuromatch 1.0 conference was held on 30th and 31st March 2020. The conference agenda featured a large number of international speakers.

Our ULTRACEPT researcher Xuelong Sun presented his work on insect navigation at the conference. Given the travel restrictions caused by Covid-19, this was an excellent opportunity to continue promoting the ULTRACEPT project's work in an innovative, safe and effective way.


Xuelong presented his work ‘A Decentralised Neural Model Explaining Optimal Integration Of Navigational Strategies in Insects’. Xuelong is carrying out this work with Dr Michael Mangan and Prof Shigang Yue.

A copy of Xuelong’s presentation can be accessed here.

To learn more about this research, please refer to the paper 'Modelling the Insect Navigation Toolkit: How the Mushroom Bodies and Central Complex Coordinate Guidance Strategies', https://doi.org/10.1101/856153.

Neuromatch conference agenda, March 2020
Xuelong Sun's presentation at the Neuromatch conference, March 2020

ULTRACEPT researchers invited to speak at International Symposium on Crossmodal Learning in Humans and Robots November 2019

The International Symposium on Crossmodal Learning in Humans and Robots was held at Universität Hamburg in Hamburg, Germany, on 27-29 November 2019. You can access the symposium agenda here.

The symposium included invited talks, short updates and research highlights from the CML research projects, lab visits at the Computer Science campus, and a poster session summarising the first funding period (2016-2019). The research outlook for the second funding period (2020-2023), recently approved by the DFG, was also presented.

This event included invited talks from our ULTRACEPT Beneficiaries:

Wednesday, November 27, 2019

16:30-17:30 Dealing with Motion in the Dynamic World — from Insects’ Vision to Neuromorphic Sensors

  • Shigang Yue, University of Lincoln

Thursday, November 28, 2019

09:00-09:15 Transregional Collaboration Research on Crossmodal Learning in Artificial and Natural Cognitive Systems

  • Jianwei Zhang, Universität Hamburg

Friday, November 29, 2019

15:25-15:55 Torque and Visual Controlled Robot Dexterous Manipulations

  • Zhaopeng Chen, DLR/Agile Robots


Tian Liu presents at IEEE 4th International Conference on Advanced Robotics and Mechatronics (ICARM) July 2019

The IEEE 4th International Conference on Advanced Robotics and Mechatronics (ICARM) was held at the Toyonaka Campus of Osaka University, Osaka, Japan, on 3-5 July 2019. You can access the conference website here.

The 2019 conference was jointly organised by robotics researchers from Osaka University, The University of Tokyo, Nara Institute of Science and Technology, and Ritsumeikan University, Japan. It provided an international forum for researchers, educators and engineers in the general areas of mechatronics, robotics, automation and sensing to disseminate their latest research results and exchange views on future research directions in these fields.

Tian Liu presenting at the IEEE 4th International Conference on Advanced Robotics and Mechatronics (ICARM), July 2019

This conference was attended by ULTRACEPT researcher Tian Liu from the University of Lincoln. Tian presented the following research:

X. Sun, T. Liu, C. Hu, Q. Fu and S. Yue, “ColCOS Φ: A Multiple Pheromone Communication System for Swarm Robotics and Social Insects Research,” 2019 IEEE 4th International Conference on Advanced Robotics and Mechatronics (ICARM), Toyonaka, Japan, 2019, pp. 59-66, doi: 10.1109/ICARM.2019.8833989.

Abstract: In the last few decades we have witnessed how the pheromones of social insects have become a rich source of inspiration for swarm robotics. By utilising virtual pheromones in physical swarm robot systems to coordinate individuals and realise direct/indirect inter-robot communication, as social insects do, stigmergic behaviour has emerged. However, many studies take only a single pheromone into account when solving swarm problems, which is not the case for real insects. In the real social insect world, diverse behaviours, complex collective performances and flexible transitions from one state to another are guided by different kinds of pheromones and their interactions. Whether a multiple-pheromone strategy can inspire swarm robotics research, and conversely how the performance of swarm robots controlled by multiple pheromones can help explain social insects' behaviours, therefore becomes an interesting question. To provide a reliable system for such multiple-pheromone studies, this paper proposes and realises a multiple pheromone communication system called ColCOS Φ. The system consists of a virtual pheromone sub-system, in which the multiple pheromones are represented by a colour image displayed on a screen, and the Colias IV micro-robot platform designed for swarm robotics applications. Two case studies verify the effectiveness of the system: one on multiple-pheromone-based ant foraging, and another on the interaction of aggregation and alarm pheromones. The experimental results demonstrate the feasibility of ColCOS Φ and its great potential for directing swarm robotics and social insect research.
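
To make the multiple-pheromone idea concrete, here is a minimal Python sketch of a grid-based virtual pheromone field with per-channel evaporation and diffusion, roughly in the spirit of a screen-rendered pheromone arena. This is not the ColCOS Φ implementation; the class, channel names and rate parameters are illustrative assumptions only.

    import numpy as np

    class PheromoneField:
        """Toy multi-channel virtual pheromone grid (illustrative only).

        Each channel (e.g. 'food', 'alarm') evaporates and diffuses at
        its own rate, so distinct pheromones can coexist in one arena.
        """

        def __init__(self, shape=(64, 64), channels=("food", "alarm"),
                     evaporation=(0.98, 0.90), diffusion=(0.05, 0.15)):
            self.grid = {c: np.zeros(shape) for c in channels}
            self.evap = dict(zip(channels, evaporation))
            self.diff = dict(zip(channels, diffusion))

        def deposit(self, channel, x, y, amount=1.0):
            self.grid[channel][y, x] += amount  # a robot lays pheromone here

        def step(self):
            for c, g in self.grid.items():
                # 4-neighbour diffusion, then exponential evaporation
                spread = (np.roll(g, 1, 0) + np.roll(g, -1, 0) +
                          np.roll(g, 1, 1) + np.roll(g, -1, 1)) / 4.0
                g[:] = self.evap[c] * ((1 - self.diff[c]) * g + self.diff[c] * spread)

        def sense(self, channel, x, y):
            return self.grid[channel][y, x]  # what a ground-facing sensor reads

    field = PheromoneField()
    field.deposit("food", 32, 32)
    for _ in range(10):
        field.step()
    print(field.sense("food", 32, 32))

Giving each channel its own evaporation and diffusion rate is what allows, say, a long-lived foraging trail and a fast-fading alarm signal to interact in the same arena, which is the kind of multi-pheromone interaction the paper studies.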

Tian Liu presenting at the IEEE 4th International Conference on Advanced Robotics and Mechatronics (ICARM), July 2019

International Joint Conference on Neural Networks (IJCNN) July 2019

The 2019 International Joint Conference on Neural Networks (IJCNN) was held at the InterContinental Budapest Hotel in Budapest, Hungary, on 14-19 July 2019. The full program with abstracts can be found here.

This conference was attended by ULTRACEPT researchers from the University of Lincoln, Huatian Wang and Hongxin Wang.

Neural Models of Perception, Cognition and Action

Tuesday, July 16, 5:30PM-7:30PM

Hongxin Wang presented the following:

Visual Cue Integration for Small Target Motion Detection in Natural Cluttered Backgrounds [#19188]

Hongxin Wang, Jigen Peng, Qinbing Fu, Huatian Wang and Shigang Yue, University of Lincoln, United Kingdom; Guangzhou University, China.  

The robust detection of small targets against cluttered backgrounds is important for future artificial visual systems in search and tracking applications. Insects' visual systems have demonstrated an excellent ability to avoid predators, find prey and identify conspecifics, which always appear as small dim speckles in the visual field. Building a computational model of the insects' visual pathways could provide effective solutions for detecting small moving targets. Although a few visual system models have been proposed, they make use only of small-field visual features for motion detection, and their detection results often contain a number of false positives. To address this issue, we develop a new visual system model for small target motion detection against cluttered moving backgrounds. In contrast to the existing models, small-field and wide-field visual features are separately extracted by two motion-sensitive neurons to detect small target motion and background motion. These two types of motion information are then integrated to filter out false positives. Extensive experiments showed that the proposed model outperforms the existing models in terms of detection rates.
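
The integration step described above, suppressing small-field responses wherever wide-field (background) motion is strong, can be illustrated with a toy Python sketch. This is not the published model; the divisive suppression, function name and threshold are assumptions made purely for illustration.

    import numpy as np

    def integrate_motion_cues(small_field, wide_field, suppression=2.0, threshold=0.5):
        """Toy integration of small-field and wide-field motion maps.

        small_field: 2-D response map of a small-target-sensitive neuron.
        wide_field:  2-D map of background (wide-field) motion strength.
        Small-target responses are divisively suppressed where background
        motion is strong, then thresholded to yield detections.
        """
        combined = small_field / (1.0 + suppression * wide_field)
        return combined > threshold

    rng = np.random.default_rng(0)
    small = rng.random((8, 8))       # stand-in for a small-field response map
    background = rng.random((8, 8))  # stand-in for wide-field motion strength
    print(integrate_motion_cues(small, background).sum(), "candidate detections")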

Hongxin Wang presenting 'Visual Cue Integration for Small Target Motion Detection in Natural Cluttered Backgrounds' at the International Joint Conference on Neural Networks (IJCNN), July 2019
Plenary Poster Session POS2: Poster Session 2

Thursday, July 18, 10:00AM-11:40AM

Huatian Wang presented the following:

P333 Angular Velocity Estimation of Image Motion Mimicking the Honeybee Tunnel Centring Behaviour [#19326]

Huatian Wang, Qinbing Fu, Hongxin Wang, Jigen Peng, Paul Baxter, Cheng Hu and Shigang Yue, University of Lincoln, United Kingdom; Guangzhou University, China

Insects use visual information to estimate the angular velocity of retinal image motion, which determines a variety of flight behaviours including speed regulation, tunnel centring and visual navigation. For angular velocity estimation, honeybees show large spatial independence of the visual stimuli, an ability previous models have not reproduced. To address this issue, we propose a bio-plausible model for estimating image motion velocity based on behavioural experiments with honeybees flying through patterned tunnels. The proposed model contains three main parts: a texture estimation layer for spatial information extraction, a delay-and-correlate layer for temporal information extraction, and a decoding layer for angular velocity estimation. The model produces responses that are largely independent of spatial frequency in grating experiments, and it has been implemented in a virtual bee for tunnel centring simulations. The results coincide with both electrophysiological neuron spike recordings and behavioural path recordings, indicating that our proposed method provides a better explanation of the honeybee's image motion detection mechanism guiding the tunnel centring behaviour.
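
The delay-and-correlate stage can be illustrated with a classic Hassenstein-Reichardt-style correlator. This is a minimal sketch rather than the paper's model, which adds the texture estimation and decoding layers precisely to remove the spatial-frequency dependence a bare correlator suffers from; all names and parameters below are assumptions.

    import numpy as np

    def delay_and_correlate(signal_a, signal_b, delay=1):
        """Toy correlator over two neighbouring photoreceptor signals.

        A delayed copy of each input is multiplied with the undelayed
        other; the opponent difference is direction-selective and its
        magnitude grows with image speed (over a limited range).
        """
        a_delayed = np.concatenate([np.zeros(delay), signal_a[:-delay]])
        b_delayed = np.concatenate([np.zeros(delay), signal_b[:-delay]])
        return np.mean(a_delayed * signal_b - b_delayed * signal_a)

    # A sinusoidal grating drifting past two spatially offset receptors:
    t = np.linspace(0, 4 * np.pi, 200)
    for speed in (1.0, 2.0):
        a = np.sin(speed * t)
        b = np.sin(speed * t - 0.5)  # phase offset from receptor spacing
        print(speed, delay_and_correlate(a, b))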

Huatian Wang presenting his poster 'Angular Velocity Estimation of Image Motion Mimicking the Honeybee Tunnel Centring Behaviour' at the International Joint Conference on Neural Networks (IJCNN), July 2019

UK Neural Computation July 2019

The 2019 UK Neural Computation event was held at the University of Nottingham, United Kingdom, on 2-3 July 2019. The full programme can be viewed here.

This event was attended by ULTRACEPT researchers from the University of Lincoln: Hongxin Wang, Jiannan Zhao, Fang Lei, Hao Luan and Xuelong Sun.

As well as attending presentations at the event, the researchers took part in a tutorial for early-stage PhD students, where they learnt a great deal about carrying out research.

Hongxin states: "I attended the tutorial, communicated with researchers working in relevant fields such as computational neuroscience, and acquired new ideas for further improving the robustness of the STMD models and for simulating feedback mechanisms in the STMD neural pathways."

Jiannan advised that he participated in the meeting discussions, exchanging views on topics of interest in the neural computation field.

Modelling the optimal integration of navigational strategies in the insect brain

Xuelong Sun presented a poster at this event. You can view Xuelong’s poster here.

Sun X, Mangan M, Yue S 

Insects are expert navigators, capable of searching out sparse food resources over large ranges in complex habitats before relocating their often hidden nest sites. These feats are all the more impressive given the limited sensing and processing available to individual animals. Recently, significant advances have been made in identifying the brain areas driving specific navigational behaviours, and their functioning, but an overarching computational model remains elusive. In this study, we present the first biologically constrained computational model that integrates visual homing, visual compass and path integration behaviours. Specifically, we demonstrate the challenges faced when attempting to replicate visual navigation behaviours (visual compass and visual homing) using the known mushroom body (MB) anatomy, and instead propose that the central complex (CX) neuropil may compute the visual compass. We propose that the role of the mushroom body is to modulate the weighting of the path integration and visual guidance systems depending on the current context (e.g. a familiar or unfamiliar visual surrounding). Finally, we demonstrate that optimal integration of directional cues can be achieved using a biologically realistic ring attractor network.
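
As a rough illustration of ring-attractor-style cue integration, the sketch below encodes each directional cue as an activity bump on a ring of heading neurons and reads the summed activity out with a population vector; summing bumps approximates reliability-weighted averaging of the cue directions. This is a simplified stand-in, not the model from the poster, and the bump shape and parameters are assumptions.

    import numpy as np

    def cue_bump(theta, kappa, n=72):
        """Activity bump over a ring of n heading neurons, centred on
        theta; kappa encodes the cue's certainty (sharper = more reliable)."""
        prefs = np.linspace(-np.pi, np.pi, n, endpoint=False)
        return np.exp(kappa * np.cos(prefs - theta))

    def integrate_cues(cues, n=72):
        """Sum the cue bumps and read out the population-vector heading."""
        prefs = np.linspace(-np.pi, np.pi, n, endpoint=False)
        activity = sum(cue_bump(theta, kappa, n) for theta, kappa in cues)
        return np.angle(np.sum(activity * np.exp(1j * prefs)))

    # Path integration points to 0 rad with high certainty; vision points
    # to 1 rad with low certainty. The integrated heading lands near 0.
    print(integrate_cues([(0.0, 4.0), (1.0, 1.0)]))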

Xuelong Sun presenting his poster 'Modelling the optimal integration of navigational strategies in the insect brain' at UK Neural Computation 2019 in Nottingham

ULTRACEPT Researchers Present at 15th International Conference on Artificial Intelligence Applications and Innovations (AIAI) May 2019

The 15th International Conference on Artificial Intelligence Applications and Innovations (AIAI) was held on 24-26 May 2019 in Crete, Greece, at the Knossos Royal Beach Resort. The detailed program can be found here.

The conference was attended by the ULTRACEPT researchers Hongxin Wang and Jiannan Zhao from the University of Lincoln and Xingzao Ma from Lingnan Normal University.

Jiannan Zhao presenting 'An LGMD Based Competitive Collision Avoidance Strategy for UAV' at the 15th International Conference on Artificial Intelligence Applications and Innovations (AIAI), May 2019
AIAI Session 2: (AUV-LE) Autonomous Vehicles-Learning

Jiannan Zhao presented at this conference.

Room B: 10:30-11:45

Jiannan Zhao, Xingzao Ma, Qinbing Fu, Cheng Hu, Shigang Yue

An LGMD Based Competitive Collision Avoidance Strategy for UAV

Abstract. Building a reliable and efficient collision avoidance system for unmanned aerial vehicles (UAVs) is still a challenging problem. This research takes inspiration from locusts, which can fly in dense swarms for hundreds of miles without collision. In the locust's brain, the LGMD-DCMD visual pathway (lobula giant movement detector and descending contra-lateral motion detector) has been identified as a collision perception system guiding fast collision avoidance, which makes it ideal for designing artificial vision systems. However, very few works have investigated its potential in real-world UAV applications. In this paper, we present an LGMD-based competitive collision avoidance method for UAV indoor navigation. Compared to previous works, we divided the UAV's field of view into four subfields, each handled by an LGMD neuron. Four individual competitive LGMDs (C-LGMD) therefore compete to guide the directional collision avoidance of the UAV. With more degrees of freedom than ground robots and vehicles, the UAV can escape from collision along four cardinal directions (e.g. an object approaching from the left side triggers a rightward shift of the UAV). Our proposed method has been validated by both simulations and real-time quadcopter arena experiments.
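
The competitive subfield scheme can be sketched as follows: split each camera frame into four subfields, score each with a crude looming proxy, and escape opposite the winning subfield. The sketch below greatly simplifies the C-LGMD model; the frame-differencing "LGMD" stand-in and the threshold are placeholder assumptions.

    import numpy as np

    # Escape direction opposite each subfield (object on the left -> move right)
    ESCAPE = {"left": "right", "right": "left", "up": "down", "down": "up"}

    def looming_score(frame_prev, frame_curr):
        """Crude LGMD-like stand-in: summed absolute luminance change.
        A fast-approaching object produces rapidly growing change."""
        return np.abs(frame_curr.astype(float) - frame_prev.astype(float)).sum()

    def competitive_avoidance(prev, curr, threshold=500.0):
        """Run one detector per subfield; the strongest above-threshold
        response wins and dictates the escape direction."""
        h, w = curr.shape
        fields = {
            "left":  (slice(None), slice(0, w // 2)),
            "right": (slice(None), slice(w // 2, w)),
            "up":    (slice(0, h // 2), slice(None)),
            "down":  (slice(h // 2, h), slice(None)),
        }
        scores = {name: looming_score(prev[s], curr[s]) for name, s in fields.items()}
        winner = max(scores, key=scores.get)
        return ESCAPE[winner] if scores[winner] > threshold else None

    rng = np.random.default_rng(1)
    prev = rng.integers(0, 255, (64, 64))
    curr = prev.copy()
    curr[:, :32] = 255                        # sudden change in the left subfield
    print(competitive_avoidance(prev, curr))  # -> 'right'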

Jiannan Zhao presenting 'An LGMD Based Competitive Collision Avoidance Strategy for UAV' at the 15th International Conference on Artificial Intelligence Applications and Innovations (AIAI), May 2019
AIAI Session 10: (AG-MV) Agents-Machine Vision

Hongxin Wang presented at this conference.

Room C: 12:00-13:15

Constant Angular Velocity Regulation for Visually Guided Terrain Following

Huatian Wang, Qinbing Fu, Hongxin Wang, Jigen Peng, Shigang Yue

Insects use visual cues to control their flight behaviours. By estimating the angular velocity of the visual stimuli and regulating it to a constant value, honeybees can perform a terrain following task, keeping a certain height above the undulating ground. To mimic this behaviour in a bio-plausible computational structure, this paper presents a new angular velocity decoding model based on the honeybee's behavioural experiments. The model consists of three parts: a texture estimation layer for spatial information extraction, a motion detection layer for temporal information extraction, and a decoding layer that combines information from the previous layers to estimate the angular velocity. Compared to previous methods in this field, the proposed model produces responses largely independent of spatial frequency and contrast in grating experiments. An angular-velocity-based control scheme is proposed to implement the model in a bee simulated with the Unity game engine. The perfect terrain following above patterned ground and successful flight over irregularly textured terrain show the model's potential for terrain following in micro unmanned aerial vehicles.
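
The control idea, holding the ventral image angular velocity at a setpoint by climbing or descending, can be shown with a toy one-dimensional controller. In level flight over flat ground the ventral angular velocity is roughly forward speed divided by height, so regulating it to a constant makes height track the terrain. The gain and setpoint below are illustrative assumptions, not values from the paper.

    def terrain_following_step(height, forward_speed, omega_set=2.0, gain=0.1):
        """One control step of constant-angular-velocity terrain following
        (toy version). If the ventral angular velocity exceeds the setpoint
        the ground is looming too fast, so climb; if below, descend."""
        omega = forward_speed / height        # stand-in for the model's estimate
        height += gain * (omega - omega_set)  # climb when the ground looms fast
        return height

    h = 1.0
    for _ in range(50):
        h = terrain_following_step(h, forward_speed=3.0)
    print(round(h, 2))  # settles near forward_speed / omega_set = 1.5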