All posts by comitchell

Mu Hua Presents ‘Investigating Refractoriness in Collision Perception Neural Model’ at IJCNN 2021

Mu Hua is a postgraduate student at the University of Lincoln working on ULTRACEPT's work package 1.

IJCNN 2021

University of Lincoln researcher Mu Hua attended and presented at the International Joint Conference on Neural Networks 2021 (IJCNN 2021), held from 18th to 22nd July 2021. Although originally scheduled to take place in Shenzhen, China, the conference was moved online due to the ongoing international travel disruption caused by Covid-19.

IJCNN 2021 is the flagship annual conference of the International Neural Network Society (INNS) – the premier organisation for individuals interested in a theoretical and computational understanding of the brain and in applying that knowledge to develop new and more effective forms of machine intelligence. INNS was formed in 1987 by leading scientists in the Artificial Neural Networks (ANN) field. The conference promotes all aspects of neural network theory, analysis and applications.

This year IJCNN received 1183 papers submitted from over 77 different countries. Of these, 59.3% were accepted, and all accepted papers were included in the program as virtual oral presentations. The top ten countries of the submitting authors were (in descending order): China, United States, India, Brazil, Australia, United Kingdom, Germany, Japan, Italy and France. The event was attended by more than 1166 participants and featured special sessions, plenary talks, competitions, tutorials, and workshops.

Representing the University of Lincoln, Mu Hua presented his paper (Mu Hua, Qinbing Fu, Wenting Duan, Shigang Yue, "Investigating Refractoriness in Collision Perception Neural Network", IJCNN 2021) with a poster demonstrating that numerically modelling the refractory period, a common neuronal phenomenon, can be a promising way to enhance the stability of the current LGMD neural network for collision perception.

Figure 1: (a) Refractoriness schematic diagram. The orange curve shows the change in membrane potential; depolarization and repolarization are indicated by dashed lines with arrows. The ARP corresponds to depolarization and part of repolarization, while the RRP is covered by hyper-polarization. (b) The curve of P_t(x, y) − L_t(x, y) when a single stimulus is applied at the 1st frame, which resembles the real membrane potential curve during the RP.
Figure 2: Snapshots of the 389th frame from the original video and the Gaussian-noise-contaminated video. The orange curve represents the LGMD membrane potential with our proposed RP mechanism; the blue curve is the model without RP. While most of the blue curve stays at 1, the orange curve is easily distinguished by its peak at the 401st frame, despite strong fluctuation within the first 40 frames.

Abstract

Currently, collision detection methods based on visual cues are still challenged by several factors including ultra-fast approaching velocity and noisy signals. Taking inspiration from nature, though the computational models of lobula giant movement detectors (LGMDs) in the locust's visual pathways have demonstrated positive impacts on addressing these problems, there remains potential for improvement. In this paper, we propose a novel method mimicking neuronal refractoriness, i.e. the refractory period (RP), and further investigate its functionality and efficacy in the classic LGMD neural network model for collision perception. Compared with previous works, the two phases constituting the RP, namely the absolute refractory period (ARP) and the relative refractory period (RRP), are computationally implemented through a 'link (L) layer' located between the photoreceptor and excitation layers to realise the dynamic characteristic of the RP in the discrete time domain. The L layer, consisting of local time-varying thresholds, represents a mechanism that allows photoreceptors to be activated individually and selectively by comparing the intensity of each photoreceptor to its corresponding local threshold established by its last output. More specifically, while the local threshold can only be augmented by a larger output, it shrinks exponentially over time. Our experimental outcomes show that, to some extent, the investigated mechanism not only enhances the LGMD model in terms of reliability and stability when faced with ultra-fast approaching objects, but also improves its performance against visual stimuli polluted by Gaussian or salt-and-pepper noise. This research demonstrates that the modelling of refractoriness is effective in collision perception neuronal models, and promising for addressing the aforementioned collision detection challenges.
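The L-layer mechanism described in the abstract maps naturally onto a few lines of array code. The following is a minimal NumPy sketch of that general idea, not the authors' implementation: the decay constant, initial threshold, and the exact gating and update rules below are illustrative assumptions rather than the published model's parameters.

```python
import numpy as np

def refractory_gate(photoreceptor_frames, tau=5.0, dt=1.0, init_thresh=0.0):
    """Minimal sketch of a refractory 'link (L) layer' between the photoreceptor
    (P) and excitation layers: each pixel keeps a local time-varying threshold
    L_t(x, y) that is raised by large outputs and shrinks exponentially over time.
    (tau, init_thresh and the gating rule are illustrative assumptions.)"""
    decay = np.exp(-dt / tau)                 # exponential shrinkage per frame
    L = np.full_like(photoreceptor_frames[0], init_thresh, dtype=float)
    gated = []
    for P in photoreceptor_frames:            # P_t(x, y): photoreceptor output
        diff = P - L                          # compare intensity with local threshold
        out = np.where(diff > 0, diff, 0.0)   # only supra-threshold activity passes
        L = np.maximum(L * decay, out)        # threshold grows only with a larger output,
                                              # otherwise it decays exponentially
        gated.append(out)
    return np.stack(gated)

# Two identical flashes, at frame 0 and frame 3, on a 4x4 'retina':
frames = np.zeros((10, 4, 4))
frames[0] += 1.0
frames[3] += 1.0
print(refractory_gate(frames)[:, 0, 0])
# The second flash evokes a weaker response (about 0.33 vs 1.0) because the
# local threshold has not yet decayed, mimicking a refractory period.
```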

This paper can be freely accessed on the University of Lincoln Institutional Repository Eprints.

A Time-Delay Feedback Neural Network for Discriminating Small, Fast-Moving Targets in Complex Dynamic Environments

Hongxin Wang received his PhD degree in computer science from the University of Lincoln, UK, in 2020. Following a secondment under the STEP2DYNA project, Dr Wang carried out a further secondment under the ULTRACEPT project from April 2020 to April 2021 at partner Guangzhou University where he undertook research contributing to work packages 1 and 2. Dr Wang’s ULTRACEPT contributions have involved directing the research into computational modelling of motion vision neural systems for small target motion detection.

University of Lincoln researcher Hongxin Wang recently published a paper titled "A Time-Delay Feedback Neural Network for Discriminating Small, Fast-Moving Targets in Complex Dynamic Environments" in IEEE Transactions on Neural Networks and Learning Systems. This is one of the top-tier journals publishing technical articles on the theory, design, and applications of neural networks and related learning systems, and it has a significant influence on the field of artificial neural networks and learning systems.

Examples of small moving targets
Fig. 1. (a) on the left and (b) on the right. Examples of small moving targets: (a) an unmanned aerial vehicle (UAV) and (b) a bird in the distance, with their surrounding regions enlarged in the red boxes. Both the UAV and the bird appear as dim speckles only a few pixels in size, where most visual features are difficult to discern. In particular, both show extremely low contrast against the complex background.

Monitoring moving objects against complex natural backgrounds is a huge challenge for future robotic vision systems, let alone detecting small targets only one or a few pixels in size, for example, an unmanned aerial vehicle (UAV) or a bird in the distance, as shown in Fig. 1.

Traditional motion detection methods, such as optical flow, background subtraction, and temporal differencing, perform well on large objects that can be imaged at high resolution and present a clear appearance and structure, such as pedestrians, bikes, and vehicles. However, such methods are ineffective for targets as small as a few pixels, because visual features such as texture, color, shape, and orientation are difficult to determine for such small objects and cannot be used for motion detection. Effective solutions for detecting small target motion against cluttered moving backgrounds in natural images are still rare.

Research in the field of visual neuroscience has contributed toward the design of artificial visual systems for small target detection. As a result of millions of years of evolution, insects have developed accurate, efficient, and robust capabilities for detecting small moving targets. The exquisite sensitivity of insects to small target motion comes from a class of specific neurons called small target motion detectors (STMDs). Building a quantitative STMD model is the first step not only towards further understanding the biological visual system but also towards providing robust and economical solutions for small target detection in artificial vision systems.

In this article, we propose an STMD-based model with time-delay feedback (feedback STMD) and demonstrate its critical role in detecting small targets against cluttered backgrounds. We have conducted systematic analysis as well as extensive experiments. The results show that the feedback STMD largely suppresses slow-moving background false-positives, whilst retaining the ability to respond to small targets with higher velocities. The behavior of the developed feedback model is consistent with that of the animal visual systems in which high-velocity objects always receive more attention. Furthermore, it also enables autonomous robots to effectively discriminate potentially threatening fast-moving small targets from complex backgrounds, a feature required, for example, in surveillance.
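As a rough illustration of the time-delay feedback principle described above, the toy sketch below feeds each pixel's own delayed, leakily accumulated response back negatively: a slow object lingers at a location long enough to be cancelled by its own delayed feedback, while a fast object has moved on before the feedback arrives. This is only a caricature under assumed parameters (delay, gain, leak), not the published feedback STMD equations.

```python
import numpy as np

def feedback_stmd_toy(stimulus, delay=3, gain=0.9, leak=0.8):
    """Toy time-delay feedback over a 1-D 'retina' (illustrative only):
    each pixel's responses from `delay` frames ago are leakily accumulated
    and subtracted from the current input at the same location."""
    T, N = stimulus.shape
    y = np.zeros((T, N))
    fb = np.zeros(N)                                    # per-pixel feedback state
    for t in range(T):
        delayed = y[t - delay] if t >= delay else 0.0   # responses `delay` frames ago
        fb = leak * fb + delayed                        # leaky accumulation of delayed output
        y[t] = np.maximum(stimulus[t] - gain * fb, 0)   # subtract feedback, half-wave rectify
    return y

# A slow object (1 pixel per 20 frames) vs a fast object (2 pixels per frame)
# sweeping across 80 pixels for 40 frames.
T, N = 40, 80
slow = np.zeros((T, N))
fast = np.zeros((T, N))
for t in range(T):
    slow[t, t // 20] = 1.0
    fast[t, 2 * t] = 1.0
print(feedback_stmd_toy(slow).sum(), feedback_stmd_toy(fast).sum())
# The summed response to the slow object is several times weaker than the
# response to the fast one, matching the qualitative behaviour described above.
```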

Nikolas Andreakos Presents Paper at the 30th Annual Computational Neuroscience Meeting (CNS*2021)

Nikolas Andreakos is a PhD candidate at the University of Lincoln, who is working on developing computational models of associative memory formation and recognition in the mammalian hippocampus.

Recently Nikolas attended the 30th Annual Computational Neuroscience Meeting (CNS*2021). Due to the current travel restrictions, this year’s conference was moved online from 3rd to 7th of July 2021.

CNS 2021 online conference image

The purpose of the Organization for Computational Neurosciences is to create a scientific and educational forum for students, scientists, other professionals, and the general public to learn about, to share, contribute to, and advance the state of knowledge in computational neuroscience.

Computational neuroscience combines mathematical analyses and computer simulations with experimental neuroscience, to develop a principled understanding of the workings of nervous systems and apply it in a wide range of technologies.

The Organization for Computational Neurosciences promotes meetings and courses in computational neuroscience and organizes the Annual CNS Meeting which serves as a forum for young scientists to present their work and to interact with senior leaders in the field.

Poster Presentation

Nikolas presented his research 'Modelling the effects of perforant path in the recall performance of a CA1 microcircuit with excitatory and inhibitory neurons'.

CNS 2021 online conference poster
Nikolas Andreakos CNS 2021 poster

Abstract

From recollecting childhood memories to recalling whether we turned off the oven before leaving the house, memory defines who we are. Losing it can be very harmful to our survival. Recently we quantitatively investigated the biophysical mechanisms leading to memory recall improvement in a computational CA1 microcircuit model of the hippocampus [1]. In the present study, we investigated the synergistic effects of the EC excitatory input (sensory input) and the CA3 excitatory input (contextual information) on the recall performance of the CA1 microcircuit. Our results showed that when the EC input was exactly the same as the CA3 input, the recall performance of our model was strengthened. When the two inputs were dissimilar (degree of similarity: 40%–0%), the recall performance was reduced. These results were positively correlated with how many "active cells" represented a memory pattern. When the number of active cells increased and the degree of similarity between the two inputs decreased, the recall performance of the model was reduced. The latter finding confirms our previous results, where the number of cells coding a piece of information plays a significant role in the recall performance of our model.

References
1. Andreakos, N., Yue, S. & Cutsuridis, V. Quantitative investigation of memory recall performance of a computational microcircuit model of the hippocampus. Brain Inf 8, 9 (2021). https://doi.org/10.1186/s40708-021-00131-7

Nikolas Andreakos CNS 2021 poster presentation

Dr Shyamala Doraisamy Participates in Euraxess Events

Dr Shyamala Doraisamy, Universiti Putra Malaysia's (UPM) lead for the STEP2DYNA and ULTRACEPT consortia, participated in two Euraxess ASEAN events in 2020 and 2021.

Dr Doraisamy was invited as a guest speaker at the “Best Practices” breakout sessions of the “MSCA – Staff Exchange (MSCA-SE) – International collaboration with European partners” Webinar on 15 June 2021. The European Commission’s Marie Skłodowska-Curie Actions (MSCA) offers funding for short-term international and inter-sectoral exchanges of staff members involved in research and innovation activities of participating organisations.

The webinar introduced the MSCA Actions with a specific focus on the 2021 call 'Staff Exchanges' (MSCA-SE), which promotes international and cross-sector collaboration through exchanging research and innovation staff. How participants could enhance their organisation's innovation capacity through interdisciplinary and intersectoral collaboration with European and global partners was discussed during the webinar.

An introduction was given by the following speakers:

Ms Marlène Bartes, Policy Officer, Directorate General for Education, Culture, Youth and Sport (DG EAC), European Commission, Brussels, Belgium

Mr Brito Ferreira, Head of Sector, European Research Executive Agency, European Commission, Brussels, Belgium

The information provided included:

  • What are the MSCA Staff Exchanges?
  • Who can apply for MSCA Staff Exchanges?
  • What is funded?
  • How does it work?
  • When is the next call for proposals?

Following the introduction, breakout sessions were led by ASEAN and European participants in ongoing MSCA-funded research consortia. The breakout sessions provided a platform for exchange on how to create an MSCA-SE consortium.

Dr Shyamala Doraisamy, Universiti Putra Malaysia, a partner in the MSCA Research & Innovation Staff Exchange (RISE) project 'Ultra-layered perception with brain-inspired information processing for vehicle collision avoidance (ULTRACEPT)', was invited to speak at this session.

Dr Shyamala Doraisamy Participates in Euraxess Events
Dr Shyamala Doraisamy presenting at the Euraxess Webinar

Dr Doraisamy also participated in the European Research Day (ERD) 2020 (Virtual Edition) organized by Euraxess ASEAN. ERD Malaysia 2020 was held from 21 to 25 September 2020 to promote research collaboration with Europe and career advancement opportunities for researchers in Malaysia and ASEAN.

Dr Shyamala Doraisamy Participates in Euraxess Events

Following the opening remarks by Francesco Floris, Head of the Trade and Economic Section of the European Union Delegation to Malaysia, Dr Shyamala Doraisamy was invited as a panelist for an international collaboration panel discussion. This dialogue session on 'How to build and maintain an international research network during a pandemic' was held in collaboration with the Young Scientist Network (YSN) of Malaysia with 211 participants on 21 September 2020.

Dr Shyamala Doraisamy Participates in Euraxess Events


UBA Hosts Sandpit Session: Optic flow analysis: From the circuit to the behaviour and back

To aid and support the continued collaboration and knowledge exchange of the ULTRACEPT researchers, the consortium hosts online quarterly ‘Sandpit Sessions’. The aim of these sessions is to provide researchers an opportunity to share their work in an informal forum where they can raise and discuss issues and challenges in order to gain support and feedback from the group.

The project group’s fourth sandpit session was hosted online by ULTRACEPT partner the University of Buenos Aires (UBA-CONICET) on 23rd July 2021. The session was facilitated by Yair Barnatan, PhD student, and Dr Julieta Sztarker, Independent Researcher (chair). The theme of the session was ‘Optic flow analysis: From the circuit to the behaviour and back’.

Presentation by Dr Julieta Sztarker
Presentation by Dr Julieta Sztarker
Presentation by Dr Julieta Sztarker
Presentation by Dr Julieta Sztarker

Sandpit Session 4: Optic flow analysis: From the circuit to the behaviour and back

  • Date: Friday, 23rd July 2021
  • Time: UK 11:00; China 18:00; Germany 12:00; Argentina 07:00; Malaysia 18:00; Japan 19:00.
  • Facilitators: Yair Barnatan, PhD student, UBA-CONICET; Julieta Sztarker, Independent Researcher, UBA-CONICET (chair)
  • Location: MS Teams
Sandpit Schedule (UK time)
  • 11:00-11:05 Arrival and welcome (Julieta Sztarker)
  • 11:05-11:35 Optic flow analysis: From the circuit to the behaviour and back (Julieta Sztarker)
  • 11:35-12:00 Out of the oven: recent results using simultaneous recordings of locomotive and eye saccades in crabs (Yair Barnatan)
  • 12:00-12:25 Group discussion about the possibility of modelling the underlying circuit based on behavioural data (Julieta Sztarker)
  • 12:25-12:30 Final comments & volunteer for a facilitator for the next session (Julieta Sztarker)

Optic flow analysis: From the circuit to the behaviour and back (Julieta Sztarker)

In this talk I will summarise what we know about the optomotor responses performed by flies and crabs in different conditions (monocular, binocular stimulation) and using different directions of stimulation. I will present the underlying neuronal circuit that has been proposed for flies based on a long line of electrophysiological recordings of directional neurons from the lobula plate and other cells, and on behavioural studies. I will present results on the locomotive optomotor responses of crabs, which respond preferentially to stimulation in a unique direction under monocular conditions but not under binocular ones, and show what we know so far about the type of directional cells found in crabs.

Out of the oven: recent results using simultaneous recordings of locomotive and eye saccades in crabs (Yair Barnatan)

I will present our latest data measuring simultaneously the locomotive and eye saccades of crabs in response to a panoramic stimulus moving in different directions and conditions (monocular, binocular). We found an apparent correlation between the variables obtained in some conditions of stimulation but not in others.

We are planning our next sandpit session in 2 months’ time (September).
Presentation by Yair Barnatan

ULTRACEPT Researchers Present at IEEE ICRA 2021

The 2021 International Conference on Robotics and Automation (IEEE ICRA 2021) was held in Xi’an, China from 31st May to 4th June 2021. As one of the premier conferences in the field of robotics and automation, the event gathered thousands of excellent researchers from all over the world. Due to the pandemic, the conference was held in a hybrid format, combining a physical on-site meeting with a virtual one. Four ULTRACEPT researchers attended this event, three in person and one online.

Proactive Action Visual Residual Reinforcement Learning for Contact-Rich Tasks Using a Torque-Controlled Robot

Yunlei Shi: Proactive Action Visual Residual Reinforcement Learning for Contact-Rich Tasks Using a Torque-Controlled Robot

Agile Robots researcher Yunlei Shi attended ICRA 2021 online and presented his paper ‘Proactive Action Visual Residual Reinforcement Learning for Contact-Rich Tasks Using a Torque-Controlled Robot’.

Yunlei Shi is a full-time Ph.D. student at the Universität Hamburg and works at project partner Agile Robots, contributing to ULTRACEPT’s Work Package 4. In 2020 he visited Tsinghua University as part of the STEP2DYNA project.

Yunlei Shi presenting online at ICRA 2021

Yunlei presented his conference paper:

Yunlei Shi, Zhaopeng Chen, Hongxu Liu, Sebastian Riedel, Chunhui Gao, Qian Feng, Jun Deng, Jianwei Zhang, “Proactive Action Visual Residual Reinforcement Learning for Contact-Rich Tasks Using a Torque-Controlled Robot”, (ICRA) 2021, Xi’an, China.

Abstract

Contact-rich manipulation tasks are commonly found in modern manufacturing settings. However, manually designing a robot controller is considered hard for traditional control methods as the controller requires an effective combination of modalities and vastly different characteristics. In this paper, we first consider incorporating operational space visual and haptic information into a reinforcement learning (RL) method to solve the target uncertainty problems in unstructured environments. Moreover, we propose a novel idea of introducing a proactive action to solve a partially observable Markov decision process (POMDP) problem. With these two ideas, our method can either adapt to reasonable variations in unstructured environments or improve the sample efficiency of policy learning. We evaluated our method on a task that involved inserting a random-access memory (RAM) using a torque-controlled robot and tested the success rates of different baselines used in the traditional methods. We proved that our method is robust and can tolerate environmental variations.
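For readers unfamiliar with residual reinforcement learning, the sketch below shows the generic action composition such methods build on: a learned policy outputs a small corrective action that is added to a hand-designed base controller's command. The function names, the zero-output placeholder policy and the scaling factor are illustrative assumptions; the paper's proactive action and its specific visual-haptic policy are not reproduced here.

```python
import numpy as np

def base_controller(target_pose, current_pose, kp=1.0):
    """Hand-designed proportional controller toward a nominal target
    (a stand-in for a torque/impedance controller)."""
    return kp * (np.asarray(target_pose) - np.asarray(current_pose))

def residual_policy(observation):
    """Placeholder for a learned policy mapping visual/haptic observations to a
    small corrective (residual) action; in practice this would be a trained
    network, here it simply returns zeros."""
    return np.zeros(6)

def act(target_pose, current_pose, observation, residual_scale=0.1):
    """Generic residual-RL action composition: the learned residual corrects
    the base controller's command (a sketch, not the authors' code)."""
    base = base_controller(target_pose, current_pose)
    return base + residual_scale * residual_policy(observation)

# Example call with dummy 6-DoF poses and an arbitrary observation vector.
action = act(target_pose=np.ones(6), current_pose=np.zeros(6), observation=np.zeros(16))
print(action)
```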

Representation of policies and controller scheme. The blue region is the real-time controller, and the wheat region is the non-real-time trained policy.

More details about this paper can be viewed in this video on the Universität Hamburg’s Technical Aspects of Multimodal Systems (TAMS) YouTube channel.

Yunlei was very happy to attend this fantastic conference with support from the ULTRACEPT project.

A Versatile Vision-Pheromone-Communication Platform for Swarm Robotics

Three researchers from the University of Lincoln; Tian Liu, Xuelong Sun, and Qinbing Fu, attended ICRA 2021 in person to present their co-authored paper, ‘A Versatile Vision-Pheromone-Communication Platform for Swarm Robotics’. 

ULTRACEPT researchers Tian Liu, Xuelong Sun and Qinbing Fu attending ICRA 2021

The three of us were very happy to attend this fantastic conference in person with the support of the ULTRACEPT project.

Our co-authored paper, which presents the vision-pheromone-communication platform we developed, was published in the proceedings of the conference. Tian Liu delivered the presentation outlining the platform, and it attracted the attention of attendees, who asked interesting questions. This event provided us with a great opportunity to raise the profile of our platform for future swarm robotics and social insect studies.

Tian Liu presenting at ICRA 2021

A Versatile Vision-Pheromone-Communication Platform for Swarm Robotics, Tian Liu, Xuelong Sun, Cheng Hu, Qinbing Fu, and Shigang Yue, University of Lincoln

Keywords: Biologically-Inspired Robots, Multi-Robot Systems, Swarm Robotics

Abstract: This paper describes a versatile platform for swarm robotics research. It integrates multiple pheromone communication with a dynamic visual scene along with real-time data transmission and localization of multiple-robots. The platform has been built for inquiries into social insect behavior and bio-robotics. By introducing a new research scheme to coordinate olfactory and visual cues, it not only complements current swarm robotics platforms which focus only on pheromone communications by adding visual interaction but also may fill an important gap in closing the loop from bio-robotics to neuroscience. We have built a controllable dynamic visual environment based on our previously developed ColCOSPhi (a multi-pheromones platform) by enclosing the arena with LED panels and interacting with the micro mobile robots with a visual sensor. In addition, a wireless communication system has been developed to allow the transmission of real-time bi-directional data between multiple micro robot agents and a PC host. A case study combining concepts from the internet of vehicles (IoV) and insect-vision inspired model has been undertaken to verify the applicability of the presented platform and to investigate how complex scenarios can be facilitated by making use of this platform.

We gathered many interesting ideas and much inspiration from colleagues in the robotics field, not only from the excellent talks but also from the high-quality robot exhibitions by well-known companies in the industry.

Conference presentations at ICRA 2021
Conference presentations attended by the researchers at ICRA 2021
Demonstration at the ICRA 2021 conference

On the last day of the conference, we attended a wonderful tour of the Shaanxi History Museum and the Terracotta Warriors, from which we learned a lot about the impressive history and culture of the Qin dynasty. It also made us reflect on the important role that science and technology play in assisting archaeological excavation and the protection of cultural relics.

Thanks to the support of the ULTRACEPT project, we really enjoyed the whole event, which brought us not only new knowledge about robotics and history but also enlightening inspiration that will potentially motivate our future research. In addition, our group’s research has been publicised through this top international conference.

What we can learn from insects: Unveiling insect navigation mechanism

To aid and support the continued collaboration and knowledge exchange of the ULTRACEPT researchers, the consortium hosts online quarterly ‘Sandpit Sessions’. The aim of these sessions is to provide researchers an opportunity to share their work in an informal forum where they can raise and discuss issues and challenges in order to gain support and feedback from the group.

Researchers Xuelong Sun, a PhD student at the University of Lincoln, and Dr Qinbing Fu, a postdoc at Guangzhou University, recently hosted an online Sandpit Session on 14th May 2021. The theme of the session was ‘What we can learn from insects: Unveiling insect navigation mechanism’.

What can we learn from insect: Unveil insect navigation mechanism

Sandpit Session 3: What we can learn from insects: Unveiling insect navigation mechanism

  • Date: Friday, 14th May 2021
  • Time: UK 10:00; China 17:00; Germany 11:00; Argentina 06:00; Malaysia 17:00; Japan 18:00.
  • Facilitators: Xuelong Sun, PhD student, University of Lincoln, Qinbing Fu, Postdoc, Guangzhou University (chair)
  • Location: MS Teams
Sandpit Schedule (UK time)
  • 10:00-10:05 Arrival and welcome (Qinbing Fu and Xuelong Sun)
  • 10:05-10:35 Insect navigation (Xuelong Sun)
  • 10:35-11:10 Group discussion about the session topic (facilitated by Qinbing Fu and Xuelong Sun)
  • 11:10-11:25 Open forum discussion (facilitated by Qinbing Fu and Xuelong Sun)
  • 11:25-11:30 Final comments & volunteer for a facilitator for the next session (Xuelong Sun)

Insect navigation (Xuelong Sun)

Many insects are highly capable navigators, with abilities that rival those of mammals and other vertebrates. I will give a review of insect navigation from the following three aspects: 1) the rich array of insect navigation behaviours, 2) the known brain regions and neuropils related to navigation tasks, and 3) computational models aiming to unravel the neural mechanisms of insect navigation. Then, from the computational modelling point of view, I will report our work filling the current gaps in understanding insect navigation, especially visual navigation and optimal cue integration. This will demonstrate the potentially useful role that computational models play in understanding biological systems, which closes this session and opens the topic to be discussed in the next session.

Group discussion about the session topic: What is the role of computational models and biorobotics in understanding biological systems? A group discussion where attendees can raise questions and discuss the topic of research and potential cooperation that was presented.

Open forum discussion: An opportunity for attendees to ask the group for advice regarding any challenges they are facing with their own research.

We are planning our next sandpit session for July 2021.

What can we learn from insect: Unveil insect navigation mechanism

You can learn more about Xuelong’s research on his post about his 12 month ULTRACEPT secondment to Guangzhou University.

Fang Lei completes 12 month secondment at Guangzhou University, China

Fang Lei enrolled as a PhD scholar at the University of Lincoln in 2019. In early 2020 she visited Guangzhou University as part of the STEP2DYNA project, funded by the European Union’s Horizon 2020 Research and Innovation Programme under the Marie Skłodowska-Curie grant agreement. During this secondment Fang Lei worked on developing bio-inspired visual systems for collision detection in dim light environments. More recently, Fang continued this work during her 12 month secondment at Guangzhou University under ULTRACEPT, from May 2020 to May 2021.

During the secondment at Guangzhou University, I worked on developing bio-inspired visual systems for collision detection in dim light environments. For the autonomous navigation of vehicles or robots, detecting moving objects in extremely low-light conditions is a challenging task due to very low signal-to-noise ratios (SNRs). However, nocturnal insects possess remarkable visual abilities, perceiving motion cues and detecting moving objects in very dim light environments. There are many studies on the night vision of insects’ visual systems, which provide a lot of inspiration for enhancing motion cues and modelling an artificial visual system to detect motion such as looming objects. Fig. 1 shows an example image of looming motion in a dim light environment, taken from the low-light video motion (LLVM) dataset obtained with the experimental devices shown in Fig. 2.

Fig. 1 An example image of looming motion
Fig. 2 Experimental devices

To develop more ideas for my modelling work, I discussed it with other colleagues and Prof. Peng (see Fig. 3) and received very useful suggestions. We mainly discussed the biological modelling of the direction selectivity of LGMD1. We also organized a group seminar every week to discuss problems encountered in our research projects, and I gained a lot of valuable experience in bio-inspired modelling by sharing ideas.

Fang with Prof. Peng and colleagues at Guangzhou

My research work on collision detection in dim light environments includes modelling the direction selectivity of the LGMD1 neuron and enhancing motion cues. I have developed a new LGMD1 model which is effective in distinguishing looming motion from translating motion. I have published one conference paper and attended the online virtual conference (IJCNN 2021, see Fig. 4), and I have also submitted a journal paper to IEEE Transactions on Neural Networks and Learning Systems (TNNLS), which is under review. Additionally, I have finished the modelling work on motion cue enhancement and proposed a new model. Fig. 5 shows the enhancement results for dark image sequences captured during testing experiments.

Fig. 4 Online virtual conference of IJCNN 2021
Fig. 5 Testing captured dark image sequences and the experimental results

During this 12-month secondment, I have gained a better knowledge of bio-inspired modelling and plenty of practice in connecting theory with application. I established good friendships with my colleagues through frequent communication in the weekly group seminar, which provides a basis for future cooperation. The secondment was a very precious experience for me. Many thanks to the ULTRACEPT project for supporting my research work and providing me with the opportunity to work together with my colleagues.

Fang Lei at GZHU

Hongxin Wang Completes 12 Month Secondment at Guangzhou University

Hongxin Wang received his PhD in computer science from the University of Lincoln in 2020. Following a secondment under the STEP2DYNA project, Dr Wang carried out a further secondment under the ULTRACEPT project from April 2020 to April 2021 at partner Guangzhou University. Here, he undertook research contributing to work packages 1 and 2. Dr Wang’s ULTRACEPT contributions have involved directing the research into computational modelling of motion vision neural systems for small target motion detection. 

University of Lincoln’s Experienced Researcher Dr Hongxin Wang recently completed a 12 month secondment at ULTRACEPT project partner Guangzhou University in China. The project is funded by the European Union’s Horizon 2020 Research and Innovation Program under the Marie Skłodowska-Curie grant agreement. Dr Wang visited Guangzhou from April 2020 to April 2021 and contributed to Work Packages 1 and 2.

Dr Wang reflects on what he has achieved during secondment

Monitoring moving objects against complex natural backgrounds is a huge challenge for future robotic vision systems, and even more so when attempting to detect small targets only a few pixels in size, for example, an unmanned aerial vehicle (UAV) or a bird in the distance, as shown in Fig. 1. Surprisingly, insects are quite adept at searching for mates and tracking prey, which appear as small dim speckles in the visual field. The exquisite sensitivity of insects to small target motion comes from a class of specific neurons called small target motion detectors (STMDs). Building a quantitative STMD model is the first step not only towards further understanding the biological visual system but also towards providing robust and economical solutions for small target detection in artificial vision systems.

Fig. 1. Examples of small moving targets. (a) An unmanned aerial vehicle (UAV) and (b) a bird in the distance, with their surrounding regions enlarged in the red boxes. Both the UAV and the bird appear as dim speckles only a few pixels in size, where most visual features are difficult to discern. In particular, both show extremely low contrast against the complex background.

During this twelve-month secondment, I continued my previous work on modeling insects’ visual systems for small target detection and have made great progress. Specifically, we proposed an STMD-based model with time-delay feedback to achieve superior detection performance for fast-moving small targets, whilst significantly suppressing background false positives, which display lower velocities. This work has been submitted to IEEE Transactions on Neural Networks and Learning Systems and is currently under review. In addition, we developed an attention-prediction guided visual system to overcome the heavy dependency of existing models on the target’s contrast against the background, as illustrated in Fig. 2. The paper presenting this work has been completed and will be submitted to IEEE Transactions on Cybernetics.

Fig. 2. Overall flowchart of the proposed attention and prediction guided visual system. It consists of a preprocessing module (left), an attention module (top), a STMD-based neural network (right), a prediction module (bottom), and a memorizer (middle).

During my 12 month secondment at Guangzhou University, I obtained inspiration and mathematical theory support from Professor Jigen Peng to design the STMD-based visual systems. We organized a seminar every week to discuss the latest biological findings, explore effective neural modeling methods, and develop specialised mathematical theory for bioinspired motion detection. Significant progress was made with the help of Professor Jigen Peng.

Hongxin Wang on secondment at Guangzhou University

The secondment has also provided me with an opportunity to improve my mathematical ability with support from Professor Peng. Strong mathematical ability helps me better describe insects’ visual systems and build robust neural models for small target motion detection. In addition, I established a deep friendship with Professor Peng and my colleagues at Guangzhou University, which provides a basis for future research collaborations. Lastly, I introduced our research to colleagues during our discussions, which may attract their attention to our research field and ultimately boost the development of neural system modelling.

The secondment has been an excellent experience for me and provided me with the opportunity to collaborate with my project colleagues. Thank you to the ULTRACEPT project for its support, which has benefited me a lot.

ULTRACEPT Annual Board Meeting March 2021

The ULTRACEPT annual board meeting was hosted by ULTRACEPT partner Guangzhou University (GZHU). Due to the travel restrictions caused by COVID-19, the workshop was held in person by the research group at GZHU and as an online event using MS Teams on Thursday 25th March 2021. The meeting was combined with the ULTRACEPT Workshop 3 event. Attendees included students, researchers, academic staff, partner leads, and the project manager. The EU Project Officer Irina Tiron joined the group for the board meeting component of the event.

ULTRACEPT annual board meeting

The mid-term meeting provided partners with an opportunity to engage in a fruitful and constructive dialogue between the consortium and the Research Executive Agency. The impact of COVID-19 on the project was also discussed as well as strategies to overcome the impact of the ongoing travel restrictions on the secondments. Thus far, the consortium has been able to progress the work packages by working remotely during lockdowns and also keeping in touch and collaborating via online platforms.

ULTRACEPT workshop 3
GZHU researchers attending the ULTRACEPT meeting