
A Time-Delay Feedback Neural Network for Discriminating Small, Fast-Moving Targets in Complex Dynamic Environments

Hongxin Wang received his PhD degree in computer science from the University of Lincoln, UK, in 2020. Following a secondment under the STEP2DYNA project, Dr Wang carried out a further secondment under the ULTRACEPT project from April 2020 to April 2021 at partner Guangzhou University where he undertook research contributing to work packages 1 and 2. Dr Wang’s ULTRACEPT contributions have involved directing the research into computational modelling of motion vision neural systems for small target motion detection.

University of Lincoln researcher Hongxin Wang recently published a paper titled “A Time-Delay Feedback Neural Network for Discriminating Small, Fast-Moving Targets in Complex Dynamic Environments” in IEEE Transactions on Neural Networks and Learning Systems, one of the top-tier journals publishing technical articles on the theory, design, and applications of neural networks and related learning systems. The journal is highly influential in the field of artificial neural networks and learning systems.

Fig. 1. Examples of small moving targets: (a) an unmanned aerial vehicle (UAV), left, and (b) a bird in the distance, right, with their surrounding regions enlarged in the red boxes. Both the UAV and the bird appear as dim speckles only a few pixels in size, in which most visual features are difficult to discern. In particular, both show extremely low contrast against the complex background.

Monitoring moving objects against complex natural backgrounds is a huge challenge for future robotic vision systems, and even more so when detecting small targets only one or a few pixels in size, for example, an unmanned aerial vehicle (UAV) or a bird in the distance, as shown in Fig. 1.

Traditional motion detection methods, such as optical flow, background subtraction, and temporal differencing, perform well on large objects that are clearly resolved and present a distinct appearance and structure, such as pedestrians, bikes, and vehicles. However, such methods are ineffective against targets as small as a few pixels: visual features such as texture, color, shape, and orientation are difficult to determine for such small objects and cannot be used for motion detection. Effective solutions for detecting small target motion against cluttered moving backgrounds in natural images remain rare.

Research in the field of visual neuroscience has contributed toward the design of artificial visual systems for small target detection. As a result of millions of years of evolution, insects have developed accurate, efficient, and robust capabilities for detecting small moving targets. The exquisite sensitivity of insects to small target motion comes from a class of specific neurons called small target motion detectors (STMDs). Building a quantitative STMD model is the first step not only towards further understanding the biological visual system but also towards providing robust and economical small target detection solutions for artificial vision systems.

In this article, we propose an STMD-based model with time-delay feedback (feedback STMD) and demonstrate its critical role in detecting small targets against cluttered backgrounds. We have conducted systematic analysis as well as extensive experiments. The results show that the feedback STMD largely suppresses slow-moving background false positives whilst retaining the ability to respond to small targets with higher velocities. The behavior of the developed feedback model is consistent with that of animal visual systems, in which high-velocity objects always receive more attention. Furthermore, it also enables autonomous robots to effectively discriminate potentially threatening fast-moving small targets from complex backgrounds, a capability required, for example, in surveillance.
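
The core mechanism lends itself to a compact illustration: the model's own response, delayed by a few frames, is fed back and subtracted from the current input, so slowly changing background features cancel themselves while fast-moving small targets escape the cancellation. The sketch below is a minimal, hypothetical rendering of this time-delay feedback idea; the delay and gain values are illustrative assumptions, not the parameters of the published model.

```python
import numpy as np

def feedback_stmd(stmd_frames, delay=5, gain=0.7):
    """Toy time-delay feedback loop around an STMD-like detector.

    stmd_frames: sequence of 2-D arrays, the per-frame output of an
    upstream small-target motion detector (assumed given).
    delay, gain: hypothetical feedback parameters.
    """
    history = []                 # past feedback-corrected responses
    responses = []
    for frame in stmd_frames:
        if len(history) >= delay:
            delayed = history[-delay]      # response from `delay` frames ago
        else:
            delayed = np.zeros_like(frame)
        # A slow-moving background feature changes little over the delay,
        # so subtracting the delayed response largely cancels it; a fast
        # small target has already moved on and escapes the cancellation.
        r = np.maximum(frame - gain * delayed, 0.0)
        history.append(r)
        responses.append(r)
    return responses
```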

Nikolas Andreakos Presents Paper at the 30th Annual Computational Neuroscience Meeting (CNS*2021)

Nikolas Andreakos is a PhD candidate at the University of Lincoln, who is working on developing computational models of associative memory formation and recognition in the mammalian hippocampus.

Nikolas recently attended the 30th Annual Computational Neuroscience Meeting (CNS*2021). Due to travel restrictions, this year’s conference was held online from 3rd to 7th July 2021.

CNS 2021 online conference image

The purpose of the Organization for Computational Neurosciences is to create a scientific and educational forum for students, scientists, other professionals, and the general public to learn about, share, contribute to, and advance the state of knowledge in computational neuroscience.

Computational neuroscience combines mathematical analyses and computer simulations with experimental neuroscience, to develop a principled understanding of the workings of nervous systems and apply it in a wide range of technologies.

The Organization for Computational Neurosciences promotes meetings and courses in computational neuroscience and organizes the Annual CNS Meeting which serves as a forum for young scientists to present their work and to interact with senior leaders in the field.

Poster Presentation

Nikolas presented his research, ‘Modelling the effects of perforant path in the recall performance of a CA1 microcircuit with excitatory and inhibitory neurons’.

Nikolas Andreakos’ CNS 2021 poster

Abstract

From recollecting childhood memories to recalling whether we turned off the oven before leaving the house, memory defines who we are, and losing it can be very harmful to our survival. Recently, we quantitatively investigated the biophysical mechanisms leading to memory recall improvement in a computational CA1 microcircuit model of the hippocampus [1]. In the present study, we investigated the synergistic effects of the EC excitatory input (sensory input) and the CA3 excitatory input (contextual information) on the recall performance of the CA1 microcircuit. Our results showed that when the EC input was exactly the same as the CA3 input, the recall performance of our model was strengthened. When the two inputs were dissimilar (degree of similarity: 40%–0%), recall performance was reduced. These results were positively correlated with how many “active cells” represented a memory pattern: when the number of active cells increased and the degree of similarity between the two inputs decreased, the recall performance of the model was reduced. The latter finding confirms our previous results, where the number of cells coding a piece of information played a significant role in the recall performance of our model.
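
The “degree of similarity” between the EC and CA3 inputs can be pictured as the overlap between two binary patterns of active cells. A minimal sketch of one such overlap measure is given below; this is our own illustration of the idea, not necessarily the exact metric used in the study.

```python
import numpy as np

def degree_similarity(p, q):
    """Jaccard overlap of two binary activity patterns (illustrative)."""
    p, q = np.asarray(p, dtype=bool), np.asarray(q, dtype=bool)
    union = np.logical_or(p, q).sum()
    return np.logical_and(p, q).sum() / union if union else 1.0

# Identical EC and CA3 patterns give 1.0 (recall strengthened in the model);
# disjoint patterns give 0.0 (recall reduced).
ec  = np.array([1, 1, 0, 0, 1, 0, 0, 0])
ca3 = np.array([1, 1, 0, 0, 1, 0, 0, 0])
print(degree_similarity(ec, ca3))  # 1.0
```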

References
1. Andreakos, N., Yue, S. & Cutsuridis, V. Quantitative investigation of memory recall performance of a computational microcircuit model of the hippocampus. Brain Inf 8, 9 (2021). https://doi.org/10.1186/s40708-021-00131-7

Nikolas Andreakos’ CNS 2021 poster presentation

Dr Shyamala Doraisamy Participates in Euraxess Events

Dr Shyamala Doraisamy, Universiti Putra Malaysia’s (UPM) lead for the STEP2DYNA and ULTRACEPT consortia, participated in two Euraxess ASEAN events in 2020 and 2021.

Dr Doraisamy was invited as a guest speaker at the “Best Practices” breakout sessions of the “MSCA – Staff Exchange (MSCA-SE) – International collaboration with European partners” Webinar on 15 June 2021. The European Commission’s Marie Skłodowska-Curie Actions (MSCA) offers funding for short-term international and inter-sectoral exchanges of staff members involved in research and innovation activities of participating organisations.

The webinar introduced the MSCA with a specific focus on the 2021 “Staff Exchanges” (MSCA-SE) call, which promotes international and cross-sector collaboration through exchanges of research and innovation staff. How participants could enhance their organisation’s innovation capacity through interdisciplinary and intersectoral collaboration with European and global partners was discussed during the webinar.

An introduction was given by the following speakers:

Ms Marlène Bartes, Policy Officer, Directorate General for Education, Culture, Youth and Sport (DG EAC), European Commission, Brussels, Belgium

Mr Brito Ferreira, Head of Sector, European Research Executive Agency, European Commission, Brussels, Belgium

The introduction covered:

  • What are the MSCA Staff Exchanges?
  • Who can apply for MSCA Staff Exchanges?
  • What is funded?
  • How does it work?
  • When is the next call for proposals?

Following the introduction, breakout sessions were led by ASEAN and European participants in ongoing MSCA-funded research consortia. The breakout sessions provided a platform for exchange on how to create an MSCA-SE consortium.

Dr Shyamala Doraisamy, Universiti Putra Malaysia, and partner in MSCA Research & Innovation Staff Exchange (RISE) project ‘Ultra-layered perception with brain-inspired information processing for vehicle collision avoidance (ULTRACEPT)’ was invited to speak at this session.

Dr Shyamala Doraisamy presenting at the Euraxess webinar

Dr Doraisamy also participated in the European Research Day (ERD) 2020 (Virtual Edition) organized by Euraxess ASEAN. ERD Malaysia 2020 was held from 21 to 25 September 2020 to promote research collaboration with Europe and career advancement opportunities for researchers in Malaysia and ASEAN.


Following opening remarks by Francesco Floris, Head of the Trade and Economic Section of the European Union Delegation to Malaysia, Dr Shyamala Doraisamy was invited as a panelist for an international collaboration panel discussion. This dialogue session, on ‘How to build and maintain an international research network during a pandemic’, was held in collaboration with the Young Scientist Network (YSN) of Malaysia on 21 September 2020, with 211 participants.


UBA Hosts Sandpit Session: Optic flow analysis: From the circuit to the behaviour and back

To aid and support the continued collaboration and knowledge exchange of the ULTRACEPT researchers, the consortium hosts online quarterly ‘Sandpit Sessions’. The aim of these sessions is to provide researchers an opportunity to share their work in an informal forum where they can raise and discuss issues and challenges in order to gain support and feedback from the group.

The project group’s fourth sandpit session was hosted online by ULTRACEPT partner the University of Buenos Aires (UBA-CONICET) on 23rd July 2021. The session was facilitated by PhD student Yair Barnatan and chaired by Dr Julieta Sztarker, independent researcher. The theme of the session was ‘Optic flow analysis: From the circuit to the behaviour and back’.

Presentation by Dr Julieta Sztarker

Sandpit Session 4: Optic flow analysis: From the circuit to the behaviour and back

  • Date: Friday, 23rd July 2021
  • Time: UK 11:00; China 18:00; Germany 12:00; Argentina 07:00; Malaysia 18:00; Japan 19:00.
  • Facilitators: Yair Barnatan, PhD student, UBA-CONICET; Julieta Sztarker, independent researcher, UBA-CONICET (chair)
  • Location: MS Teams
Sandpit Schedule

UK Time | Item | Presenter/s
11:00-11:05 | Arrival and welcome | Julieta Sztarker
11:05-11:35 | Optic flow analysis: From the circuit to the behaviour and back | Julieta Sztarker
11:35-12:00 | Out of the oven: recent results using simultaneous recordings of locomotive and eye saccades in crabs | Yair Barnatan
12:00-12:25 | Group discussion about the possibility of modelling the underlying circuit based on behavioural data | Julieta Sztarker
12:25-12:30 | Final comments & volunteer for a facilitator for the next session | Julieta Sztarker

‘Optic flow analysis: From the circuit to the behaviour and back’ (Julieta Sztarker): In this talk I will summarise what we know about the optomotor responses performed by flies and crabs under different conditions (monocular and binocular stimulation) and with different directions of stimulation. I will present the underlying neuronal circuit that has been proposed for flies, based on a long line of electrophysiological recordings of directional neurons from the lobula plate and other cells, and on behavioural studies. I will also present results on the locomotive optomotor responses of crabs, which respond preferentially to stimulation in a single direction under monocular conditions but not under binocular ones, and show what we know so far about the types of directional cells found in crabs.

‘Out of the oven: recent results using simultaneous recordings of locomotive and eye saccades in crabs’ (Yair Barnatan): I will present our latest data measuring simultaneously the locomotive and eye saccades of crabs in response to a panoramic stimulus moving in different directions and under different conditions (monocular, binocular). We found an apparent correlation between the variables obtained under some stimulation conditions but not others.

We are planning our next sandpit session in 2 months’ time (September).

Presentation by Yair Barnatan

ULTRACEPT Researchers Present at IEEE ICRA 2021

The 2021 IEEE International Conference on Robotics and Automation (IEEE ICRA 2021) was held in Xi’an, China from 31st May to 4th June 2021. One of the premier conferences in the field of robotics and automation, the event gathered thousands of excellent researchers from all over the world. Due to the pandemic, the conference was held in a hybrid format, combining a physical on-site meeting with virtual cloud sessions. Four ULTRACEPT researchers attended this event, three in person and one online.

Proactive Action Visual Residual Reinforcement Learning for Contact-Rich Tasks Using a Torque-Controlled Robot

Yunlei Shi: Proactive Action Visual Residual Reinforcement Learning for Contact-Rich Tasks Using a Torque-Controlled Robot

Agile Robots researcher Yunlei Shi attended ICRA 2021 online and presented his paper ‘Proactive Action Visual Residual Reinforcement Learning for Contact-Rich Tasks Using a Torque-Controlled Robot’.

Yunlei Shi is a full-time Ph.D. student at the Universität Hamburg and works at project partner Agile Robots, contributing to ULTRACEPT’s Work Package 4. In 2020 he visited Tsinghua University as part of the STEP2DYNA project.

Yunlei Shi presenting online at ICRA 2021

Yunlei presented his conference paper:

Yunlei Shi, Zhaopeng Chen, Hongxu Liu, Sebastian Riedel, Chunhui Gao, Qian Feng, Jun Deng, Jianwei Zhang, “Proactive Action Visual Residual Reinforcement Learning for Contact-Rich Tasks Using a Torque-Controlled Robot”, (ICRA) 2021, Xi’an, China.

Abstract

Contact-rich manipulation tasks are commonly found in modern manufacturing settings. However, manually designing a robot controller is considered hard for traditional control methods as the controller requires an effective combination of modalities and vastly different characteristics. In this paper, we first consider incorporating operational space visual and haptic information into a reinforcement learning (RL) method to solve the target uncertainty problems in unstructured environments. Moreover, we propose a novel idea of introducing a proactive action to solve a partially observable Markov decision process (POMDP) problem. With these two ideas, our method can either adapt to reasonable variations in unstructured environments or improve the sample efficiency of policy learning. We evaluated our method on a task that involved inserting a random-access memory (RAM) using a torque-controlled robot and tested the success rates of different baselines used in the traditional methods. We proved that our method is robust and can tolerate environmental variations.

Representation of policies and controller scheme. The blue region is the real-time controller, and the wheat region is the non-real-time trained policy.
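
The “residual” formulation referenced in the abstract is commonly implemented by summing the action of a fixed, hand-designed base controller with a small learned correction, so the policy only has to learn the part the base controller gets wrong. The sketch below shows that composition; the function names and scaling factor are our assumptions for illustration, not code from the paper.

```python
import numpy as np

def residual_action(obs, base_controller, residual_policy, scale=0.1):
    """Compose a fixed base controller with a learned residual correction.

    base_controller: hand-designed controller (e.g. a compliant torque
        controller) mapping observation -> action.
    residual_policy: RL-trained policy producing a small corrective action.
    scale: bounds the residual so learning starts near the base behaviour.
    """
    return base_controller(obs) + scale * residual_policy(obs)

# Toy usage with stand-in controllers:
base = lambda obs: -0.5 * obs        # e.g. a proportional regulator
policy = lambda obs: np.tanh(obs)    # placeholder for a trained network
print(residual_action(np.array([0.2, -0.1]), base, policy))
```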

More details about this paper can be viewed in this video on the Universität Hamburg’s Technical Aspects of Multimodal Systems (TAMS) YouTube channel.

Yunlei was very happy to attend this fantastic conference with support from the ULTRACEPT project.

A Versatile Vision-Pheromone-Communication Platform for Swarm Robotics

Three researchers from the University of Lincoln, Tian Liu, Xuelong Sun, and Qinbing Fu, attended ICRA 2021 in person to present their co-authored paper, ‘A Versatile Vision-Pheromone-Communication Platform for Swarm Robotics’.

ULTRACEPT researchers Tian Liu, Xuelong Sun and Qinbing Fu attending ICRA 2021

The three of us were very happy to attend this fantastic conference in person with support from the ULTRACEPT project.

Our co-authored paper, which presents the vision-pheromone-communication platform we developed, was published in the conference proceedings. Tian Liu delivered the presentation outlining our platform, and it drew attention from attendees, who asked interesting questions. The event provided a great opportunity to publicise our platform for future swarm robotics and social insect studies.

Tian Liu presenting at ICRA 2021

A Versatile Vision-Pheromone-Communication Platform for Swarm Robotics, Tian Liu, Xuelong Sun, Cheng Hu, Qinbing Fu, and Shigang Yue, University of Lincoln

Keywords: Biologically-Inspired Robots, Multi-Robot Systems, Swarm Robotics

Abstract: This paper describes a versatile platform for swarm robotics research. It integrates multiple pheromone communication with a dynamic visual scene along with real-time data transmission and localization of multiple-robots. The platform has been built for inquiries into social insect behavior and bio-robotics. By introducing a new research scheme to coordinate olfactory and visual cues, it not only complements current swarm robotics platforms which focus only on pheromone communications by adding visual interaction but also may fill an important gap in closing the loop from bio-robotics to neuroscience. We have built a controllable dynamic visual environment based on our previously developed ColCOSPhi (a multi-pheromones platform) by enclosing the arena with LED panels and interacting with the micro mobile robots with a visual sensor. In addition, a wireless communication system has been developed to allow the transmission of real-time bi-directional data between multiple micro robot agents and a PC host. A case study combining concepts from the internet of vehicles (IoV) and insect-vision inspired model has been undertaken to verify the applicability of the presented platform and to investigate how complex scenarios can be facilitated by making use of this platform.

We gained many interesting ideas and inspirations from colleagues in the robotics field, not only from the excellent talks but also from the high-quality robot exhibitions by well-known companies in the industry.

Conference presentations attended by the researchers at ICRA 2021
Demonstration at the ICRA 2021 conference

On the last day of the conference, we attended a wonderful tour of the Shaanxi History Museum and the Terra-Cotta Warriors, from which we learned a lot about the impressive history and culture of the Qin dynasty. It also made us reflect on the important role science and technology play in assisting archaeological excavation and the protection of cultural relics.

Thanks to the support of the ULTRACEPT project, we really enjoyed the whole event, which brought us not only new knowledge about robotics and history but also inspirations that will motivate our future research. In addition, our group’s research has been publicised via this top international conference.

What we can learn from insects: Unveiling insect navigation mechanisms

To aid and support the continued collaboration and knowledge exchange of the ULTRACEPT researchers, the consortium hosts online quarterly ‘Sandpit Sessions’. The aim of these sessions is to provide researchers an opportunity to share their work in an informal forum where they can raise and discuss issues and challenges in order to gain support and feedback from the group.

Researchers Xuelong Sun, a PhD student at the University of Lincoln, and Dr Qinbing Fu, Postdoc at Guangzhou University, recently hosted an online Sandpit Session on 14th May 2021. The theme of the session was ‘What we can learn from insects: Unveiling insect navigation mechanisms’.

What we can learn from insects: Unveiling insect navigation mechanisms

Sandpit Session 3: What we can learn from insects: Unveiling insect navigation mechanisms

  • Date: Friday, 14th May 2021
  • Time: UK 10:00; China 17:00; Germany 11:00; Argentina 06:00; Malaysia 17:00; Japan 18:00.
  • Facilitators: Xuelong Sun, PhD student, University of Lincoln; Qinbing Fu, Postdoc, Guangzhou University (chair)
  • Location: MS Teams
Sandpit Schedule

UK Time | Item | Presenters
10:00-10:05 | Arrival and welcome | Qinbing Fu, Xuelong Sun
10:05-10:35 | Insect navigation | Xuelong Sun
10:35-11:10 | Group discussion on the session topic: what is the role of computational models and biorobotics in understanding biological systems? | Facilitated by Qinbing Fu and Xuelong Sun
11:10-11:25 | Open forum discussion | Facilitated by Qinbing Fu and Xuelong Sun
11:25-11:30 | Final comments & volunteer for a facilitator for the next session | Xuelong Sun

‘Insect navigation’ (Xuelong Sun): Many insects are highly capable navigators, with abilities that rival those of mammals and other vertebrates. I will review insect navigation from three aspects: 1) the rich array of insect navigation behaviours, 2) the known brain regions and neuropils related to navigation tasks, and 3) computational models aiming to unravel the neural mechanisms of insect navigation. Then, from the computational modelling point of view, I will report our work filling current gaps in the understanding of insect navigation, especially visual navigation and optimal cue integration. This demonstrates the potentially useful role that computational models play in understanding biological systems, which closes the talk and opens the topic discussed in the following session.

The group discussion gave attendees the chance to raise questions and discuss the presented research and potential cooperation; the open forum offered attendees an opportunity to ask the group for advice regarding any challenges they are facing in their own research.

We are planning our next sandpit session for July 2021.

What we can learn from insects: Unveiling insect navigation mechanisms

You can learn more about Xuelong’s research in his post about his 12-month ULTRACEPT secondment to Guangzhou University.

Fang Lei Completes 12 Month Secondment at Guangzhou University, China

Fang Lei enrolled as a PhD Scholar at the University of Lincoln in 2019. In early 2020 she visited Guangzhou University as part of the STEP2DYNA project funded by the European Union’s Horizon 2020 Research and Innovation Programme under the Marie Skłodowska-Curie grant agreement. During this secondment Fang Lei was working on developing bio-inspired visual systems for collision detection in dim light environments. More recently, Fang continued this work during her 12-month secondment at Guangzhou University under ULTRACEPT, from May 2020 to May 2021.

During the secondment to Guangzhou University, I worked on developing bio-inspired visual systems for collision detection in dim light environments. For the autonomous navigation of vehicles or robots, detecting moving objects in extremely low-light conditions is a challenging task due to very low signal-to-noise ratios (SNRs). However, nocturnal insects possess remarkable visual abilities, perceiving motion cues and detecting moving objects in very dim light. The many studies of night vision in insect visual systems provide a great deal of inspiration for enhancing motion cues and modelling an artificial visual system that detects motion such as looming objects. Fig. 1 shows an example image of looming motion in a dim light environment, taken from the low-light video motion (LLVM) dataset captured with the experimental devices shown in Fig. 2.

Fig. 1 An example image of looming motion
Fig. 2 Experimental devices

To develop further ideas for my modelling work, I discussed it with colleagues and Prof. Peng (see Fig. 3) and received very useful suggestions. Our discussions focused mainly on the biological modelling of the direction selectivity of LGMD1. We also organized a weekly group seminar to discuss problems we encountered in our research projects, and I gained a lot of valuable experience in bio-inspired modelling by sharing ideas.

Fang with Prof. Peng and colleagues at Guangzhou

My research on collision detection in dim light environments comprises modelling the direction selectivity of the LGMD1 neuron and enhancing motion cues. I have developed a new LGMD1 model which is effective in distinguishing looming motion from translating motion. I published one conference paper and attended the online virtual conference (IJCNN 2021, see Fig. 4). I also submitted a journal paper to IEEE Transactions on Neural Networks and Learning Systems (TNNLS), which is under review. Additionally, I have finished the modelling work on motion cue enhancement and proposed a new model. Fig. 5 shows the enhancement results on dark image sequences captured during testing experiments.
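
The details of Fang’s enhancement model are in the paper under review; as background, a strategy frequently cited in the nocturnal insect vision literature is spatiotemporal summation, pooling the weak signal over neighbouring pixels and successive frames to raise the SNR at the cost of some motion blur. The sketch below illustrates that general idea only and is not Fang’s proposed model; the window sizes are arbitrary assumptions.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def spatiotemporal_summation(frames, t_window=4, s_size=3):
    """Pool a dim-light image sequence over time and space to raise SNR.

    frames: (T, H, W) array of grey-level images.
    t_window: number of consecutive frames averaged (temporal summation).
    s_size: side length of the spatial averaging window (spatial summation).
    Stronger pooling boosts SNR but blurs fast-moving objects.
    """
    frames = np.asarray(frames, dtype=float)
    out = np.empty_like(frames)
    for t in range(frames.shape[0]):
        # temporal pooling: average the most recent `t_window` frames
        clip = frames[max(0, t - t_window + 1):t + 1].mean(axis=0)
        # spatial pooling: local box filter
        out[t] = uniform_filter(clip, size=s_size, mode="nearest")
    return out
```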

Fig. 4 Online virtual conference of IJCNN 2021
Fig. 5 Testing captured dark image sequences and the experimental results

During this 12-month secondment, I gained a better knowledge of bio-inspired modelling and plenty of practice in connecting theory with application. I established good friendships with my colleagues through frequent communication in the weekly group seminar, which provides a basis for future cooperation. The secondment was a very precious experience for me. Many thanks to the ULTRACEPT project for supporting my research work and providing me with the opportunity to work together with my colleagues.

Fang Lei at GZHU

Hongxin Wang Completes 12 Month Secondment at Guangzhou University

Hongxin Wang received his PhD in computer science from the University of Lincoln in 2020. Following a secondment under the STEP2DYNA project, Dr Wang carried out a further secondment under the ULTRACEPT project from April 2020 to April 2021 at partner Guangzhou University. Here, he undertook research contributing to work packages 1 and 2. Dr Wang’s ULTRACEPT contributions have involved directing the research into computational modelling of motion vision neural systems for small target motion detection. 

University of Lincoln’s Experienced Researcher Dr Hongxin Wang recently completed a 12-month secondment at ULTRACEPT project partner Guangzhou University in China. The project is funded by the European Union’s Horizon 2020 Research and Innovation Programme under the Marie Skłodowska-Curie grant agreement. Dr Wang visited Guangzhou from April 2020 to April 2021 and contributed to Work Packages 1 and 2.

Dr Wang reflects on what he achieved during his secondment

Monitoring moving objects against complex natural backgrounds is a huge challenge for future robotic vision systems, and even more so when attempting to detect small targets only a few pixels in size, for example, an unmanned aerial vehicle (UAV) or a bird in the distance, as shown in Fig. 1. Remarkably, insects are adept at searching for mates and tracking prey, which appear as small dim speckles in the visual field. The exquisite sensitivity of insects to small target motion comes from a class of specific neurons called small target motion detectors (STMDs). Building a quantitative STMD model is the first step not only towards further understanding the biological visual system but also towards providing robust and economical small target detection solutions for artificial vision systems.

Fig. 1. Examples of small moving targets. (a) An unmanned aerial vehicle (UAV) and (b) a bird in the distance, with their surrounding regions enlarged in the red boxes. Both the UAV and the bird appear as dim speckles only a few pixels in size, in which most visual features are difficult to discern. In particular, both show extremely low contrast against the complex background.

During this twelve-month secondment, I continued my previous work on modelling insects’ visual systems for small target detection and made great progress. Specifically, we proposed an STMD-based model with time-delay feedback that achieves superior detection performance for fast-moving small targets whilst significantly suppressing false positives from slower-moving background features. This work has been submitted to IEEE Transactions on Neural Networks and Learning Systems and is currently under review. In addition, we developed an attention-prediction guided visual system to overcome the heavy dependency of existing models on target contrast against the background, as illustrated in Fig. 2. The paper presenting this work has been completed and will be submitted to IEEE Transactions on Cybernetics.

Fig. 2. Overall flowchart of the proposed attention and prediction guided visual system. It consists of a preprocessing module (left), an attention module (top), an STMD-based neural network (right), a prediction module (bottom), and a memorizer (middle).

During my 12-month secondment at Guangzhou University, I obtained inspiration and mathematical theory support from Professor Jigen Peng for designing the STMD-based visual systems. We organized a weekly seminar to discuss the latest biological findings, explore effective neural modelling methods, and develop specialised mathematical theory for bio-inspired motion detection. Significant progress was made with the help of Professor Jigen Peng.

Hongxin Wang on secondment at Guangzhou University

The secondment has also provided me with an opportunity to improve my mathematical ability with support from Professor Peng. Strong mathematical ability helps me better describe the insects’ visual systems and build robust neural models for small target motion detection. In addition, I established a deep friendship with Professor Peng and my colleagues at Guangzhou University, which provides a basis for future research collaborations. Lastly, I introduced our research to colleagues during our discussions, which may draw their attention to our research field and ultimately boost the development of neural system modelling.

The secondment has been an excellent experience and provided me with the opportunity to collaborate with my project colleagues. I am grateful for the support from the ULTRACEPT project, which has benefited me greatly.

Nikolas Andreakos Attends CVML Short Course: Machine Learning and Deep Neural Networks

Nikolas Andreakos is a PhD candidate at the University of Lincoln, working on developing computational models of associative memory formation and recognition in the mammalian hippocampus. Nikolas attended the CVML Short Course: Machine Learning and Deep Neural Networks which took place as a live web course on 17-18th February 2021. This short course was hosted by Aristotle University of Thessaloniki (AUTH).


About CVML Short Course: Machine Learning and Deep Neural Networks

This two-day short course focused on Machine Learning and Deep Neural Network theory, their applications in the below-mentioned diverse domains, and new challenges ahead.

  • Autonomous Systems (cars, drones, vessels),
  • Media Content and Art Creation (including fake data creation/detection), Social Media Analytics,
  • Medical Imaging and Diagnosis,
  • Financial Engineering (forecasting and analytics), Big Data Analytics,
  • Broadcasting, Internet and Communications,
  • Robotics/Control,
  • Intelligent Human-Machine Interaction, Anthropocentric (human-centred) Computing,
  • Smart Cities/Buildings and Assisted Living,
  • Scientific Modeling and Analytics.


The course consisted of two parts (A and B), each of which included 8 one-hour lectures and related material (slide PDFs, lecture videos, understanding questionnaires).

Part A lectures (8 hours) provided an in-depth presentation of Deep Neural Networks, which are at the forefront of AI advances today, starting with an introduction to Machine Learning.

Part B lectures (8 hours) provided an in-depth presentation of Machine Learning to complement DNNs.


You can find more details at the following link: https://icarus.csd.auth.gr/cvml-short-course-machine-learning-and-deep-neural-networks/


When asked about his experience, Nikolas said:

“Overall, it was a very well structured course and helped me to extend and strengthen my knowledge in ML and DL. Now I feel more comfortable using this knowledge in my project since computational neuroscience and ML could drive each other forward by using ideas from one domain to another”.

Below is the certificate provided to Nikolas after he completed the 16 questionnaires.

Nikolas Andreakos’ CVML course certificate

Workshop 3

Focussing on developments in brain-inspired hazard perception, the workshop reported on multiple-modality neural computation for collision detection, developments in neural vision chip structure and miniaturisation of the systems, and preliminary results relevant to WP2 and WP3.

The ULTRACEPT Workshop Three was hosted as an international conference by ULTRACEPT partner Guangzhou University (GZHU). It took place over two days on Thursday 25th & Friday 26th March 2021. Due to the travel restrictions caused by COVID-19, the workshop was held in person by the research group at GZHU and as an online event using MS Teams. Forty researchers attended the sessions.

GZHU researchers attending ULTRACEPT Workshop 3

International Workshop on Bio-Inspired Computation & Bio-Robotics (BICBIR 2021)

ULTRACEPT: Ultra-layered perception with brain-inspired information processing for vehicle collision avoidance: Workshop 3 & Annual Board Meeting

Date: Thursday 25th & Friday 26th March 2021

Location: Guangzhou University, Room 603, Block 2, Innovation Garden, Guangzhou HEMC North, Guangzhou, China 510006 and online MS Teams video conference

Day 1

Date: Thursday, 25 March 2021

Time: UK 10:00; China 18:00; Germany 11:00; Buenos Aires 07:00; Malaysia 18:00; Japan 19:00

Facilitator: Dr Qinbing Fu

UK time / China time | Item | Presenters
10:00-10:10 / 18:00-18:10 | Arrival and welcome | Prof Shigang Yue
10:10-11:10 / 18:10-19:10 | Ant-inspired celestial compass yields new opportunities for localization (40 minutes presentation & 20 minutes Q&A) | Dr Julien Serres
11:10-11:25 / 19:10-19:25 | Break: non-ULTRACEPT attendees to please leave the session in preparation for the ULTRACEPT board meeting |
11:25-11:55 / 19:25-19:55 | ULTRACEPT annual board meeting: Review of period 1 activities (attendees: ULTRACEPT board members) | Prof Shigang Yue
11:55-12:50 / 19:55-20:50 | ULTRACEPT annual board meeting: Round table discussion (attendees: ULTRACEPT board members and EU Project Officer, Irina Tiron) | All
12:50-13:00 / 20:50-21:00 | ULTRACEPT annual board meeting: Final comments | All

About the speaker: Julien Serres was born in Aix-en-Provence, France. He obtained an MSc degree in Electronics, Electrotechnics, and Automatic Control Engineering from Paris-Saclay University and the École Normale Supérieure Paris-Saclay, France, in 2003. In the same year, he joined the Biorobotics Group at the Institute of Movement Sciences, a joint research unit of CNRS and Aix-Marseille University, Marseille, France, under the supervision of Dr Nicolas Franceschini. He obtained his PhD degree from the University of Montpellier in 2008.

After spending 8 years as a teacher in Applied Physics (from 2006 to 2014) at the French Department of Education, training qualified technicians (a 2-year technical degree in Electrotechnics after the A level), he is at present a senior lecturer in the Biorobotics Group.

He is the author or co-author of 90 publications, including 1 patent and 20 indexed journal papers (h-index WoS = 10; h-index Google Scholar = 15). He has co-supervised 8 PhD students, and his current research interests include biorobotics, bio-inspired visual sensors, insect ethology, and the development of bio-inspired autopilots aimed at equipping autonomous robots.

The meeting was opened with an introduction by Prof Shigang Yue, the ULTRACEPT project Coordinator from the University of Lincoln (UoL). This was followed by a presentation from guest speaker Dr Julien Serres of Aix-Marseille University, ‘Ant-inspired celestial compass yields new opportunities for localization’.
Julien Serres presenting at ULTRACEPT Workshop 3

Following Dr Serres’ presentation was the annual ULTRACEPT board meeting, where the group was joined by the EU Project Officer, Irina Tiron.

Day 2

Date: Friday, 26 March 2021

Time: UK 10:00; China 18:00; Germany 11:00; Buenos Aires 07:00; Malaysia 18:00; Japan 19:00

Facilitator: Dr Qinbing Fu

UK time / China time | Item | Presenters
10:00-10:10 / 18:00-18:10 | Arrival and welcome | Prof Jigen Peng & Prof Shigang Yue
10:10-11:10 / 18:10-19:10 | Stereoscopic vision with an insect brain: how the praying mantis estimates depth (40 minutes presentation & 20 minutes Q&A) | Dr Ronny Rosner
11:10-11:40 / 19:10-19:40 | A versatile vision-pheromone-communication platform for swarm robotics, University of Lincoln (20 minutes presentation & 10 minutes Q&A) | Tian Liu
11:40-11:55 / 19:40-19:55 | Break |
11:55-12:25 / 19:55-20:25 | Pan-sharpening of Remote Sensing Images Based on Deep Neural Networks, Guangzhou University (20 minutes presentation & 10 minutes Q&A) | Dr Changsheng Zhou
12:25-12:55 / 20:25-20:55 | Implementing Refractoriness in LGMD Model: Challenges, Methods and Results, University of Lincoln (20 minutes presentation & 10 minutes Q&A) | Mu Hua
12:55-13:10 / 20:55-21:10 | Final comments | Prof Jigen Peng & Prof Shigang Yue

About the speaker: Ronny Rosner is a guest researcher at the Biosciences Institute of Newcastle University, United Kingdom, and a member of the Centre for Behaviour and Evolution. In 2003 he received a Diploma in Biology from Rostock University, Germany, where he participated in the development of biosensor chips for drug testing. He then switched fields to basic research in neurobiology. Ronny studies the small but sophisticated brains of insects.

In 2009 Ronny received his PhD from Bielefeld University, Germany. In the laboratories of Professor Martin Egelhaaf and Anne Kathrin Warzecha he worked on the variability of information processing for motion vision in blowflies. He discovered that visual processing for gaze stabilisation depends largely on the behavioural state of the animals.

After completing his thesis, Ronny worked as a research associate and university teacher at the University of Marburg, Germany. With Prof Uwe Homberg, Ronny studied the neuronal substrate for long-range navigation in locusts. He discovered that neurons in a major brain area for spatial orientation, the central complex, are not only sensitive to polarised light but also to visual motion. He also discovered that the activity of these neurons changes when the animals are walking as opposed to standing still.

In 2014 Ronny moved to Newcastle University, where he worked as a research associate in the group of Prof Jenny Read. There, he established neurophysiology and worked on the neuronal substrate for stereoscopic vision in praying mantids. He discovered the first neurons for stereoscopic vision in an invertebrate, an achievement recognised in scientific and public media internationally. Ronny is also a member of the community establishing the Drosophila connectome, a map of all neurons in the fruit fly brain including all synaptic connections. More recently Ronny has become interested in translating findings from insect neurobiology to machine vision.

Day 2 of ULTRACEPT Workshop 3 opened with a presentation from Dr Ronny Rosner of Newcastle University (UNEW), who discussed ‘Stereoscopic vision with an insect brain: how the praying mantis estimates depth’.
Ronny Rosner presenting at ULTRACEPT Workshop 3

Following Dr Rosner’s presentation, Tian Liu from UoL presented ‘A versatile vision-pheromone-communication platform for swarm robotics’.

Tian Liu presenting at ULTRACEPT Workshop 3

Dr Changsheng Zhou from host GZHU presented next with ‘Pan-sharpening of Remote Sensing Images Based on Deep Neural Networks’.

Dr Changsheng Zhou from host GZHU presenting at ULTRACEPT Workshop 3

The final presentation for the day was from UoL researcher Mu Hua. Mu is currently on an ULTRACEPT secondment at GZHU and presented his research ‘Implementing Refractoriness in LGMD Model: Challenges, Methods and Results’.

Mu Hua presenting at ULTRACEPT Workshop 3