
Nikolas Andreakos Attends CVML Short Course: Machine Learning and Deep Neural Networks

Nikolas Andreakos is a PhD candidate at the University of Lincoln, working on developing computational models of associative memory formation and recognition in the mammalian hippocampus. Nikolas attended the CVML Short Course: Machine Learning and Deep Neural Networks, which took place as a live web course on 17th-18th February 2021. This short course was hosted by the Aristotle University of Thessaloniki (AUTH).


About CVML Short Course: Machine Learning and Deep Neural Networks

This two-day short course focused on Machine Learning and Deep Neural Network theory, their applications in the diverse domains listed below, and the new challenges ahead.

  • Autonomous Systems (cars, drones, vessels)
  • Media Content and Art Creation (including fake data creation/detection), Social Media Analytics
  • Medical Imaging and Diagnosis
  • Financial Engineering (forecasting and analytics), Big Data Analytics
  • Broadcasting, Internet and Communications
  • Robotics/Control
  • Intelligent Human-Machine Interaction, Anthropocentric (human-centred) Computing
  • Smart Cities/Buildings and Assisted Living
  • Scientific Modeling and Analytics


The course consisted of two parts (A and B), each comprising 8 one-hour lectures and related material (slide PDFs, lecture videos, and comprehension questionnaires).

Part A lectures (8 hours) provided an in-depth presentation of Deep Neural Networks, which are at the forefront of AI advances today, starting with an introduction to Machine Learning.

Part B lectures (8 hours) provided an in-depth presentation of Machine Learning to complement DNNs.


You can find more details at the following link: https://icarus.csd.auth.gr/cvml-short-course-machine-learning-and-deep-neural-networks/


When asked about his experience, Nikolas said:

“Overall, it was a very well-structured course and helped me to extend and strengthen my knowledge in ML and DL. Now I feel more comfortable using this knowledge in my project, since computational neuroscience and ML could drive each other forward by using ideas from one domain in the other.”

Below is the certificate awarded to Nikolas after he completed the 16 questionnaires.


Workshop 3

Workshop 3 focused on developments in brain-inspired hazard perception, reporting on multiple-modality neural computation for collision detection, developments in neural vision chip structure and the miniaturisation of these systems, and preliminary results relevant to WP2 and WP3.

The ULTRACEPT Workshop 3 was hosted as an international conference by ULTRACEPT partner Guangzhou University (GZHU). It took place over two days, Thursday 25th and Friday 26th March 2021. Due to the travel restrictions caused by COVID-19, the workshop was held in person by the research group at GZHU and as an online event using MS Teams; 40 researchers attended the sessions.

GZHU researchers attending ULTRACEPT Workshop 3

International Workshop on Bio-Inspired Computation & Bio-Robotics (BICBIR 2021)

ULTRACEPT: Ultra-layered perception with brain-inspired information processing for vehicle collision avoidance: Workshop 3 & Annual Board Meeting

Date: Thursday 25th & Friday 26th March 2021

Location: Guangzhou University, Room 603, Block 2, Innovation Garden, Guangzhou HEMC North, Guangzhou, China 510006 and online MS Teams video conference

Day 1

Date: Thursday, 25 March 2021

Time: UK 10:00; China 18:00; Germany 11:00; Buenos Aires 07:00; Malaysia 18:00; Japan 19:00

Facilitator: Dr Qinbing Fu

Schedule (UK time / China time):

  • 10:00-10:10 / 18:00-18:10 – Arrival and welcome (Prof Shigang Yue)
  • 10:10-11:10 / 18:10-19:10 – Invited talk: “Ant-inspired celestial compass yields new opportunities for localization”, 40-minute presentation and 20-minute Q&A (Dr Julien Serres)
  • 11:10-11:25 / 19:10-19:25 – Break. Non-ULTRACEPT attendees were asked to leave the session in preparation for the ULTRACEPT board meeting.
  • 11:25-11:55 / 19:25-19:55 – ULTRACEPT annual board meeting: review of Period 1 activities. Attendees: ULTRACEPT board members (Prof Shigang Yue)
  • 11:55-12:50 / 19:55-20:50 – ULTRACEPT annual board meeting: round table discussion. Attendees: ULTRACEPT board members and EU Project Officer, Irina Tiron (All)
  • 12:50-13:00 / 20:50-21:00 – ULTRACEPT annual board meeting: final comments (All)

About the speaker: Julien Serres was born in Aix-en-Provence, France. He obtained an MSc degree in Electronics, Electrotechnics, and Automatic Control Engineering from Paris-Saclay University and the École Normale Supérieure Paris-Saclay, France, in 2003. In 2003, he joined the Biorobotics Group at the Institute of Movement Sciences, a joint research unit of the CNRS and Aix-Marseille University, Marseille, France, under the supervision of Dr Nicolas Franceschini. He obtained his PhD degree from the University of Montpellier in 2008.

After spending eight years (2006 to 2014) as a teacher of Applied Physics for the French Department of Education, training qualified technicians (a two-year technical degree in Electrotechnics taken after the A level), he is at present a senior lecturer in the Biorobotics Group.

He is the author or co-author of 90 publications, including 1 patent and 20 indexed journal papers (h-index WoS = 10; h-index Google Scholar = 15). He has co-supervised 8 PhD students, and his current research interests include biorobotics, bio-inspired visual sensors, insect ethology, and the development of bio-inspired autopilots aimed at equipping autonomous robots.

The meeting was opened with an introduction by Prof Shigang Yue, the ULTRACEPT project Coordinator from the University of Lincoln (UoL). This was followed by a presentation from guest speaker Dr Julien Serres of Aix-Marseille University, who presented ‘Ant-inspired celestial compass yields new opportunities for localization’.
Julien Serres presenting at ULTRACEPT Workshop 3

Following Dr Serres’ presentation was the annual ULTRACEPT board meeting, where the group was joined by EU Project Officer Irina Tiron.

Day 2

Date: Friday, 26 March 2021

Time: UK 10:00; China 18:00; Germany 11:00; Buenos Aires 07:00; Malaysia 18:00; Japan 19:00

Facilitator: Dr Qinbing Fu

Schedule (UK time / China time):

  • 10:00-10:10 / 18:00-18:10 – Arrival and welcome (Prof Jigen Peng & Prof Shigang Yue)
  • 10:10-11:10 / 18:10-19:10 – Invited talk: “Stereoscopic vision with an insect brain: how the praying mantis estimates depth”, 40-minute presentation and 20-minute Q&A (Dr Ronny Rosner)
  • 11:10-11:40 / 19:10-19:40 – “A versatile vision-pheromone-communication platform for swarm robotics”, University of Lincoln, 20-minute presentation and 10-minute Q&A (Tian Liu)
  • 11:40-11:55 / 19:40-19:55 – Break
  • 11:55-12:25 / 19:55-20:25 – “Pan-sharpening of Remote Sensing Images Based on Deep Neural Networks”, Guangzhou University, 20-minute presentation and 10-minute Q&A (Dr Changsheng Zhou)
  • 12:25-12:55 / 20:25-20:55 – “Implementing Refractoriness in LGMD Model: Challenges, Methods and Results”, University of Lincoln, 20-minute presentation and 10-minute Q&A (Mu Hua)
  • 12:55-13:10 / 20:55-21:10 – Final comments (Prof Jigen Peng & Prof Shigang Yue)

About the speaker: Ronny Rosner is a guest researcher at the Biosciences Institute of Newcastle University, United Kingdom, and a member of the Centre for Behaviour and Evolution. In 2003 he received a Diploma in Biology from Rostock University, Germany, where he participated in the development of biosensor chips for drug testing. He then switched fields to basic research in neurobiology; Ronny studies the small but sophisticated brains of insects.

In 2009 Ronny received his PhD from Bielefeld University, Germany. In the laboratories of Professor Martin Egelhaaf and Anne Kathrin Warzecha he worked on the variability of information processing for motion vision in blowflies. He discovered that visual processing for gaze stabilisation depends largely on the behavioural state of the animals.

After completing his thesis, Ronny worked as a research associate and university teacher at the University of Marburg, Germany. With Prof Uwe Homberg, Ronny studied the neuronal substrate for long-range navigation in locusts. He discovered that neurons in a major brain area for spatial orientation, the central complex, are sensitive not only to polarised light but also to visual motion. He also discovered that the activity of these neurons changes when the animals are walking as opposed to standing still.

In 2014 Ronny moved to Newcastle University, where he worked as a research associate in the group of Prof Jenny Read. There, he established neurophysiology and worked on the neuronal substrate for stereoscopic vision in praying mantids, discovering the first neurons for stereoscopic vision in an invertebrate; the achievement was recognised in scientific and public media internationally. Ronny is also a member of the community establishing the Drosophila connectome, a map of all neurons in the fruit fly brain including all synaptic connections. More recently Ronny became interested in translating findings from insect neurobiology to machine vision.

Day 2 of ULTRACEPT Workshop 3 opened with a presentation from Dr Ronny Rosner of Newcastle University (UNEW), who discussed ‘Stereoscopic vision with an insect brain: how the praying mantis estimates depth’.
Ronny Rosner presenting at ULTRACEPT Workshop 3

Following Dr Rosner’s presentation, Tian Liu from UoL presented ‘A versatile vision-pheromone-communication platform for swarm robotics’.

Tian Liu presenting at ULTRACEPT Workshop 3

Dr Changsheng Zhou from host GZHU presented next with ‘Pan-sharpening of Remote Sensing Images Based on Deep Neural Networks’.

Dr Changsheng Zhou from host GZHU presenting at ULTRACEPT Workshop 3

The final presentation for the day was from UoL researcher Mu Hua. Mu is currently on an ULTRACEPT secondment at GZHU and presented his research ‘Implementing Refractoriness in LGMD Model: Challenges, Methods and Results’.

Mu Hua presenting at ULTRACEPT Workshop 3

Universität Hamburg Hosts ULTRACEPT Sandpit Session – Robotic Grasping based on Deep Learning: Towards the Robust Perception

To aid and support the continued collaboration and knowledge exchange of the ULTRACEPT researchers, the consortium hosts online quarterly ‘Sandpit Sessions’. The aim of these sessions is to provide researchers an opportunity to share their work in an informal forum where they can raise and discuss issues and challenges in order to gain support and feedback from the group.

Researchers Hongzhuo Liang and Shuang Li from the Universität Hamburg (UHAM) recently hosted a Sandpit Session on the 5th March 2021. The theme of the session was Robotic Grasping based on Deep Learning: Towards the Robust Perception. 28 attendees across the consortium participated.

Hongzhuo Liang presenting at the ULTRACEPT Sandpit Session

Sandpit Session 2: Robotic Grasping based on Deep Learning: Towards the Robust Perception

  • Date: Friday, 5th March 2021
  • Time: UK 10:00; China 18:00; Germany 11:00; Argentina 07:00; Malaysia 18:00; Japan 19:00.
  • Facilitators: Hongzhuo Liang, PhD student, Universität Hamburg and Shuang Li, PhD student, Universität Hamburg
  • Location: MS Teams
Sandpit Schedule
  • 10:00-10:05 – Arrival and welcome (Shuang Li)
  • 10:05-10:35 – Talk: “Robotic Grasping based on Deep Learning: Towards the Robust Perception” (Hongzhuo Liang)
  • 10:35-11:10 – Group discussion on the session topic: how do bio-inspired methods help to design a better robotic grasping agent? Attendees could raise questions and discuss the research presented and potential cooperation. (Facilitated by Shuang Li)
  • 11:10-11:25 – Open forum discussion: an opportunity for attendees to ask the group for advice regarding any challenges they are facing in their own research. (Facilitated by Shuang Li)
  • 11:25-11:30 – Final comments and a call for a volunteer to facilitate the next session, planned for May 2021. (Shuang Li)

Talk abstract: “Autonomous robotic grasping is a challenging task which spans many research areas, e.g. perception, robotics, and psychology. In this talk, I will review the state of the art in robotic grasping. Then, I will report the current progress of my grasping work: from two-finger grasping to multi-finger grasping based on deep learning methods. This session is intended to encourage an open-minded, quick exchange of ideas.”

Hongzhuo Liang and Shuang Li discussing their research ideas with the attendees

More detailed information about Hongzhuo Liang’s research publications and code can be found on his website.

To find out more about the fascinating work being carried out by the team at UHAM, check out their YouTube channel TAMS UHAM.

Dr Qinbing Fu Completes 12 Month Secondment in China

Dr Qinbing Fu received his PhD from the University of Lincoln in October 2018. Following a secondment under the STEP2DYNA project, Dr Fu carried out a further secondment under the ULTRACEPT project from August 2019 to August 2020 at partner Guangzhou University, where he undertook research contributing to Work Packages 1 and 4. Dr Fu then went on to work as a postdoctoral researcher with Professor Shigang Yue until January 2021. Dr Fu’s ULTRACEPT contributions have involved directing research into the computational modelling of motion vision neural systems and its applications in robotics. His research achievements and outputs for the project so far are outlined in this blog post.

Research Outcomes

In support of the ULTRACEPT project, Dr Fu has published seven research papers including five journal papers and two conference papers. He was the first author on five of the publications and co-authored the other two. His main achievements have included:

  • The modelling of LGMD-1 and LGMD-2 collision perception neural network models, with applications in robot and vehicle scenarios (a minimal sketch of this model family appears below);
  • The modelling of the Drosophila motion vision neural system for decoding the direction of a foreground translating object against a moving cluttered background;
  • A review of the related field of research;
  • The integration of multiple neural system models for collision sensing.
Using the Colias robots for modelling work on collision detection visual systems
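
For readers unfamiliar with this family of models, the sketch below illustrates the layered structure that LGMD-type collision detectors generally share: a photoreceptor layer responding to luminance change, a delayed lateral-inhibition layer, and a summation layer pooled into a single membrane potential. This is a minimal illustrative sketch in Python, not Dr Fu's published implementation; the filter size, inhibition weight, and threshold are assumptions made for the example.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def lgmd_step(prev_frame, curr_frame, prev_excitation, w_inhib=0.7):
    """One time step of a minimal LGMD-like collision detector.

    P layer: absolute luminance change between consecutive frames.
    I layer: lateral inhibition, modelled as a spatial blur of the
             previous excitation (inhibition arrives one frame late).
    S layer: excitation minus weighted inhibition, half-wave rectified.
    Output:  sigmoid-normalised membrane potential in [0.5, 1).
    """
    # Photoreceptor layer: respond to luminance change only
    excitation = np.abs(curr_frame.astype(float) - prev_frame.astype(float))
    # Inhibition layer: delayed excitation spread to neighbouring cells
    inhibition = uniform_filter(prev_excitation, size=3)
    # Summation layer: excitation suppressed by lateral inhibition
    s_layer = np.maximum(excitation - w_inhib * inhibition, 0.0)
    # LGMD cell: pool the whole S layer into one membrane potential
    k = s_layer.sum() / s_layer.size
    membrane = 1.0 / (1.0 + np.exp(-k))
    return membrane, excitation  # excitation feeds the next step's I layer

# Usage: iterate over grayscale frames and flag an approaching object
# when the membrane potential stays above a threshold for a few frames.
# exc = np.zeros_like(frames[0], dtype=float)
# for prev, curr in zip(frames, frames[1:]):
#     pot, exc = lgmd_step(prev, curr, exc)
#     approaching = pot > 0.88  # hypothetical threshold
```

An approaching object produces rapidly expanding edges that outrun the delayed inhibition, so the membrane potential rises sharply just before collision, which is the cue such models exploit.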

Dr Fu’s research outputs can be found on his personal web pages on Google Scholar and ResearchGate. In addition, Qinbing directed promising research on building visually dynamic walls in an arena to test the on-board visual system. These research ideas have been collated and summarised in his research papers.

Dr Fu’s research contributions have fully supported ULTRACEPT’s WP1 and WP4. This includes modelling work on collision detection visual systems with systematic experiments on vehicle scenarios and also the integration of multiple neural system models for motion perception.

Using the Colias robots for modelling work on collision detection visual systems

Secondment at Guangzhou University, China

Dr Fu carried out his ULTRACEPT secondment at project partner GZHU in China, where he worked with Professor Jigen Peng. During this period he developed in several respects, becoming a more mature researcher in the academic community: pursuing ambitious research ideas, collaborating with group members to complete research papers, coordinating teamwork, disseminating the project, communicating with global partners, and writing project proposals. The secondment has undoubtedly been very successful for Dr Fu.

Dissemination Activities

Dr Fu has undertaken a number of dissemination activities to promote the ULTRACEPT research outcomes. On the 28th July 2020, he presented his research at the ULTRACEPT online Workshop 2 on the topic of “Adaptive Inhibition Matters to Robust Collision Perception in Highly Variable Environments”. At this event, he exchanged ideas with project partners.

Dr Fu presents at ULTRACEPT Workshop 2

Dr Fu also facilitated an online ULTRACEPT Sandpit Session on 27 November 2020, where he gave a talk on “Past, Present, and Future Modelling on Bio-Inspired Collision Detection Visual Systems: Towards the Robust Perception”.

Dr Fu presents at ULTRACEPT’s First Sand Pit Session

On 18th December 2020, Dr Fu attended the 2020 IEEE International Conference on Advanced Robotics and Mechatronics (ICARM) held in Shenzhen, China, where he presented his research paper entitled “Complementary visual neuronal systems model for collision sensing”. He also chaired a session on “Biomimetics” during this conference.

Dr Qinbing Fu presents his research at the IEEE ICARM 2020 Conference

Tian Liu Completes 12 Month Secondment at Guangzhou University

Tian Liu enrolled as a PhD scholar at the University of Lincoln in 2018. In 2018-2019 he visited Guangzhou University as part of the STEP2DYNA project, funded by the European Union’s Horizon 2020 Research and Innovation Programme under the Marie Skłodowska-Curie grant agreement. During this secondment Tian developed the ColCOSΦ experiment platform for research on social insects and swarm robotics, and investigated how multiple virtual pheromones affect swarm robots. More recently, Tian completed a 12-month secondment under ULTRACEPT at Guangzhou University.

Tian Liu recently completed this second 12-month secondment at project partner Guangzhou University in China as part of the ULTRACEPT project, funded by the European Union’s Horizon 2020 Research and Innovation Programme under the Marie Skłodowska-Curie grant agreement. Tian visited Guangzhou from November 2019 to November 2020 and was involved in Work Packages 1 and 4.

Tian reflects on what he has achieved during his time in Guangzhou

Most social insects, such as ants, have only a tiny brain. However, by cooperating in large numbers they can complete very difficult and complex tasks, such as building a large nest or collecting food along rugged routes. They can do this because pheromones act as an important communication medium.

During this 12-month secondment, I continued to focus my attention on swarm robots with multiple pheromones. I believe that it is the interaction of multiple pheromones that enables insects to perform such demanding tasks, rather than the single-pheromone mechanism that is now so widely studied. I worked with ULTRACEPT researcher Xuelong Sun and Dr Cheng Hu to develop ColCOSΦ, which can easily implement multiple-pheromone experiments. We evaluated the effects of multiple pheromones in swarm robotics through several case studies in which robots simulated ants foraging, hunting, and carrying out deployment tasks.
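
To make the multiple-pheromone idea concrete: ColCOSΦ displays virtual pheromones on a screen beneath the robots, each pheromone with its own evaporation and diffusion dynamics. The Python sketch below is a minimal illustration of that concept only; the grid size, rates, and method names are invented for the example and this is not the actual ColCOSΦ code.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

class PheromoneField:
    """Minimal multi-pheromone grid (illustrative, not the ColCOSPhi code).

    Each pheromone is a 2-D concentration map with its own evaporation
    and diffusion rate; robots deposit at their grid cell and sense the
    local gradient to steer.
    """

    def __init__(self, shape=(200, 200)):
        self.shape = shape
        self.maps = {}    # pheromone name -> concentration grid
        self.params = {}  # pheromone name -> (evaporation, diffusion sigma)

    def add_pheromone(self, name, evaporation=0.98, diffusion_sigma=0.6):
        self.maps[name] = np.zeros(self.shape)
        self.params[name] = (evaporation, diffusion_sigma)

    def deposit(self, name, x, y, amount=1.0):
        self.maps[name][y, x] += amount  # a robot drops pheromone here

    def step(self):
        """Advance one tick: every pheromone diffuses and evaporates."""
        for name, grid in self.maps.items():
            evap, sigma = self.params[name]
            self.maps[name] = gaussian_filter(grid, sigma) * evap

    def gradient(self, name, x, y):
        """Finite-difference gradient a robot would sense at (x, y)."""
        g = self.maps[name]
        dx = g[y, min(x + 1, self.shape[1] - 1)] - g[y, max(x - 1, 0)]
        dy = g[min(y + 1, self.shape[0] - 1), x] - g[max(y - 1, 0), x]
        return dx, dy

# Example: an attractive 'food' trail coexisting with a volatile 'alarm'
field = PheromoneField()
field.add_pheromone("food", evaporation=0.99, diffusion_sigma=0.4)
field.add_pheromone("alarm", evaporation=0.90, diffusion_sigma=1.2)
field.deposit("food", 100, 100)
field.step()
```

Giving each pheromone its own evaporation and diffusion constants is what allows a short-lived alarm signal to coexist with a persistent foraging trail, which is exactly the kind of interaction between pheromones that the case studies exploited.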

I showcased the outcomes of this research at both the ICARM 2019 and ICARM 2020 international conferences.


Tian Liu presenting at ICARM 2020

Due to its excellent scalability, we also used the platform for experiments in related fields. For example, it can simulate traffic scenarios, which let us test our LGMD model (a collision detection model) on the Colias micro robot in a low-cost way.


Besides olfaction, visual information is also a very important input for insects, so we implemented a changeable visual environment on ColCOSΦ to investigate how to make full use of both olfactory and visual information in a swarm task. The research was collated into two articles, submitted to ICRA 2021 with fellow ULTRACEPT researchers Xuelong Sun, Dr Qinbing Fu and Dr Cheng Hu.


The secondment has been an excellent experience and gave me the opportunity to collaborate closely with my project colleagues.

Many thanks to the ULTRACEPT project for supporting my research and for allowing me to work with these outstanding research scholars.

Yair Barnatan from UBA attends XXXV Annual Meeting of the Argentinian Society for Neuroscience Research

Yair Barnatan is an ULTRACEPT PhD student at the University of Buenos Aires working in the field of neuroethology. He is currently focused on neuronal processing of optic flow in the crustacean visual system, unravelling which neurons are involved in this process and how.

Yair attended the XXXV Annual Meeting of the Argentinian Society for Neuroscience Research, SAN 2020. Due to the global pandemic, this event was held virtually from 7th to 9th October 2020.

This congress covered a wide variety of neuroscience topics, such as sensory and motor systems, neurodegenerative diseases, and learning and memory. At the meeting, Yair presented a poster entitled “Functional evidence of the crustacean lobula plate as optic flow processing center” (Barnatan, Y., Tomsic, D. & Sztarker, J.).


Abstract

When an animal rotates, it produces wide-field image motion over its retina, termed optic flow (OF). OF blurs the image, compromising the ability to see. Image shifts are stabilized by compensatory behaviors collectively termed the optomotor response (OR). In most vertebrates and decapod crustaceans, such reflex behavior mainly involves eye movements consisting of a slow tracking phase of the wide-field image motion followed by a fast resetting phase. We used the mud crab Neohelice granulata to tackle a major question in crustacean visual processing: which region of the brain is the neural substrate for processing OF? It has long been known that the dipteran lobula plate (3rd optic neuropil) is the center involved in processing OF information. Recently, a crustacean lobula plate was characterized by neuroanatomical techniques, sharing many canonical features with the dipteran neuropil. In this work we present a functional evaluation of the role of the crab’s lobula plate in compensatory eye movements to rotational OF by performing electrolytic lesion experiments. We show that lesioning the lobula plate greatly impairs the OR while leaving intact other visually guided behaviors, such as the avoidance response to an approaching stimulus. Even when the OR is present in some lobula-plate-lesioned animals, they show reduced eye-tracking speed. Altogether, these results provide strong evidence for an evolutionarily conserved site for processing optic flow shared by crustaceans and insects.


ULTRACEPT Consortium Holds Its First Sand Pit Session

To aid and support the continued collaboration and knowledge exchange of the ULTRACEPT researchers, the consortium has commenced online quarterly ‘Sandpit Sessions’. The aim of these sessions is to provide researchers an opportunity to share their work in an informal forum where they can raise and discuss issues and challenges in order to gain support and feedback from the group.

As the session is facilitated by a researcher, it also provides them with a professional development opportunity to gain experience in facilitating an international group workshop.

The first session was held on 27th November 2020, hosted on MS Teams and facilitated by UoL researcher Dr Qinbing Fu. Dr Fu’s session focused on ‘Past, Present, and Future Modelling on Bio-Inspired Collision Detection Visual Systems: Towards the Robust Perception’. 23 attendees from across the consortium participated.

Dr Fu presents at ULTRACEPT’s First Sand Pit Session

Sandpit Session 1: Past, Present, and Future Modelling on Bio-Inspired Collision Detection Visual Systems: Towards the Robust Perception

  • Date: Friday, 27th November 2020
  • Time: UK 10:00; China 18:00; Germany 11:00; Argentina 07:00; Malaysia 18:00; Japan 19:00.
  • Facilitator: Dr Qinbing Fu, Postdoctoral Researcher, University of Lincoln
  • Location: MS Teams
Sandpit Schedule
  • 10:00-10:05 – Arrival and welcome (Qinbing Fu)
  • 10:05-10:35 – Talk: “Past, Present, and Future Modelling on Bio-Inspired Collision Detection Visual Systems: Towards the Robust Perception” (Qinbing Fu)
  • 10:35-11:00 – Group discussion about the session topic, where attendees could raise questions and discuss the research presented. (Facilitated by Qinbing Fu)
  • 11:00-11:25 – Open forum discussion: an opportunity for attendees to ask the group for advice regarding any challenges they are facing in their own research. (Facilitated by Qinbing Fu)
  • 11:25-11:30 – Final comments and a call for a volunteer to facilitate the next session, planned for January 2021. (Qinbing Fu)

Talk abstract: “In this talk, I report on past, present, and future modelling of bio-inspired collision detection visual systems, towards robust perception in challenging real-world scenarios. The emphasis is laid first on a brief review of the development of such visual systems in recent decades, specifically the typical insect-inspired collision-sensing visual systems. After that, I will articulate the current challenges and discuss promising solutions and worthwhile future efforts to improve the modelling, consolidating the link between neuroscience and computational modelling. This session is intended to encourage an open-minded, quick exchange of ideas.”

The session was scheduled for 1.5 hours and focused on the topic ‘Past, Present, and Future Modelling on Bio-Inspired Collision Detection Visual Systems: Towards the Robust Perception’.

Dr Fu presents at ULTRACEPT’s First Sand Pit Session

Dr Fu’s presentation was followed by a group discussion on the subject matter and then an open forum discussion, which gave attendees an opportunity for Q&A.

These Sandpit sessions enable the consortium members to continue their collaborative working and knowledge exchange, particularly during the COVID-19 travel restrictions where all activities have needed to be virtual.

Huatian Wang Publishes Paper in Neural Networks

Huatian Wang enrolled as a PhD scholar at the University of Lincoln in January 2017. During his PhD, he carried out a 12-month secondment as an Early-Stage Researcher for the European Union’s Horizon 2020 STEP2DYNA (691154) project in 2017-18 at Tsinghua University. Following this, Huatian carried out further secondments under the European Union’s Horizon 2020 ULTRACEPT (778062) project from 2019-2020: one month at Guangzhou University (GZHU), then 11 months at Xi’an Jiaotong University (XJTU). His research areas include image processing, insect vision and motion detection.

University of Lincoln researcher Huatian Wang recently published a paper titled “A bioinspired angular velocity decoding neural network model for visually guided flights” in Neural Networks. Neural Networks is the archival journal of the world’s three oldest neural modeling societies: the International Neural Network Society (INNS), the European Neural Network Society (ENNS), and the Japanese Neural Network Society (JNNS). It has a significant influence on neuroscience, especially cognitive neuroscience.


About the Paper

Efficient and robust motion perception systems are important pre-requisites for achieving visually guided flights in future micro air vehicles. As a source of inspiration, the visual neural networks of flying insects such as honeybee and Drosophila provide ideal examples on which to base artificial motion perception models. In our paper “A bioinspired angular velocity decoding neural network model for visually guided flights”, we have used this approach to develop a novel method that solves the fundamental problem of estimating angular velocity for visually guided flights.

Compared with previous models, our elementary motion detector (EMD) based model uses a separate texture estimation pathway to effectively decode angular velocity, and demonstrates considerable independence from the spatial frequency and contrast of the gratings.
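
As background, an elementary motion detector correlates the signal of one photoreceptor with a delayed copy of its neighbour’s signal, and subtracts the mirror-image correlation to obtain direction selectivity. The Python sketch below shows a classic Hassenstein-Reichardt EMD pair for illustration only; the published model builds on this principle but adds the separate texture-estimation pathway described above, which is not reproduced here.

```python
import numpy as np

def emd_response(left, right, tau=2.0):
    """Classic Hassenstein-Reichardt elementary motion detector (EMD).

    left, right: 1-D luminance signals over time from two neighbouring
    photoreceptors. Each arm low-pass filters (delays) one input and
    multiplies it with the undelayed neighbour; subtracting the two
    arms makes the output direction-selective.
    """
    def lowpass(x, tau):
        # First-order low-pass filter acting as the EMD delay element
        y = np.zeros_like(x, dtype=float)
        alpha = 1.0 / tau
        for t in range(1, len(x)):
            y[t] = y[t - 1] + alpha * (x[t] - y[t - 1])
        return y

    # Preferred direction (left -> right) minus null direction
    return lowpass(left, tau) * right - lowpass(right, tau) * left

# A grating drifting from the left sensor toward the right one
# produces a positive mean response.
t = np.arange(200)
left = np.sin(0.2 * t)
right = np.sin(0.2 * (t - 3))  # same signal arriving 3 steps later
print(emd_response(left, right).mean() > 0)  # True
```

A known weakness of the raw EMD, and the motivation for the texture pathway above, is that its output depends on the spatial frequency and contrast of the pattern, not only on the true angular velocity.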

Using the Unity development platform, the model is further tested in tunnel centering and terrain following paradigms in order to reproduce the visually guided flight behaviors of honeybees. In a series of controlled trials, the virtual bee utilizes the proposed angular velocity control schemes to accurately navigate through a patterned tunnel while maintaining a suitable distance from the undulating textured terrain. The results are consistent with both neuron spike recordings and behavioral path recordings of real honeybees, demonstrating the model’s potential for implementation in micro air vehicles that have only visual sensors.

About the Research Experience

Huatian shares his recent research experience which contributed to this publication. 

2020 was a difficult year for all of us. After a one-year secondment in China funded by the EU Horizon 2020 project ULTRACEPT, I had to stay in China due to the travel restrictions. Thanks to the university’s policy, I could apply to work remotely from home to continue my research. My supervisor, Prof Shigang Yue, organized an online group meeting every week so that we could talk with each other freely. This benefited my study a lot, and I was able to make progress every week and update my research regularly.

Publication in Neural Networks is an encouragement for me to continue my research on modeling the visual systems of insects. Thanks to the ULTRACEPT project for its support, and to my supervisor Prof Shigang Yue and my research colleagues for their kind support.

A group meeting photo

This paper is available as open access:

Huatian Wang, Qinbing Fu, Hongxing Wang, Paul Baxter, Jigen Peng, Shigang Yue, “A bioinspired angular velocity decoding neural network model for visually guided flights”, Neural Networks, 2020, ISSN 0893-6080, https://doi.org/10.1016/j.neunet.2020.12.008 (http://www.sciencedirect.com/science/article/pii/S0893608020304251)

Xuelong Sun presents at Neuromatch Conference March 2020

Building on the successful mind-matching session at the Cognitive Computational Neuroscience (CCN) conference, a free web-based unconference for neuroscientists called “neuromatch” was created.

The neuromatch 1.0 conference was held on 30th and 31st March 2020. The conference agenda included a significant number of international speakers.

Our ULTRACEPT researcher Xuelong Sun presented his work on insect navigation at the conference. Considering the current travel restrictions caused by Covid-19, this was an excellent opportunity to continue to promote the ULTRACEPT project work in an innovative, safe and effective way.


Xuelong presented his work ‘A Decentralised Neural Model Explaining Optimal Integration Of Navigational Strategies in Insects’. Xuelong is carrying out this work with Dr Michael Mangan and Prof Shigang Yue.

A copy of Xuelong’s presentation can be accessed here.

To learn more about this research, please refer to the paper Modelling the Insect Navigation Toolkit: How the Mushroom Bodies and Central Complex Coordinate Guidance Strategies, https://doi.org/10.1101/856153.

Xuelong Sun’s presentation at the Neuromatch conference, March 2020

Development of an Angular Velocity Decoding Model Accounting for Honeybees’ Visually Guided Flights

Huatian Wang received his BSc and MSc degrees in Applied Mathematics from Xi’an Jiaotong University in 2014 and 2017, respectively. He was awarded a Marie Curie Fellowship to work on the EU FP7 project LIVCODE (295151) as a Research Assistant in 2016.

Huatian enrolled as a PhD scholar at the University of Lincoln in January 2017. During his PhD, he carried out a 12-month secondment as an Early-Stage Researcher for the European Union’s Horizon 2020 STEP2DYNA (691154) project in 2017-18 at Tsinghua University. Following this, Huatian carried out further secondments under the European Union’s Horizon 2020 ULTRACEPT (778062) project from 2019-2020: one month at Guangzhou University (GZHU), then 11 months at Xi’an Jiaotong University (XJTU). His research areas include image processing, insect vision and motion detection.

I was mainly involved in ULTRACEPT Work Package 1. The research focuses on modelling the visual processing systems of flying insects like Drosophila and honeybees, whose extraordinary navigation ability in cluttered environments provides perfect inspiration for designing artificial neural networks. These networks can be used to guide the visual flight of micro air vehicles.

Although insects like flies and honeybees have tiny brains, they can deal with very complex visual flight tasks. Research has been undertaken for decades to understand how they detect visual motion. However, the neural mechanisms explaining their variety of behaviours, including patterned tunnel centring and terrain following, are still not clear. According to behavioural experiments performed on honeybees, the key to their excellent flight control ability is angular velocity estimation and regulation.

To solve the fundamental problem of angular velocity estimation, we proposed a novel angular velocity decoding model that explains the honeybee’s flight behaviours of tunnel centring and terrain following, and that reproduces the observed independence from the spatial frequency and contrast of gratings in the visually guided flights of honeybees. The model combines temporal and texture information to decode the angular velocity, and its estimates are little affected by spatial frequency and contrast in synthetic grating experiments. The model is also tested behaviourally in Unity with the tunnel centring and terrain following paradigms; a demo video can be found on YouTube here, in which the simulated bee flies over a textured terrain using only ventral visual information to avoid collision.
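
To illustrate how such angular velocity estimates can drive flight, the toy rules below balance the left and right estimates for tunnel centring and regulate the ventral estimate to a set point for terrain following. The gains, set point, and sign conventions are invented for illustration; this is a sketch of the general control idea, not the control scheme from the paper.

```python
def tunnel_centring_step(av_left, av_right, k_lateral=0.5):
    """Toy angular-velocity balance rule for tunnel centring.

    av_left / av_right: angular velocity estimates (e.g. from the
    decoding model) for the left and right visual fields. Drifting
    toward a wall makes image motion on that side faster, so steering
    toward the slower side re-centres the agent.

    Returns a lateral velocity command; positive means "move left"
    under the convention that av_right grows as the right wall nears.
    """
    return k_lateral * (av_right - av_left)

def terrain_following_step(av_ventral, av_setpoint=1.0, k_climb=0.4):
    """Toy ventral angular-velocity regulator for terrain following.

    Holding ventral image motion at a set point keeps ground clearance
    proportional to forward speed: ventral flow that is too fast means
    the ground is too close, so a positive climb rate is commanded.
    """
    return k_climb * (av_ventral - av_setpoint)
```

The appeal of this style of control is that neither rule needs an explicit distance measurement; both walls and ground are regulated purely through image motion, which is consistent with the behavioural observations of honeybees described above.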

During my secondment, I presented a poster on our work at the IJCNN 2019 conference in Budapest, which you can read about here. This gave me the opportunity to share my research with the scientific community at the conference. The picture shows me discussing the work with other researchers during the poster session.

Huatian Wang attending the International Joint Conference on Neural Networks (IJCNN), July 2019

I also attended and presented my work at the ULTRACEPT mid-term meeting in February 2020 which you can read about here. Due to Covid-19 travel restrictions, I was not able to attend the event in person. Instead, I attended and presented via video conference.

Huatian Wang presenting at the ULTRACEPT mid-term meeting, February 2020

These secondments provided me with the opportunity to work with leading academics in this field of research. For example, I was able to discuss the mathematical model of elementary motion detection and signal simulation using sinusoidal gratings with Prof. Jigen Peng at GZHU, as well as the sparse reconstruction method in compressed sensing theory with Dr. Angang Cui at XJTU.

I also worked alongside fellow researchers. For example, I helped Dr. Qinbing Fu build up a database on collision detection in various automotive scenes; we collected videos using a dashboard camera and made suitable cuts using video editing software.

I also attended numerous seminars and guest lectures. For example, I attended a seminar on solving sparse linear systems using smooth approximation methods. These experiences helped me to develop my skills and knowledge and to further my research.

During the final two months of my secondment I had to work from my home in China, since the university closed due to Covid-19. However, I was able to use this time to hold video conference discussions with my supervisors in both Xi’an and Lincoln. I also used my desktop computer to run simulation experiments and spent time preparing academic research papers.

Thanks to the support of the ULTRACEPT project, I was able to introduce our work to other groups and attract their attention to this research field, which is helpful for improving the impact of our research.

During my one-year secondment in China, I established friendships with Prof. Peng and other colleagues at Guangzhou University and Xi’an Jiaotong University. The cooperation with colleagues at these institutions boosted the development of the neural modelling for visual navigation, and I was able to introduce the ULTRACEPT project to other researchers at GZHU and XJTU. My mathematical analysis skills improved significantly through the cooperation with Prof. Peng, and my programming skills also improved with my colleagues’ help.