
ULTRACEPT Researchers Present at IEEE ICRA 2021

The 2021 International Conference on Robotics and Automation (IEEE ICRA 2021) was held in Xi’an, China from 31st May to 4th June 2021. One of the premier conferences in the field of robotics and automation, the event gathered thousands of excellent researchers from all over the world. Due to the pandemic, the conference was held in a hybrid format, combining a physical on-site meeting with virtual cloud sessions.

ULTRACEPT researchers Tian Liu, Xuelong Sun and Qinbing Fu attending ICRA 2021

The three of us were delighted to attend this fantastic conference in person with the support of the ULTRACEPT project.

Our co-authored paper presenting the vision-pheromone-communication platform we developed was published in the conference proceedings. Tian Liu delivered the presentation outlining the platform, which attracted attention from attendees through interesting questions from the audience. The event provided a great opportunity to raise the profile of our platform for future swarm robotics and social insect studies.

Tian Liu presenting at ICRA 2021

A Versatile Vision-Pheromone-Communication Platform for Swarm Robotics, Tian Liu, Xuelong Sun, Cheng Hu, Qinbing Fu, and Shigang Yue, University of Lincoln

Keywords: Biologically-Inspired Robots, Multi-Robot Systems, Swarm Robotics

Abstract: This paper describes a versatile platform for swarm robotics research. It integrates multiple pheromone communication with a dynamic visual scene along with real-time data transmission and localization of multiple-robots. The platform has been built for inquiries into social insect behavior and bio-robotics. By introducing a new research scheme to coordinate olfactory and visual cues, it not only complements current swarm robotics platforms which focus only on pheromone communications by adding visual interaction but also may fill an important gap in closing the loop from bio-robotics to neuroscience. We have built a controllable dynamic visual environment based on our previously developed ColCOSPhi (a multi-pheromones platform) by enclosing the arena with LED panels and interacting with the micro mobile robots with a visual sensor. In addition, a wireless communication system has been developed to allow the transmission of real-time bi-directional data between multiple micro robot agents and a PC host. A case study combining concepts from the internet of vehicles (IoV) and insect-vision inspired model has been undertaken to verify the applicability of the presented platform and to investigate how complex scenarios can be facilitated by making use of this platform.

We gathered many interesting ideas and inspirations from colleagues in the robotics field, not only from the excellent talks but also from the high-quality robot exhibitions by well-known companies in the industry.

Conference presentations attended by the researchers at ICRA 2021
Demonstration at the ICRA 2021 conference

On the last day of the conference, we attended a wonderful tour of the Shaanxi History Museum and the Terracotta Warriors, from which we learned a great deal about the impressive history and culture of the Qin dynasty. The visit also made us reflect on the important role science and technology play in assisting archaeological excavation and the protection of cultural relics.

Thanks to the support of the ULTRACEPT project, we thoroughly enjoyed the whole event, which brought us not only new knowledge about robotics and history but also enlightening inspiration that will motivate our future research. In addition, our group’s research has been disseminated via this top international conference.

What we can learn from insects: Unveiling insect navigation mechanism

To aid and support the continued collaboration and knowledge exchange of the ULTRACEPT researchers, the consortium hosts online bi-monthly ‘Sandpit Sessions’. The aim of these sessions is to provide researchers an opportunity to share their work in an informal forum where they can raise and discuss issues and challenges in order to gain support and feedback from the group.

Researchers Xuelong Sun, a PhD student at the University of Lincoln, and Dr Qinbing Fu, a postdoc at Guangzhou University, recently hosted an online Sandpit Session on the 14th May 2021. The theme of the session was ‘What we can learn from insects: Unveil insect navigation mechanism’.


Sandpit Session 3: What we can learn from insects: Unveil insect navigation mechanism

  • Date: Friday, 14th May 2021
  • Time: UK 10:00; China 17:00; Germany 11:00; Argentina 06:00; Malaysia 17:00; Japan 18:00.
  • Facilitators: Xuelong Sun, PhD student, University of Lincoln, Qinbing Fu, Postdoc, Guangzhou University (chair)
  • Location: MS Teams
Sandpit Schedule (UK time)

10:00-10:05: Arrival and welcome (Qinbing Fu, Xuelong Sun)

10:05-10:35: Insect navigation (Xuelong Sun)

Many insects are highly capable navigators, with abilities that rival those of mammals and other vertebrates. I will give a review of insect navigation from three aspects: 1) the rich array of insect navigation behaviours, 2) the known brain regions and neuropils related to navigation tasks, and 3) computational models aiming to unravel the neural mechanisms of insect navigation. Then, from the computational modelling point of view, I will report our work filling current gaps in the understanding of insect navigation, especially visual navigation and optimal cue integration. This will demonstrate the potentially useful role that computational models play in understanding biological systems, which closes this session and opens the topic to be discussed in the next session.

10:35-11:10: Group discussion about the session topic: "What is the role of computational models and biorobotics in understanding biological systems?"

A group discussion where attendees can raise questions and discuss the research topic and potential cooperation presented. Facilitated by Qinbing Fu and Xuelong Sun.

11:10-11:25: Open forum discussion

An opportunity for attendees to ask the group for advice regarding any challenges they are facing with their own research. Facilitated by Qinbing Fu and Xuelong Sun.

11:25-11:30: Final comments & volunteering of a facilitator for the next session (Xuelong Sun)

We are planning our next sandpit session for July 2021.
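The optimal cue integration mentioned in the session abstract can be illustrated with the standard reliability-weighted (maximum-likelihood) combination rule, in which each cue is weighted inversely to its variance. This is a generic textbook sketch, not the model presented in the session; the name `integrate_cues` and the numbers are hypothetical, and real heading cues would additionally require circular statistics.

```python
import numpy as np

def integrate_cues(mu1, var1, mu2, var2):
    """Fuse two independent noisy estimates of the same quantity.

    Weights are inversely proportional to each cue's variance, so the
    more reliable cue dominates; the fused estimate always has lower
    variance than either cue alone.
    """
    w1 = (1.0 / var1) / (1.0 / var1 + 1.0 / var2)
    w2 = 1.0 - w1
    fused_mean = w1 * mu1 + w2 * mu2
    fused_var = 1.0 / (1.0 / var1 + 1.0 / var2)
    return fused_mean, fused_var

# Example: an unreliable cue (variance 1.0) and a reliable one (variance 0.25)
mean, var = integrate_cues(0.0, 1.0, 2.0, 0.25)
```

With these numbers the reliable cue receives 80% of the weight, pulling the fused estimate to 1.6 with variance 0.2, lower than either individual variance.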


You can learn more about Xuelong’s research on his post about his 12 month ULTRACEPT secondment to Guangzhou University.

Fang Lei completes 12 month secondment at Guangzhou University, China

Fang Lei enrolled as a PhD scholar at the University of Lincoln in 2019. In early 2020 she visited Guangzhou University as part of the STEP2DYNA project, funded by the European Union’s Horizon 2020 Research and Innovation Programme under the Marie Skłodowska-Curie grant agreement. During this secondment Fang Lei worked on developing bio-inspired visual systems for collision detection in dim light environments. More recently, Fang continued this work during her 12 month secondment at Guangzhou University under ULTRACEPT, from May 2020 to May 2021.

During the secondment at Guangzhou University, I worked on developing bio-inspired visual systems for collision detection in dim light environments. For the autonomous navigation of vehicles or robots, detecting moving objects in extremely low-light conditions is a challenging task due to very low signal-to-noise ratios (SNRs). However, nocturnal insects possess remarkable visual abilities, perceiving motion cues and detecting moving objects in very dim light. The many studies on the night vision of insects’ visual systems provide a great deal of inspiration for enhancing motion cues and modelling an artificial visual system to detect motion such as looming objects. Fig. 1 shows an example image of looming motion in a dim light environment from the low-light video motion (LLVM) dataset, obtained with the experimental devices shown in Fig. 2.

Fig. 1 An example image of looming motion
Fig. 2 Experimental devices
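Nocturnal insects are believed to cope with low light partly by summing photoreceptor signals over time and space, trading temporal and spatial resolution for a higher signal-to-noise ratio. The sketch below illustrates only that general principle with plain spatiotemporal averaging; it is not the enhancement model developed during the secondment, and the name `enhance_dim_frames` is hypothetical.

```python
import numpy as np

def enhance_dim_frames(frames, temporal_window=5, spatial_kernel=3):
    """Toy dim-light enhancement by spatiotemporal summation.

    Averaging N independent noisy samples reduces noise standard
    deviation by roughly sqrt(N), at the cost of motion blur and
    spatial resolution, the same trade-off nocturnal vision makes.
    """
    # Temporal summation: average the most recent frames.
    t_avg = np.stack(frames[-temporal_window:]).mean(axis=0)

    # Spatial summation: average each pixel with its k x k neighbourhood.
    k = spatial_kernel
    pad = k // 2
    padded = np.pad(t_avg, pad, mode="edge")
    out = np.empty_like(t_avg)
    for i in range(t_avg.shape[0]):
        for j in range(t_avg.shape[1]):
            out[i, j] = padded[i:i + k, j:j + k].mean()
    return out
```

Applied to a stack of noisy frames of a static scene, the output's pixel noise is markedly lower than any single frame's, which is what makes faint motion cues detectable downstream.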

To develop more ideas and experience for my modelling work, I discussed it with other colleagues and Prof. Peng (see Fig. 3) and received very useful suggestions. We mainly discussed the biological modelling of the direction selectivity of LGMD1. We also organised a weekly group seminar to discuss problems encountered in our research projects, and I gained a lot of valuable experience in bio-inspired modelling through sharing ideas.

Fig. 3 Fang with Prof. Peng and colleagues at Guangzhou

My research on collision detection in dim light environments comprises modelling the direction selectivity of the LGMD1 neuron and enhancing motion cues. I have developed a new LGMD1 model which is effective in distinguishing looming motion from translating motion. I published one conference paper and attended the online virtual conference (IJCNN 2021, see Fig. 4), and I also submitted a journal paper to IEEE Transactions on Neural Networks and Learning Systems (TNNLS), which is under review. Additionally, I have finished the modelling work on motion cue enhancement and proposed a new model. Fig. 5 shows the enhancement results for the dark image sequences captured during testing experiments.

Fig. 4 Online virtual conference of IJCNN 2021
Fig. 5 Testing captured dark image sequences and the experimental results

During this 12-month secondment, I gained a better knowledge of bio-inspired modelling and plenty of practice in connecting theory with practice. I established good friendships with my colleagues through frequent communication at the weekly group seminar, which provides a basis for future cooperation. The secondment was a very precious experience for me. Many thanks to the ULTRACEPT project for supporting my research work and providing me with the opportunity to work with my colleagues.

Fang Lei at GZHU

Hongxin Wang Completes 12 Month Secondment at Guangzhou University

Hongxin Wang received his PhD in computer science from the University of Lincoln in 2020. Following a secondment under the STEP2DYNA project, Dr Wang carried out a further secondment under the ULTRACEPT project from April 2020 to April 2021 at partner Guangzhou University. Here, he undertook research contributing to work packages 1 and 2. Dr Wang’s ULTRACEPT contributions have involved directing the research into computational modelling of motion vision neural systems for small target motion detection. 

The University of Lincoln’s experienced researcher Dr Hongxin Wang recently completed a 12 month secondment at ULTRACEPT project partner Guangzhou University in China. The project is funded by the European Union’s Horizon 2020 Research and Innovation Programme under the Marie Skłodowska-Curie grant agreement. Dr Wang visited Guangzhou from April 2020 to April 2021 and contributed to Work Packages 1 and 2.

Dr Wang reflects on what he has achieved during secondment

Monitoring moving objects against complex natural backgrounds is a huge challenge for future robotic vision systems, and even more so when attempting to detect small targets only a few pixels in size, for example an unmanned aerial vehicle (UAV) or a bird in the distance, as shown in Fig. 1. Remarkably, insects are adept at searching for mates and tracking prey, which appear as small dim speckles in the visual field. The exquisite sensitivity of insects to small target motion comes from a class of specific neurons called small target motion detectors (STMDs). Building a quantitative STMD model is the first step not only towards further understanding the biological visual system but also towards providing robust and economical solutions for small target detection in artificial vision systems.

Fig. 1. Examples of small moving targets: (a) an unmanned aerial vehicle (UAV) and (b) a bird in the distance, with their surrounding regions enlarged in the red boxes. Both the UAV and the bird appear as dim speckles only a few pixels in size, where most visual features are difficult to discern. In particular, both show extremely low contrast against the complex background.
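To give a feel for the centre-surround style of processing involved, here is a toy sketch (an illustrative caricature, not the STMD models described in this post; all names are hypothetical). It computes a frame-difference motion signal and then subtracts each pixel's local surround average, a crude form of lateral inhibition: responses to extended moving edges largely cancel, while a moving target only a few pixels in size survives.

```python
import numpy as np

def _box_mean(img, k):
    """Mean of each pixel's k x k neighbourhood (edge-padded)."""
    pad = k // 2
    p = np.pad(img, pad, mode="edge")
    out = np.empty_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = p[i:i + k, j:j + k].mean()
    return out

def small_target_response(prev_frame, curr_frame, surround=7):
    """Motion signal with surround inhibition favouring small targets."""
    motion = np.abs(curr_frame.astype(float) - prev_frame.astype(float))
    # Subtract the local surround average: extended motion inhibits
    # itself, whereas an isolated few-pixel target barely does.
    response = motion - _box_mean(motion, surround)
    return np.clip(response, 0.0, None)
```

Feeding in two frames where both a single bright pixel and a large bar have moved, the response at the small target stays near its full motion value while the interior of the bar is suppressed towards zero.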

During this twelve-month secondment, I continued my previous work on modelling insects’ visual systems for small target detection and made substantial progress. Specifically, we proposed an STMD-based model with time-delay feedback that achieves superior detection performance for fast-moving small targets whilst significantly suppressing slower-moving background false positives. This work has been submitted to IEEE Transactions on Neural Networks and Learning Systems and is currently under review. In addition, we developed an attention-prediction guided visual system to overcome the heavy dependence of existing models on target contrast against the background, as illustrated in Fig. 2. The paper presenting this work has been completed and will be submitted to IEEE Transactions on Cybernetics.

Fig. 2. Overall flowchart of the proposed attention and prediction guided visual system. It consists of a preprocessing module (left), an attention module (top), an STMD-based neural network (right), a prediction module (bottom), and a memorizer (middle).

During my 12 month secondment at Guangzhou University, I obtained inspiration and mathematical theory support from Professor Jigen Peng for designing the STMD-based visual systems. We organised a weekly seminar to discuss the latest biological findings, explore effective neural modelling methods, and develop specialised mathematical theory for bio-inspired motion detection. Significant progress was made with the help of Professor Jigen Peng.

Hongxin Wang on secondment at Guangzhou University

The secondment has also provided me with an opportunity to improve my mathematical ability with support from Professor Peng. Strong mathematical ability helps me better describe insects’ visual systems and build robust neural models for small target motion detection. In addition, I established a deep friendship with Professor Peng and my colleagues at Guangzhou University, which provides a basis for future research collaborations. Lastly, I introduced our research to colleagues during our discussions, which may attract their attention to our research field and ultimately boost the development of neural system modelling.

The secondment has been an excellent experience and gave me the opportunity to collaborate with my project colleagues. I am grateful for the support of the ULTRACEPT project, which has benefited me greatly.

Annual Board Meeting March 2021

The ULTRACEPT annual board meeting was hosted by ULTRACEPT partner Guangzhou University (GZHU). Due to the travel restrictions caused by COVID-19, the workshop was held in person by the research group at GZHU and as an online event using MS Teams on Thursday 25th March 2021. The meeting was combined with the ULTRACEPT Workshop 3 event. Attendees included students, researchers, academic staff, partner leads, and the project manager. The EU Project Officer Irina Tiron joined the group for the board meeting component of the event.

ULTRACEPT annual board meeting

The mid-term meeting provided partners with an opportunity to engage in a fruitful and constructive dialogue between the consortium and the Research Executive Agency. The impact of COVID-19 on the project was also discussed as well as strategies to overcome the impact of the ongoing travel restrictions on the secondments. Thus far, the consortium has been able to progress the work packages by working remotely during lockdowns and also keeping in touch and collaborating via online platforms.

GZHU researchers attending the ULTRACEPT meeting

Nikolas Andreakos Attends CVML Short Course: Machine Learning and Deep Neural Networks

Nikolas Andreakos is a PhD candidate at the University of Lincoln, working on developing computational models of associative memory formation and recognition in the mammalian hippocampus. Nikolas attended the CVML Short Course: Machine Learning and Deep Neural Networks which took place as a live web course on 17-18th February 2021. This short course was hosted by Aristotle University of Thessaloniki (AUTH).


About CVML Short Course: Machine Learning and Deep Neural Networks

This two-day short course focused on Machine Learning and Deep Neural Network theory, their applications in the diverse domains listed below, and the new challenges ahead.

  • Autonomous Systems (cars, drones, vessels)
  • Media Content and Art Creation (including fake data creation/detection), Social Media Analytics
  • Medical Imaging and Diagnosis
  • Financial Engineering (forecasting and analytics), Big Data Analytics
  • Broadcasting, Internet and Communications
  • Robotics/Control
  • Intelligent Human-Machine Interaction, Anthropocentric (human-centred) Computing
  • Smart Cities/Buildings and Assisted Living
  • Scientific Modeling and Analytics


The course consisted of two parts (A and B), each comprising 8 one-hour lectures and related material (slide PDFs, lecture videos, and comprehension questionnaires).

Part A lectures (8 hours) provided an in-depth presentation of Deep Neural Networks, which are at the forefront of AI advances today, starting with an introduction to Machine Learning.

Part B lectures (8 hours) provided an in-depth presentation of Machine Learning to complement DNNs.


You can find more details in the following link: https://icarus.csd.auth.gr/cvml-short-course-machine-learning-and-deep-neural-networks/


When asked about his experience, Nikolas said:

“Overall, it was a very well structured course and helped me to extend and strengthen my knowledge in ML and DL. Now I feel more comfortable using this knowledge in my project since computational neuroscience and ML could drive each other forward by using ideas from one domain to another”.

Below is the certificate awarded to Nikolas after he completed the 16 questionnaires.


Workshop 3

The workshop focused on developments in brain-inspired hazard perception, reporting on multi-modality neural computation for collision detection, developments in neural vision chip structure, miniaturisation of the systems, and preliminary results relevant to WP2, WP3 and WP4.

ULTRACEPT Workshop 3 was hosted as an international conference by ULTRACEPT partner Guangzhou University (GZHU). It took place over two days, Thursday 25th and Friday 26th March 2021. Due to the travel restrictions caused by COVID-19, the workshop was held in person by the research group at GZHU and as an online event using MS Teams. Forty researchers attended the sessions.

GZHU researchers attending ULTRACEPT Workshop 3

International Workshop on Bio-Inspired Computation & Bio-Robotics (BICBIR 2021)

ULTRACEPT: Ultra-layered perception with brain-inspired information processing for vehicle collision avoidance: Workshop 3 & Annual Board Meeting

Date: Thursday 25th & Friday 26th March 2021

Location: Guangzhou University, Room 603, Block 2, Innovation Garden, Guangzhou HEMC North, Guangzhou, China 510006 and online MS Teams video conference

Day 1

Date: Thursday, 25 March 2021

Time: UK 10:00; China 18:00; Germany 11:00; Buenos Aires 07:00; Malaysia 18:00; Japan 19:00

Facilitator: Dr Qinbing Fu

Schedule (UK time / China time)

10:00-10:10 / 18:00-18:10: Arrival and welcome (Prof Shigang Yue)

10:10-11:10 / 18:10-19:10: Ant-inspired celestial compass yields new opportunities for localization (Dr Julien Serres); 40-minute presentation and 20-minute Q&A
Julien Serres was born in Aix-en-Provence, France. He obtained an MSc degree in Electronics, Electrotechnics, and Automatic Control Engineering from Paris-Saclay University and the École Normale Supérieure Paris-Saclay, France, in 2003. That year he joined the Biorobotics Group at the Institute of Movement Sciences, a joint CNRS and Aix-Marseille University research unit in Marseille, France, under the supervision of Dr Nicolas Franceschini. He obtained his PhD from the University of Montpellier in 2008.

After spending 8 years as a teacher in Applied Physics (from 2006 to 2014) with the French Department of Education, training qualified technicians (a 2-year technical degree in Electrotechnics taken after the A-level equivalent), he is at present a senior lecturer with the Biorobotics Group.

He is the author or co-author of 90 publications, including 1 patent and 20 indexed journal papers (h-index WoS = 10; h-index Google Scholar = 15). He has co-supervised 8 PhD students, and his current research interests include biorobotics, bio-inspired visual sensors, insect ethology, and the development of bio-inspired autopilots aimed at equipping autonomous robots.

11:10-11:25 / 19:10-19:25: Break (non-ULTRACEPT attendees asked to leave the session in preparation for the ULTRACEPT board meeting)

11:25-11:55 / 19:25-19:55: ULTRACEPT annual board meeting: review of period 1 activities (Prof Shigang Yue); attendees: ULTRACEPT board members

11:55-12:50 / 19:55-20:50: ULTRACEPT annual board meeting: round table discussion (all); attendees: ULTRACEPT board members and EU Project Officer Irina Tiron

12:50-13:00 / 20:50-21:00: ULTRACEPT annual board meeting: final comments (all)
The meeting opened with an introduction by Prof Shigang Yue, the ULTRACEPT project coordinator from the University of Lincoln (UoL). This was followed by a presentation from guest speaker Dr Julien Serres of Aix-Marseille University, who presented ‘Ant-inspired celestial compass yields new opportunities for localization’.
Julien Serres presenting at ULTRACEPT workshop 3

Following Dr Serres’ presentation, the annual ULTRACEPT board meeting took place, at which the group was joined by the EU Project Officer, Irina Tiron.

Day 2

Date: Friday, 26 March 2021

Time: UK 10:00; China 18:00; Germany 11:00; Buenos Aires 07:00; Malaysia 18:00; Japan 19:00

Facilitator: Dr Qinbing Fu

Schedule (UK time / China time)

10:00-10:10 / 18:00-18:10: Arrival and welcome (Prof Jigen Peng & Prof Shigang Yue)

10:10-11:10 / 18:10-19:10: Stereoscopic vision with an insect brain: how the praying mantis estimates depth (Dr Ronny Rosner); 40-minute presentation and 20-minute Q&A

Ronny Rosner is a guest researcher at the Biosciences Institute of Newcastle University, United Kingdom, and a member of the Centre for Behaviour and Evolution. In 2003 he received a Diploma in Biology from Rostock University, Germany, where he participated in the development of biosensor chips for drug testing. He then switched fields to basic research in neurobiology. Ronny studies the small but sophisticated brains of insects.

In 2009 Ronny received his PhD from Bielefeld University, Germany. In the laboratories of Professor Martin Egelhaaf and Anne Kathrin Warzecha he worked on the variability of information processing for motion vision in blowflies. He discovered that visual processing for gaze stabilisation depends largely on the behavioural state of the animals.

After completion of his thesis, Ronny worked as a research associate and university teacher at the University of Marburg, Germany. With Prof Uwe Homberg, Ronny studied the neuronal substrate for long range navigation in locusts. He discovered that neurons in a major brain area for spatial orientation, the central complex, are not only sensitive to polarised light but also to visual motion. He also discovered that the activity of these neurons changes when the animals are walking as opposed to when they are standing still.

In 2014 Ronny moved to Newcastle University, where he worked as a research associate in the group of Prof Jenny Read. There, he established neurophysiology and worked on the neuronal substrate for stereoscopic vision in praying mantids. He discovered the first neurons for stereoscopic vision in an invertebrate, an achievement recognised internationally in scientific and public media. Ronny is also a member of the community establishing the Drosophila connectome, a map of all neurons in the fruit fly brain including all synaptic connections. More recently, Ronny has become interested in translating findings from insect neurobiology to machine vision.

11:10-11:40 / 19:10-19:40: A versatile vision-pheromone-communication platform for swarm robotics (Tian Liu, University of Lincoln); 20-minute presentation and 10-minute Q&A

11:40-11:55 / 19:40-19:55: Break

11:55-12:25 / 19:55-20:25: Pan-sharpening of Remote Sensing Images Based on Deep Neural Networks (Dr Changsheng Zhou, Guangzhou University); 20-minute presentation and 10-minute Q&A

12:25-12:55 / 20:25-20:55: Implementing Refractoriness in LGMD Model: Challenges, Methods and Results (Mu Hua, University of Lincoln); 20-minute presentation and 10-minute Q&A

12:55-13:10 / 20:55-21:10: Final comments (Prof Jigen Peng & Prof Shigang Yue)
Day 2 of ULTRACEPT Workshop 3 opened with a presentation from Dr Ronny Rosner of Newcastle University (UNEW), who discussed ‘Stereoscopic vision with an insect brain: how the praying mantis estimates depth’.
Ronny Rosner presenting at ULTRACEPT workshop 3

Following Dr Rosner’s presentation, Tian Liu from UoL presented ‘A versatile vision-pheromone-communication platform for swarm robotics’.

Tian Liu presenting at ULTRACEPT Workshop 3

Dr Changsheng Zhou from host GZHU presented next with ‘Pan-sharpening of Remote Sensing Images Based on Deep Neural Networks’.

Dr Changsheng Zhou presenting at ULTRACEPT Workshop 3

The final presentation for the day was from UoL researcher Mu Hua. Mu is currently on an ULTRACEPT secondment at GZHU and presented his research ‘Implementing Refractoriness in LGMD Model: Challenges, Methods and Results’.

Mu Hua presenting at ULTRACEPT workshop 3

Universität Hamburg Hosts ULTRACEPT Sandpit Session – Robotic Grasping based on Deep Learning: Towards the Robust Perception

To aid and support the continued collaboration and knowledge exchange of the ULTRACEPT researchers, the consortium hosts online bi-monthly ‘Sandpit Sessions’. The aim of these sessions is to provide researchers an opportunity to share their work in an informal forum where they can raise and discuss issues and challenges in order to gain support and feedback from the group.

Researchers Hongzhuo Liang and Shuang Li from the Universität Hamburg (UHAM) recently hosted a Sandpit Session on the 5th March 2021. The theme of the session was Robotic Grasping based on Deep Learning: Towards the Robust Perception. 28 attendees across the consortium participated.

Hongzhuo Liang presenting at the ULTRACEPT Sandpit Session

Sandpit Session 2: Robotic Grasping based on Deep Learning: Towards the Robust Perception

  • Date: Friday, 5th March 2021
  • Time: UK 10:00; China 18:00; Germany 11:00; Argentina 07:00; Malaysia 18:00; Japan 19:00.
  • Facilitators: Hongzhuo Liang, PhD student, Universität Hamburg and Shuang Li, PhD student, Universität Hamburg
  • Location: MS Teams
Sandpit Schedule (UK time)

10:00-10:05: Arrival and welcome (Shuang Li)

10:05-10:35: Robotic Grasping based on Deep Learning: Towards the Robust Perception (Hongzhuo Liang)

Autonomous robotic grasping is a challenging task which draws on many research areas, e.g. perception, robotics, and psychology. In this talk, I will review the state of the art in robotic grasping. Then, I will report the current progress of my grasping work: from two-finger grasping to multi-finger grasping based on deep learning methods. The session is intended to encourage an open-minded, quick exchange of ideas.

10:35-11:10: Group discussion about the session topic: "How do bio-inspired methods help to design a better robotic grasping agent?"

A group discussion where attendees can raise questions and discuss the research topic and potential cooperation presented. Facilitated by Shuang Li.

11:10-11:25: Open forum discussion

An opportunity for attendees to ask the group for advice regarding any challenges they are facing with their own research. Facilitated by Shuang Li.

11:25-11:30: Final comments & volunteering of a facilitator for the next session (Shuang Li)

We are planning our next sandpit session for May 2021.
Hongzhuo Liang and Shuang Li discussing their research ideas with the attendees

More detailed information about Hongzhuo Liang’s research publications and code can be found on his website.

To find out more about the fascinating work being carried out by the team at UHAM, check out their YouTube channel TAMS UHAM.

Dr Qinbing Fu Completes 12 Month Secondment in China

Dr Qinbing Fu received his PhD from the University of Lincoln in October 2018. Following a secondment under the STEP2DYNA project, Dr Fu carried out a further secondment under the ULTRACEPT project from August 2019 to August 2020 at partner Guangzhou University. Here he undertook research contributing to work packages 1 and 4. Dr Fu then went on to work as a postdoctoral researcher with Professor Shigang Yue until January 2021. Dr Fu’s ULTRACEPT contributions have involved directing the research into computational modelling of motion vision neural systems and applications in robotics. His research achievements and outputs for this project thus far are outlined in this blog post.

Research Outcomes

In support of the ULTRACEPT project, Dr Fu has published seven research papers, comprising five journal papers and two conference papers. He was first author on five of the publications and co-authored the other two. His main achievements have included:

  • Modelling of LGMD-1 and LGMD-2 collision perception neural network models, with applications in robot and vehicle scenarios;
  • Modelling of the Drosophila motion vision neural system for decoding the direction of a foreground translating object against a moving cluttered background;
  • A review of the related field of research;
  • Integration of multiple neural system models for collision sensing.
Dr Qinbing Fu's ULTRACEPT research activities. Using the Colias robots for modelling work on collision detection visual systems.
Using the Colias robots for modelling work on collision detection visual systems.

Dr Fu’s research outputs can be found on his personal web pages on Google Scholar and ResearchGate. In addition, Qinbing directed promising research on building visually dynamic walls in an arena to test the on-board visual system. These research ideas have been collated and summarised in his research papers.

Dr Fu’s research contributions have fully supported ULTRACEPT’s WP1 and WP4. This includes modelling work on collision detection visual systems with systematic experiments on vehicle scenarios and also the integration of multiple neural system models for motion perception.

Using the Colias robots for modelling work on collision detection visual systems.
Using the Colias robots for modelling work on collision detection visual systems.

Secondment at Guangzhou University, China

Dr Fu carried out his ULTRACEPT secondment at project partner GZHU in China, where he worked with Professor Jigen Peng. During this period he matured as a researcher in the academic community, developing his capability in several respects: pursuing progressive research ideas, collaborating with group members to complete research papers, coordinating teamwork, disseminating the project, communicating with global partners, and writing project proposals. Undoubtedly, the secondment has been a great success for Dr Fu.

Dissemination Activities

Dr Fu has undertaken a number of dissemination activities to promote the ULTRACEPT research outcomes. On the 28th July 2020, he presented his research at the ULTRACEPT online Workshop 2 on the topic of “Adaptive Inhibition Matters to Robust Collision Perception in Highly Variable Environments”. At this event, he exchanged ideas with project partners.

Qinbing Fu presents at ULTRACEPT Workshop 2
Dr Fu presents at ULTRACEPT Workshop 2

Dr Fu also facilitated an ULTRACEPT online Sandpit Session on 27 November 2020, where he gave a talk on “Past, Present, and Future Modelling on Bio-Inspired Collision Detection Visual Systems: Towards the Robust Perception”.

Dr Fu presents at ULTRACEPT's First Sand Pit Session
Dr Fu presents at ULTRACEPT’s First Sand Pit Session

On 18th December 2020, Dr Fu attended the 2020 IEEE International Conference on Advanced Robotics and Mechatronics (ICARM) held in Shenzhen, China, where he presented his research paper entitled “Complementary visual neuronal systems model for collision sensing”. He also chaired a session on “Biomimetics” during the conference.

Qinbing Fu presents his research at the IEEE ARM 2020 Conference
Dr Qinbing Fu presents his research at the IEEE ARM 2020 Conference

Tian Liu Completes 12 Month Secondment at Guangzhou University

Tian Liu enrolled as a PhD scholar at the University of Lincoln in 2018. In 2018-2019 he visited Guangzhou University as part of the STEP2DYNA project, funded by the European Union’s Horizon 2020 Research and Innovation Programme under the Marie Skłodowska-Curie grant agreement. During this secondment Tian developed the ColCOSΦ experiment platform for social insect and swarm robotics research, investigating how multiple virtual pheromones affect swarm robots. More recently, Tian completed a 12 month secondment under ULTRACEPT at Guangzhou University.

Tian Liu recently completed his second 12 month secondment at project partner Guangzhou University in China as part of the ULTRACEPT project, funded by the European Union’s Horizon 2020 Research and Innovation Programme under the Marie Skłodowska-Curie grant agreement. Tian visited Guangzhou from November 2019 to November 2020 and was involved in Work Packages 1 and 4.

Tian reflects on what he has achieved during his time in Guangzhou

Most social insects, such as ants, have only a tiny brain, yet they can complete very difficult and complex tasks through the cooperation of large numbers of individuals. Examples include building a large nest or collecting food along rugged routes. They are able to do this because pheromones act as an important communication medium.

During this 12 month secondment, I continued to focus my attention on swarm robots with multiple pheromones. I believe that it is the interaction of multiple pheromones that enables insects to perform such demanding tasks, rather than the single-pheromone mechanism that is now so widely studied. I worked with ULTRACEPT researcher Xuelong Sun and Dr Cheng Hu to develop the ColCOSΦ, which can easily support multiple-pheromone experiments. We demonstrated and evaluated the effects of multiple pheromones in swarm robotics through several case studies simulating ant foraging, hunting, and deployment tasks.
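To illustrate the underlying mechanism, the sketch below shows how a multi-pheromone arena of this general kind can be simulated: a grid field per pheromone type, with deposition by robots, diffusion to neighbouring cells, and gradual evaporation. This is a minimal illustration of the concept, not the actual ColCOSΦ code; the class name, parameters, and grid scheme are all my own assumptions.

```python
import numpy as np

class PheromoneField:
    """A discrete virtual-pheromone field supporting multiple pheromone types."""

    def __init__(self, width, height, n_types, evaporation=0.01, diffusion=0.1):
        # One 2-D concentration grid per pheromone type.
        self.grid = np.zeros((n_types, height, width))
        self.evaporation = evaporation
        self.diffusion = diffusion

    def deposit(self, ptype, x, y, amount):
        """A robot at cell (x, y) releases `amount` of pheromone `ptype`."""
        self.grid[ptype, y, x] += amount

    def step(self):
        """Advance one time step: diffuse to 4-neighbours, then evaporate."""
        g = self.grid
        neighbours = (np.roll(g, 1, axis=1) + np.roll(g, -1, axis=1) +
                      np.roll(g, 1, axis=2) + np.roll(g, -1, axis=2))
        g += self.diffusion * (neighbours / 4.0 - g)
        g *= (1.0 - self.evaporation)

    def sense(self, ptype, x, y):
        """Concentration a robot would read at its current cell."""
        return self.grid[ptype, y, x]

# Usage: deposit a "trail" pheromone, then let it diffuse and decay.
field = PheromoneField(width=32, height=32, n_types=2)
field.deposit(0, 16, 16, amount=1.0)
for _ in range(10):
    field.step()
# The concentration at the source has decayed below the deposited amount,
# while neighbouring cells now hold a small, sensible gradient.
```

With several pheromone types evolving at different evaporation and diffusion rates, robots following the local gradients can exhibit the kind of foraging, hunting, and deployment behaviours the case studies examined.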

I showcased the outcomes of this research at both the ICARM 2019 and ICARM 2020 international conferences.


Tian Liu presenting ICARM 2020
Tian Liu presenting ICARM 2020

Due to its excellent scalability, we have also used the platform for research experiments in related fields. For example, it can simulate traffic scenarios, allowing us to test our LGMD model (a collision detection model) on the Colias micro robot at low cost.
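For readers unfamiliar with LGMD-type models, the sketch below shows the general stages such looming detectors share: a photoreceptor layer (frame differencing), a lateral-inhibition layer, an excitation/inhibition summation, and a thresholded membrane potential that rises for expanding edges. This is my own simplified illustration, not the project's published model; all function names and parameter values are assumptions.

```python
import numpy as np

def box_blur(img):
    """3x3 box average with edge padding: a crude lateral-inhibition spread."""
    p = np.pad(img, 1, mode="edge")
    return (p[:-2, :-2] + p[:-2, 1:-1] + p[:-2, 2:] +
            p[1:-1, :-2] + p[1:-1, 1:-1] + p[1:-1, 2:] +
            p[2:, :-2] + p[2:, 1:-1] + p[2:, 2:]) / 9.0

def lgmd_response(prev_frame, curr_frame, inhibition_weight=0.6, threshold=0.01):
    """One time step of a simplified LGMD-like looming detector."""
    # Photoreceptor layer: absolute luminance change between frames.
    excitation = np.abs(curr_frame.astype(float) - prev_frame.astype(float))
    # Inhibition layer: spatially spread copy of the excitation.
    inhibition = box_blur(excitation)
    # Summation layer: excitation minus weighted inhibition, rectified.
    s = np.maximum(excitation - inhibition_weight * inhibition, 0.0)
    # Membrane potential: grows with the total amount of expanding edge;
    # a spike is emitted when it crosses the threshold.
    potential = 1.0 - np.exp(-s.sum() / s.size)
    return potential, potential > threshold

# Usage: a dark square that expands between frames (looming) triggers a spike,
# while an unchanged scene does not.
prev = np.zeros((20, 20)); prev[8:12, 8:12] = 1.0   # small object
curr = np.zeros((20, 20)); curr[6:14, 6:14] = 1.0   # the object has expanded
potential, spike = lgmd_response(prev, curr)
```

On a robot such as Colias, running a detector of this shape frame-by-frame over the camera stream is what allows collision threats in a simulated traffic scene to be flagged in real time.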


Besides olfactory cues, visual information is also a very important input for insects, so we implemented a changeable visual environment on the ColCOSΦ to investigate how to make full use of both olfactory and visual information in a swarm task. The research was collated into two articles which were submitted to ICRA 2021 with fellow ULTRACEPT researchers Xuelong Sun, Dr Qinbing Fu and Dr Cheng Hu.


The secondment has been an excellent experience and provided me with the opportunity to collaborate closely with my project colleagues.

Many thanks to the ULTRACEPT project for supporting my research and allowing me to work with these outstanding research scholars.