Category Archives: DISSEMINATION

Nikolas Andreakos Presents Paper at the 30th Annual Computational Neuroscience Meeting (CNS*2021)

Nikolas Andreakos is a PhD candidate at the University of Lincoln who is working on developing computational models of associative memory formation and recognition in the mammalian hippocampus.

Nikolas recently attended the 30th Annual Computational Neuroscience Meeting (CNS*2021). Due to the current travel restrictions, this year’s conference was held online from 3rd to 7th July 2021.


The purpose of the Organization for Computational Neurosciences is to create a scientific and educational forum for students, scientists, other professionals, and the general public to learn about, to share, contribute to, and advance the state of knowledge in computational neuroscience.

Computational neuroscience combines mathematical analyses and computer simulations with experimental neuroscience, to develop a principled understanding of the workings of nervous systems and apply it in a wide range of technologies.

The Organization for Computational Neurosciences promotes meetings and courses in computational neuroscience and organizes the Annual CNS Meeting which serves as a forum for young scientists to present their work and to interact with senior leaders in the field.

Poster Presentation

Nikolas presented his research ‘Modelling the effects of perforant path in the recall performance of a CA1 microcircuit with excitatory and inhibitory neurons’.

Nikolas Andreakos CNS 2021 poster

Abstract

From recollecting childhood memories to recalling whether we turned off the oven before leaving the house, memory defines who we are. Losing it can be very harmful to our survival. Recently we quantitatively investigated the biophysical mechanisms leading to memory recall improvement in a computational CA1 microcircuit model of the hippocampus [1]. In the present study, we investigated the synergistic effects of the EC excitatory input (sensory input) and the CA3 excitatory input (contextual information) on the recall performance of the CA1 microcircuit. Our results showed that when the EC input was exactly the same as the CA3 input, the recall performance of our model was strengthened. When the two inputs were dissimilar (degree of similarity: 40% – 0%), the recall performance was reduced. These results were positively correlated with how many “active cells” represented a memory pattern. When the number of active cells increased and the degree of similarity between the two inputs decreased, the recall performance of the model was reduced. The latter finding confirms our previous results, where the number of cells coding a piece of information plays a significant role in the recall performance of our model.
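To make the two measures mentioned in the abstract concrete, the short sketch below (plain Python/NumPy; the pattern sizes, values, and function names are invented for illustration and are not taken from the model) shows one simple way to compute a degree of similarity between an EC and a CA3 input pattern and a correlation-based recall quality score.

```python
import numpy as np

rng = np.random.default_rng(0)

def degree_of_similarity(a, b):
    """Overlap between two binary patterns (shared active cells / union)."""
    return np.sum(a & b) / max(np.sum(a | b), 1)

def recall_quality(recalled, stored):
    """Pearson correlation between a recalled and a stored binary pattern."""
    return np.corrcoef(recalled, stored)[0, 1]

n_cells, n_active = 100, 20                     # network size and "active cells" per pattern
stored = np.zeros(n_cells, dtype=int)
stored[rng.choice(n_cells, n_active, replace=False)] = 1

ca3_input = stored.copy()                       # contextual input matches the stored pattern
ec_input = stored.copy()                        # sensory input starts identical...
drop = rng.choice(np.flatnonzero(stored), n_active // 2, replace=False)
ec_input[drop] = 0                              # ...then half of its active cells are removed

print("EC/CA3 degree of similarity:", degree_of_similarity(ec_input, ca3_input))
print("recall quality for a perfect recall:", recall_quality(stored, stored))
```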

References
1. Andreakos, N., Yue, S. & Cutsuridis, V. Quantitative investigation of memory recall performance of a computational microcircuit model of the hippocampus. Brain Inf 8, 9 (2021). https://doi.org/10.1186/s40708-021-00131-7

Nikolas Andreakos CNS 2021 poster presentation

ULTRACEPT Researchers Present at IEEE ICRA 2021

The 2021 International Conference on Robotics and Automation (IEEE ICRA 2021) was held in Xi’an, China from 31st May to 4th June 2021. As one of the premier conferences in the field of robotics and automation, the event gathered thousands of excellent researchers from all over the world. Due to the pandemic, the conference was held in a hybrid format, with both physical on-site and virtual cloud meetings. Four ULTRACEPT researchers attended this event, three in person and one online.

Proactive Action Visual Residual Reinforcement Learning for Contact-Rich Tasks Using a Torque-Controlled Robot

Yunlei Shi: Proactive Action Visual Residual Reinforcement Learning for Contact-Rich Tasks Using a Torque-Controlled Robot

Agile Robots researcher Yunlei Shi attended ICRA 2021 online and presented his paper ‘Proactive Action Visual Residual Reinforcement Learning for Contact-Rich Tasks Using a Torque-Controlled Robot’.

Yunlei Shi is a full-time Ph.D. student at Universität Hamburg who also works at project partner Agile Robots, contributing to ULTRACEPT’s Work Package 4. In 2020 he visited Tsinghua University as part of the STEP2DYNA project.

Yunlei Shi presenting online at ICRA 2021

Yunlei presented his conference paper:

Yunlei Shi, Zhaopeng Chen, Hongxu Liu, Sebastian Riedel, Chunhui Gao, Qian Feng, Jun Deng, Jianwei Zhang, “Proactive Action Visual Residual Reinforcement Learning for Contact-Rich Tasks Using a Torque-Controlled Robot”, 2021 IEEE International Conference on Robotics and Automation (ICRA), Xi’an, China, 2021.

Abstract

Contact-rich manipulation tasks are commonly found in modern manufacturing settings. However, manually designing a robot controller is considered hard for traditional control methods as the controller requires an effective combination of modalities and vastly different characteristics. In this paper, we first consider incorporating operational space visual and haptic information into a reinforcement learning (RL) method to solve the target uncertainty problems in unstructured environments. Moreover, we propose a novel idea of introducing a proactive action to solve a partially observable Markov decision process (POMDP) problem. With these two ideas, our method can either adapt to reasonable variations in unstructured environments or improve the sample efficiency of policy learning. We evaluated our method on a task that involved inserting a random-access memory (RAM) using a torque-controlled robot and tested the success rates of different baselines used in the traditional methods. We proved that our method is robust and can tolerate environmental variations.
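As a rough, illustrative sketch of the residual idea described in the abstract (this is not the authors’ implementation; the controller gain, observation format, and function names are all invented), the commanded action is simply a hand-designed proactive base action plus a learned correction:

```python
import numpy as np

def base_controller(target_pose, current_pose, gain=0.5):
    """Hand-designed proactive action, e.g. a proportional move towards the target."""
    return gain * (np.asarray(target_pose) - np.asarray(current_pose))

def residual_policy(observation):
    """Stand-in for the learned RL policy, which would map visual and haptic
    features to a small corrective action; untrained, it returns no correction."""
    return np.zeros(3)

def act(observation, target_pose, current_pose):
    # Final command = proactive base action + learned residual correction.
    return base_controller(target_pose, current_pose) + residual_policy(observation)

obs = {"image": None, "wrench": np.zeros(6)}    # placeholder multimodal observation
print(act(obs, target_pose=[0.10, 0.00, 0.20], current_pose=[0.00, 0.00, 0.00]))
```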

Representation of policies and controller scheme. The blue region is the real-time controller, and the wheat region is the non-real-time trained policy.

More details about this paper can be viewed in this video on the Universität Hamburg’s Technical Aspects of Multimodal Systems (TAMS) YouTube channel.

Yunlei was very happy to attend this fantastic conference with support from the ULTRACEPT project.

A Versatile Vision-Pheromone-Communication Platform for Swarm Robotics

Three researchers from the University of Lincoln, Tian Liu, Xuelong Sun, and Qinbing Fu, attended ICRA 2021 in person to present their co-authored paper, ‘A Versatile Vision-Pheromone-Communication Platform for Swarm Robotics’.

ULTRACEPT researchers Tian Liu, Xuelong Sun and Qinbing Fu attending ICRA 2021

The three of us were very happy to attend this fantastic conference in person with support from the ULTRACEPT project.

Our co-authored paper presenting the vision-pheromone-communication platform we have developed was published in the conference proceedings. Tian Liu delivered the presentation outlining the platform, and it attracted attention from attendees through the interesting questions asked by the audience. The event gave us a great opportunity to raise the profile of our platform for future swarm robotics and social insect studies.

Tian Liu presenting at ICRA 2021

A Versatile Vision-Pheromone-Communication Platform for Swarm Robotics, Tian Liu, Xuelong Sun, Cheng Hu, Qinbing Fu, and Shigang Yue, University of Lincoln

Keywords: Biologically-Inspired Robots, Multi-Robot Systems, Swarm Robotics

Abstract: This paper describes a versatile platform for swarm robotics research. It integrates multiple pheromone communication with a dynamic visual scene along with real-time data transmission and localization of multiple robots. The platform has been built for inquiries into social insect behavior and bio-robotics. By introducing a new research scheme to coordinate olfactory and visual cues, it not only complements current swarm robotics platforms which focus only on pheromone communications by adding visual interaction but also may fill an important gap in closing the loop from bio-robotics to neuroscience. We have built a controllable dynamic visual environment based on our previously developed ColCOSPhi (a multi-pheromones platform) by enclosing the arena with LED panels and interacting with the micro mobile robots with a visual sensor. In addition, a wireless communication system has been developed to allow the transmission of real-time bi-directional data between multiple micro robot agents and a PC host. A case study combining concepts from the internet of vehicles (IoV) and insect-vision inspired model has been undertaken to verify the applicability of the presented platform and to investigate how complex scenarios can be facilitated by making use of this platform.

We picked up many interesting ideas and inspirations from colleagues in the robotics field, not only from the excellent talks but also from the high-quality robot exhibitions by well-known companies in the industry.

Conference presentations attended by the researchers at ICRA 2021
Demonstration at the ICRA 2021 conference

On the last day of the conference, we attended a wonderful tour of the Shaanxi History Museum and the Terra-Cotta Warriors, from which we learned a lot about the impressive history and culture of the Qin dynasty. It also made us rethink the important role played by science and technology in assisting archaeological excavation and cultural relic protection.

Thanks to the support of the ULTRACEPT project, we really enjoyed the whole event, which brought us not only new knowledge about robotics and history but also inspiration that will motivate our future research. In addition, our group’s research has been disseminated via this top international conference.

Qian Feng: Center-of-Mass-based Robust Grasp Planning for Unknown Objects Using Tactile-Visual Sensors

Qian Feng is an external PhD student at the Technical University of Munich who works at project partner Agile Robots and contributes to ULTRACEPT’s Work Package 4.

The IEEE International Conference on Robotics and Automation (ICRA) is an annual academic conference covering advances in robotics. It is one of the premier conferences in its field, with an ‘A’ rating from the Australian Ranking of ICT Conferences obtained in 2010 and an ‘A1’ rating from the Brazilian Ministry of Education in 2012.

Qian Feng attended the IEEE International Conference on Robotics and Automation (ICRA) 2020. The conference was originally scheduled to take place in Paris, France, but due to COVID-19, the conference was held virtually from 31 May 2020 until 31 August 2020.

Qian Feng presenting online at ICRA 2020

Qian presented his conference paper:

Q. Feng, Z. Chen, J. Deng, C. Gao, J. Zhang and A. Knoll, “Center-of-Mass-based Robust Grasp Planning for Unknown Objects Using Tactile-Visual Sensors,” 2020 IEEE International Conference on Robotics and Automation (ICRA), Paris, France, 2020, pp. 610-617, doi: 10.1109/ICRA40945.2020.9196815.

Abstract

An unstable grasp pose can lead to slip; thus, an unstable grasp pose can be predicted by slip detection. A re-grasp is required afterwards in order to correct the grasp pose and finish the task. In this work, we propose a novel re-grasp planner with multi-sensor modules to plan grasp adjustments with the feedback from a slip detector. Then a re-grasp planner is trained to estimate the location of the centre of mass, which helps robots find an optimal grasp pose. The dataset in this work consists of 1,025 slip experiments and 1,347 re-grasps collected by one pair of tactile sensors, an RGB-D camera, and one Franka Emika robot arm equipped with joint force/torque sensors. We show that our algorithm can successfully detect and classify the slip for 5 unknown test objects with an accuracy of 76.88% and that the re-grasp planner increases the grasp success rate by 31.0%, compared to the state-of-the-art vision-based grasping algorithm.
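The abstract describes a closed loop between slip detection and re-grasping. The sketch below is a purely illustrative outline of such a loop in Python; the robot, detector, and planner interfaces are hypothetical stand-ins, not the API used in the paper.

```python
def grasp_with_regrasp(robot, slip_detector, regrasp_planner, max_attempts=3):
    """Try to lift an object; if slip is detected, re-grasp closer to the
    estimated centre of mass and try again (illustrative outline only)."""
    grasp_pose = robot.initial_grasp_pose()
    for _ in range(max_attempts):
        robot.grasp(grasp_pose)
        robot.lift()
        tactile, vision = robot.read_sensors()
        if not slip_detector.slip_detected(tactile):
            return True                               # stable grasp, task finished
        # Slip observed: estimate the centre of mass and plan an adjusted grasp.
        com_estimate = regrasp_planner.estimate_com(tactile, vision)
        grasp_pose = regrasp_planner.adjust_pose(grasp_pose, com_estimate)
        robot.release()
    return False                                      # give up after max_attempts
```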

Qian Feng: Slip Detector
Qian Feng: Grasp Success Rate on Test Objects

When asked about his experience presenting and attending ICRA 2020, Qian said:

“Thanks to the virtual conference we were still able to present our work. It also meant that more people were able to join the conference to learn about and discuss our research. Everyone was able to access the presentation and get involved in the discussion in the virtual conference for 2 months, instead of the originally scheduled 5 minutes of discussion for the on-site conference. During this conference I shared my work with many researchers from the same field and exchanged ideas. I really enjoyed the conference and learnt a lot from the other attendees.”

UHAM Researchers Present at the International Conference on Intelligent Robots and Systems

Shuang Li is a fourth-year PhD student in Computer Science at Universität Hamburg. Her research interests are dexterous manipulation, vision-based teleoperation, and imitation learning in robotics. Shuang has been working on the Transregio SFB “Cross-modal learning” project and is involved in ULTRACEPT Work Package 4. She is also the course leader of ‘Introduction to Robotics’.

Hongzhuo Liang is a fifth-year PhD student in Computer Science at Universität Hamburg. His research interests are robotic grasping and manipulation based on multimodal perception. Hongzhuo has been working on the Transregio SFB “Cross-modal learning” project, the STEP2DYNA project (691154), and ULTRACEPT Work Package 4.

The IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) is one of the largest and most impactful robotics research conferences worldwide. Established in 1988 and held annually, IROS provides an international forum for the global robotics research community to explore the frontier of science and technology in intelligent robots and smart machines.

Researchers Shuang Li and Hongzhuo Liang from ULTRACEPT partner Universität Hamburg attended and presented at IROS 2020. In addition to technical sessions and multimedia presentations, the IROS conference also held panel discussions, forums, workshops, tutorials, exhibits, and technical tours to enrich the fruitful discussions among conference attendees.

Due to COVID-19, the conference was hosted online with free access to every Technical Talk, Plenary, and Keynote, and to over sixty Workshops, Tutorials, and Competitions. The content went online on 24th October 2020 and remained available until 24th January 2021.

A Mobile Robot Hand-Arm Teleoperation System by Vision and IMU

Shuang Li presenting ‘A Mobile Robot Hand-Arm Teleoperation System by Vision and IMU’

At IROS 2020, Shuang Li presented her conference paper:

S. Li et al., “A Mobile Robot Hand-Arm Teleoperation System by Vision and IMU,” 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Las Vegas, NV, USA, 2020, pp. 10900-10906, doi: 10.1109/IROS45743.2020.9340738.

Video footage of Shuang’s work can be viewed on the UHAM Technical Aspects of Multimodal Systems (TAMS) YouTube channel.

Abstract

In this paper, we present a multimodal mobile teleoperation system that consists of a novel vision-based hand pose regression network (Transteleop) and an IMU (inertial measurement units) based arm tracking method. Transteleop observes the human hand through a low-cost depth camera and generates not only joint angles but also depth images of paired robot hand poses through an image-to-image translation process. A key-point based reconstruction loss explores the resemblance in appearance and anatomy between human and robotic hands and enriches the local features of reconstructed images. A wearable camera holder enables simultaneous hand-arm control and facilitates the mobility of the whole teleoperation system. Network evaluation results on a test dataset and a variety of complex manipulation tasks that go beyond simple pick-and-place operations show the efficiency and stability of our multimodal teleoperation system.
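As an illustration of how a key-point term can be combined with an image-to-image reconstruction loss (a minimal PyTorch sketch with invented weights and tensor shapes, not the Transteleop training code):

```python
import torch
import torch.nn.functional as F

def teleop_loss(recon_depth, target_depth, pred_keypoints, target_keypoints, kp_weight=10.0):
    """Illustrative combination of a whole-image reconstruction loss with a
    key-point term that emphasises anatomically meaningful locations."""
    pixel_loss = F.l1_loss(recon_depth, target_depth)             # overall appearance
    keypoint_loss = F.mse_loss(pred_keypoints, target_keypoints)  # fingertip/joint positions
    return pixel_loss + kp_weight * keypoint_loss

# Toy tensors standing in for a batch of generated robot-hand depth images
recon = torch.rand(4, 1, 64, 64)
target = torch.rand(4, 1, 64, 64)
pred_kp = torch.rand(4, 21, 3)     # e.g. 21 hand key-points in 3-D
gt_kp = torch.rand(4, 21, 3)
print(teleop_loss(recon, target, pred_kp, gt_kp))
```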

Further information about this paper, including links to the code can be found here.

Robust Robotic Pouring using Audition and Haptics

Hongzhuo Liang presenting ‘Robust Robotic Pouring using Audition and Haptics’

At IROS 2020, Hongzhuo Liang presented his conference paper:

H. Liang et al., “Robust Robotic Pouring using Audition and Haptics,” 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Las Vegas, NV, USA, 2020, pp. 10880-10887, doi: 10.1109/IROS45743.2020.9340859.

Video footage of Hongzhuo’s work can be viewed on the UHAM Technical Aspects of Multimodal Systems (TAMS) YouTube channel.

Abstract

Robust and accurate estimation of liquid height lies as an essential part of pouring tasks for service robots. However, vision-based methods often fail in occluded conditions, while audio-based methods cannot work well in a noisy environment. We instead propose a multimodal pouring network (MP-Net) that is able to robustly predict liquid height by conditioning on both audition and haptics input. MP-Net is trained on a self-collected multimodal pouring dataset. This dataset contains 300 robot pouring recordings with audio and force/torque measurements for three types of target containers. We also augment the audio data by inserting robot noise. We evaluated MP-Net on our collected dataset and a wide variety of robot experiments. Both network training results and robot experiments demonstrate that MP-Net is robust against noise and changes to the task and environment. Moreover, we further combine the predicted height and force data to estimate the shape of the target container.
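The sketch below illustrates the general pattern of conditioning a height prediction on both audio and force/torque input (a toy PyTorch model with arbitrary layer sizes; it is not MP-Net itself):

```python
import torch
import torch.nn as nn

class PouringHeightNet(nn.Module):
    """Toy multimodal regressor: fuse an audio spectrogram with force/torque
    readings to predict liquid height. Layer sizes are arbitrary placeholders."""
    def __init__(self):
        super().__init__()
        self.audio_net = nn.Sequential(            # audio spectrogram branch
            nn.Conv2d(1, 8, 3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten())
        self.haptic_net = nn.Sequential(           # force/torque branch (6-D wrench)
            nn.Linear(6, 16), nn.ReLU())
        self.head = nn.Linear(8 + 16, 1)           # fused features -> height

    def forward(self, spectrogram, wrench):
        fused = torch.cat([self.audio_net(spectrogram), self.haptic_net(wrench)], dim=1)
        return self.head(fused)

net = PouringHeightNet()
print(net(torch.rand(2, 1, 64, 40), torch.rand(2, 6)).shape)  # -> torch.Size([2, 1])
```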

Further information about this paper, including links to the code can be found here.

Yannick Jonetzko Presents a Paper at the International Conference on Cognitive Systems and Information Processing 2020 (ICCSIP)

Yannick Jonetzko is a PhD candidate at Universität Hamburg working on the use of tactile sensors in multimodal environments. In 2018 he visited Tsinghua University as part of the STEP2DYNA project, and he is now involved in the ULTRACEPT project, contributing to Work Package 4.

The International Conference on Cognitive Systems and Information Processing 2020 (ICCSIP) took place from 25th to 27th December 2020 and was attended by ULTRACEPT researcher Yannick Jonetzko from project partner Universität Hamburg. Due to the current travel restrictions, the conference was held online and Yannick’s work was presented via a pre-recorded video.

In the past few years, ICCSIP has matured into a well-established series of international conferences on cognitive information processing and related fields around the world. At the 2020 conference, over 60 researchers presented their work in multiple sessions on algorithms, applications, vision, manipulation, bioinformatics, and autonomous vehicles.

Yannick presented his conference paper ‘Multimodal Object Analysis with Auditory and Tactile Sensing using Recurrent Neural Networks’.

Abstract

Robots are usually equipped with many different sensors that need to be integrated. While most research is focused on the integration of vision with other senses, we successfully integrate tactile and auditory sensor data from a complex robotic system. Herein, we train and evaluate a neural network for the classification of the content of eight optically identical medicine containers. To investigate the relevance of the tactile modality in classification under realistic conditions, we apply different noise levels to the audio data. Our results show significantly higher robustness to acoustic noise with the combined multimodal network than with the unimodal audio-based counterpart.
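For readers curious about how audio and tactile streams can be fused in a recurrent classifier, here is a minimal PyTorch sketch; the feature dimensions, layer sizes, and fusion scheme are invented for illustration (only the eight-class setup comes from the abstract), and it is not the network used in the paper.

```python
import torch
import torch.nn as nn

class AudioTactileClassifier(nn.Module):
    """Toy multimodal recurrent classifier: two GRUs encode the audio and tactile
    sequences separately; their final states are concatenated and classified
    into one of 8 container contents. All dimensions are illustrative."""
    def __init__(self, audio_dim=40, tactile_dim=16, hidden=32, n_classes=8):
        super().__init__()
        self.audio_rnn = nn.GRU(audio_dim, hidden, batch_first=True)
        self.tactile_rnn = nn.GRU(tactile_dim, hidden, batch_first=True)
        self.classifier = nn.Linear(2 * hidden, n_classes)

    def forward(self, audio_seq, tactile_seq):
        _, h_audio = self.audio_rnn(audio_seq)       # final hidden state per modality
        _, h_tactile = self.tactile_rnn(tactile_seq)
        fused = torch.cat([h_audio[-1], h_tactile[-1]], dim=1)
        return self.classifier(fused)                # class logits

model = AudioTactileClassifier()
logits = model(torch.rand(4, 100, 40), torch.rand(4, 100, 16))
print(logits.shape)  # torch.Size([4, 8])
```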

ULTRACEPT Researchers Attend IEEE ICARM 2020 Conference

The IEEE International Conference on Advanced Robotics and Mechatronics (ICARM) 2020 was held in Shenzhen, China, and attended by three University of Lincoln (UoL) ULTRACEPT researchers: Dr Qinbing Fu, Xuelong Sun, and Tian Liu. These researchers completed 12-month ULTRACEPT secondments with project partner Guangzhou University (GZHU) in China.

L to R: Tian Liu, Qinbing Fu, and Xuelong Sun attend IEEE ARM 2020 Conference

The IEEE ARM conference took place from 18th to 21st December 2020. It is the flagship conference on bio-mechatronics and bio-robotics systems as well as neuro-robotics systems, and it provides an international forum for researchers, educators, and engineers in the general areas of mechatronics, robotics, automation, and sensors to disseminate their latest research results and exchange views on the future research directions of these fields.

The UoL researchers attended to promote publications produced as part of the European Union’s Horizon 2020 research and innovation programme under the Marie Sklodowska-Curie grant agreements STEP2DYNA (691154) and ULTRACEPT (778062).

Dr Qinbing Fu: Complementary Visual Neuronal Systems Model for Collision Sensing

Dr Qinbing Fu presented his research paper entitled “Complementary Visual Neuronal Systems Model for Collision Sensing”, which is included in the conference proceedings, on the Monday morning. Dr Fu also chaired the MoSHT3 Regular Session, on the topic of Biomimetics.

Qinbing Fu presents his research at the IEEE ARM 2020 Conference

Q. Fu and S. Yue, “Complementary Visual Neuronal Systems Model for Collision Sensing,” 2020 5th International Conference on Advanced Robotics and Mechatronics (ICARM), Shenzhen, China, 2020, pp. 609-615, doi: 10.1109/ICARM49381.2020.9195303.

Abstract – Inspired by insects’ visual brains, this paper presents original modelling of a complementary visual neuronal systems model for real-time and robust collision sensing. Two categories of wide-field motion sensitive neurons, i.e., the lobula giant movement detectors (LGMDs) in locusts and the lobula plate tangential cells (LPTCs) in flies, have been studied intensively. The LGMDs have specific selectivity to approaching objects in depth that threaten collision; whilst the LPTCs are only sensitive to translating objects in horizontal and vertical directions. Though each has been modelled and applied in various visual scenes including robot scenarios, little has been done on investigating their complementary functionality and selectivity when functioning together. To fill this vacancy, we introduce a hybrid model combining two LGMDs (LGMD-1 and LGMD-2) with horizontally (rightward and leftward) sensitive LPTCs (LPTC-R and LPTC-L) specialising in fast collision perception. With coordination and competition between different activated neurons, the proximity feature by frontal approaching stimuli can be largely sharpened up by suppressing translating and receding motions. The proposed method has been implemented in ground micro-mobile robots as embedded systems. The multi-robot experiments have demonstrated the effectiveness and robustness of the proposed model for frontal collision sensing, which outperforms previous single-type neuron computation methods against translating interference.
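A conceptual toy example of the complementary idea, looming-sensitive responses suppressed by translation-sensitive ones, is sketched below in Python; the numbers and the fusion rule are invented for illustration and are not the model’s actual equations.

```python
def collision_alert(lgmd1, lgmd2, lptc_r, lptc_l, w_translation=1.0, threshold=0.5):
    """Conceptual fusion only: looming-sensitive LGMD responses are suppressed
    when translation-sensitive LPTC responses dominate, so that sideways-moving
    objects do not trigger a collision alert."""
    looming = max(lgmd1, lgmd2)                 # strongest approach signal
    translation = abs(lptc_r - lptc_l)          # directional translating motion
    score = looming - w_translation * translation
    return score > threshold, score

# Frontal approach: strong LGMD responses, balanced LPTCs -> alert
print(collision_alert(lgmd1=0.9, lgmd2=0.8, lptc_r=0.2, lptc_l=0.2))
# Translating object: LGMDs partly excited but LPTC-R dominates -> suppressed
print(collision_alert(lgmd1=0.6, lgmd2=0.5, lptc_r=0.9, lptc_l=0.1))
```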

Dr Qinbing Fu presents his research at the IEEE ARM 2020 Conference

When asked about the conference experience, Dr Fu said:

2020 has been a very tough year for everyone around the world. The pandemic has absolutely affected people’s lives. As an academic researcher, it has become more difficult to exchange ideas closely with other colleagues. Almost all academic conferences across every discipline have moved to online presenting. This has made it challenging to disseminate research and exchange ideas.

China was suffering from the pandemic in early 2020. However, due to its successful control of COVID-19, after June 2020 most parts of life, including work, had returned to normal. As a result, the conference was successfully held in person as originally planned, although international guests were not able to attend due to travel restrictions.

The conference attendees appreciated how well the conference was organised in Shenzhen. Personally, I very much enjoyed attending this conference. Due to travel restrictions, it was not a large conference, but every detail was considered and arranged properly. There were many enjoyable moments and I learnt a lot. The plenary presentations were of very high quality. Another special, memorable experience for me was the opportunity to chair a session for the first time during the conference. It was awesome!

Xuelong Sun: Fabrication and Mechanical Analysis of Bioinspired Gliding-optimized Wing Prototypes for Micro Aerial Vehicles

Xuelong Sun presented the co-authored paper ‘Fabrication and Mechanical Analysis of Bioinspired Gliding-optimized Wing Prototypes for Micro Aerial Vehicles’ because the lead author, Hamid Isakhani, was unable to attend due to travel restrictions. Their paper was selected as a Best Conference Paper finalist.

Best conference paper finalist at the IEEE ARM 2020 Conference

H. Isakhani, S. Yue, C. Xiong, W. Chen, X. Sun and T. Liu, “Fabrication and Mechanical Analysis of Bioinspired Gliding-optimized Wing Prototypes for Micro Aerial Vehicles,” 2020 5th International Conference on Advanced Robotics and Mechatronics (ICARM), Shenzhen, China, 2020, pp. 602-608, doi: 10.1109/ICARM49381.2020.9195392.

Abstract – Gliding is the most efficient flight mode that is explicitly appreciated by natural fliers. This is achieved by high-performance structures developed over millions of years of evolution. One such prehistoric insect, locust (Schistocerca gregaria) is a perfect example of a natural glider capable of endured transatlantic flights, which could potentially inspire numerous solutions to the problems in aerospace engineering. However, biomimicry of such aerodynamic properties is hindered by the limitations of conventional as well as modern fabrication technologies in terms of precision and availability, respectively. Therefore, we explore and propose novel combinations of economical manufacturing methods to develop various locust-inspired tandem wing prototypes (i.e. fore and hindwings), for further wind tunnel based aerodynamic studies. Additionally, we determine the flexural stiffness and maximum deformation rate of our prototypes and compare it to their counterparts in nature and literature, recommending the most suitable artificial bioinspired wing for gliding micro aerial vehicle applications.
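For context on the flexural stiffness mentioned above, it is often estimated from a cantilever bending test using standard beam theory; the tiny calculation below uses made-up numbers and a tip point-load assumption that may differ from the exact protocol used in the paper.

```python
# Illustrative only: estimate flexural stiffness EI from a cantilever bending test
# with a point load at the tip, using the standard relation delta = F * L**3 / (3 * EI).
force = 0.02        # applied tip load in newtons (made-up value)
length = 0.05       # cantilevered wing span in metres (made-up value)
deflection = 0.004  # measured tip deflection in metres (made-up value)

flexural_stiffness = force * length**3 / (3 * deflection)  # EI, in N*m^2
print(f"EI = {flexural_stiffness:.2e} N*m^2")
```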

Xuelong Sun presents his research at the IEEE ARM 2020 Conference

When asked about the conference, Xuelong said:

This has been a fantastic conference, even as we get through this special year. The keynote speakers delivered very impressive talks concerning control systems, AI, and robotics, which offered great food for thought. I was very pleased that our paper was shortlisted for the Best Student Paper Award.

In my presentation, I reported on the work we have completed on manufacturing bio-inspired wings that mimic the locust for future flying robots. We emphasised that the methods applied are affordable, and that the manufactured wings feature high flexibility and rigidity. Although we didn’t win the award, being finalists has encouraged us to keep moving forward with our future research.

Tian Liu: Investigating Multiple Pheromones in Swarm Robots – A Case Study of Multi-Robot Deployment

Tian Liu presented his paper ‘Investigating Multiple Pheromones in Swarm Robots – A Case Study of Multi-Robot Deployment’.

T. Liu, X. Sun, C. Hu, Q. Fu, H. Isakhani and S. Yue, “Investigating Multiple Pheromones in Swarm Robots – A Case Study of Multi-Robot Deployment,” 2020 5th International Conference on Advanced Robotics and Mechatronics (ICARM), Shenzhen, China, 2020, pp. 595-601, doi: 10.1109/ICARM49381.2020.9195311.

Abstract – Social insects are known as the experts in handling complex tasks in a collective smart way although their small brains contain only limited computation resources and sensory information. It is believed that pheromones play a vital role in shaping social insects’ collective behaviours. One of the key points underlying the stigmergy is the combination of different pheromones in a specific task. In the swarm intelligence field, pheromone-inspired studies usually focus on a single pheromone at a time, so it is not clear how effectively multiple pheromones could be employed for a collective strategy in the real physical world. In this study, we investigate a multiple-pheromone-based deployment strategy for swarm robots inspired by social insects. The proposed deployment strategy uses two kinds of artificial pheromones, the attractive and the repellent pheromone, which enable micro robots to be distributed in desired positions with high efficiency. The strategy is assessed systematically by both simulation and real robot experiments using the novel artificial pheromone platform ColCOSΦ. Results from the simulation and real robot experiments both demonstrate the effectiveness of the proposed strategy and reveal the role of multiple pheromones. The feasibility of the ColCOSΦ platform and its potential for further robotic research on multiple pheromones are also verified. Our study of using different pheromones for one collective swarm robotics task may help or inspire biologists in real insects’ research.
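To illustrate the attractive/repellent mechanism described in the abstract, here is a toy grid-world sketch in Python; the field shapes, deposit amounts, and update rule are invented for illustration and do not reproduce the ColCOSΦ experiments.

```python
import numpy as np

size = 20
targets = [(5, 5), (14, 14)]                     # two desired deployment sites

# Attractive pheromone: stronger near the targets (a simple diffusion stand-in).
xs, ys = np.meshgrid(np.arange(size), np.arange(size), indexing="ij")
attractive = sum(10.0 / (1.0 + np.hypot(xs - tx, ys - ty)) for tx, ty in targets)
repellent = np.zeros((size, size))               # deposited by robots as they settle

def step(pos, deposit=2.0):
    """Move one cell toward the highest (attractive - repellent) neighbour and
    deposit repellent pheromone so later robots spread to other sites."""
    x, y = pos
    best, best_score = pos, -np.inf
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            nx, ny = x + dx, y + dy
            if 0 <= nx < size and 0 <= ny < size:
                score = attractive[nx, ny] - repellent[nx, ny]
                if score > best_score:
                    best, best_score = (nx, ny), score
    repellent[best] += deposit
    return best

pos = (0, 0)
for _ in range(40):
    pos = step(pos)
print("robot settled near:", pos)
```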

Tian Liu presents his research at the IEEE ARM 2020 Conference

When asked about the conference, Tian Liu said:

During the COVID-19 pandemic, this conference was a rare opportunity to listen in person to keynote presentations about control, artificial intelligence, and bio-inspiration. I also presented my own research on multiple pheromones and the ColCOSΦ experimental system, and had friendly exchanges with scholars in related fields. I believe this conference has enabled more people to learn about our research progress and results.

Siavash Bahrami presents at ICIMT 2020

Siavash Bahrami is a PhD candidate at Universiti Putra Malaysia (UPM) working on multimodal deep neural networks that use acoustic and visual data to develop an active road safety system for autonomous and semi-autonomous vehicles. As part of his work on ULTRACEPT Work Package 2, Siavash recently completed a 6-month secondment at the University of Lincoln and another at Visomorphic LTD.

Siavash presenting at ICIMT 2020

Siavash recently presented a paper titled “Acoustic Feature Analysis for Wet and Dry Road Surface Classification Using Two-stream CNN” at the 12th International Conference on Information and Multimedia Technology (ICIMT 2020). The data used for training and testing the proposed CNN architectures were collected during Siavash’s secondment in the UK. Despite the strains caused by the global pandemic, Siavash managed to complete his secondment and collect all the data needed for his PhD thesis and the ULTRACEPT project, with the help of UoL and UPM project members.

Siavash presenting at ICIMT 2020

ICIMT 2020 was scheduled to take place between 11th and 13th December 2020 in Zhuhai, China, but due to the pandemic it was instead held as a virtual conference. The aim of ICIMT is to provide a platform for researchers, engineers, academics and industry professionals from all over the world to present their research results and development activities in information and multimedia technology. The conference gives delegates the opportunity to exchange new ideas and applications, to establish business or research relations, and to find global partners for future collaboration.

Professor Shigang Yue (ULTRACEPT Project Coordinator) and Associate Professor Dr. Shyamala Doraisamy (ULTRACEPT Project partner Lead for UPM) each chaired one of the conference sessions.

Siavash presenting at ICIMT 2020

Acoustic Feature Analysis for Wet and Dry Road Surface Classification Using Two-stream CNN – Abstract

Road surface wetness affects road safety and is one of the main reasons for weather-related accidents. The study of road surface classification is not only vital for future driverless vehicles but also important to the development of current vehicle active safety systems. In recent years, studies on road surface wetness classification using acoustic signals have been on the rise. Detection of road surface wetness from acoustic signals involves analysing signal changes over time and in the frequency domain caused by the interaction of the tyre and the wet road surface, in order to determine suitable features. In this paper, two single-stream CNN architectures have been investigated. The first architecture uses MFCCs and the other uses temporal and spectral features as the input for road surface wetness detection. A two-stream CNN architecture that merges the MFCC and spectral feature sets by concatenating the outputs of the two streams is proposed for further improving the classification performance of road surface wetness detection. Acoustic signals of wet and dry road surface conditions were recorded with two microphones instrumented on two different cars in a controlled environment. Experimentation and comparative performance evaluations against the single-stream architectures and the two-stream architecture were performed. Results show that the accuracy of the proposed two-stream CNN architecture is significantly higher than that of the single-stream CNNs for road surface wetness detection.
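As a rough illustration of the two-stream layout described above (a toy PyTorch model; the layer sizes, feature dimensions, and stream contents are invented and much simpler than the architectures evaluated in the paper):

```python
import torch
import torch.nn as nn

class TwoStreamWetDryCNN(nn.Module):
    """Toy two-stream classifier: one CNN stream over MFCCs, one over spectral
    features; the flattened stream outputs are concatenated before the wet/dry
    decision. All sizes are illustrative, not the paper's."""
    def __init__(self, n_classes=2):
        super().__init__()
        self.mfcc_stream = nn.Sequential(
            nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4), nn.Flatten())        # -> 8*4*4 = 128 features
        self.spectral_stream = nn.Sequential(
            nn.Conv1d(1, 8, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool1d(4), nn.Flatten())        # -> 8*4 = 32 features
        self.classifier = nn.Linear(128 + 32, n_classes)

    def forward(self, mfcc, spectral):
        fused = torch.cat([self.mfcc_stream(mfcc), self.spectral_stream(spectral)], dim=1)
        return self.classifier(fused)

model = TwoStreamWetDryCNN()
mfcc = torch.rand(4, 1, 13, 100)       # batch of 13-coefficient MFCC "images"
spectral = torch.rand(4, 1, 64)        # batch of hand-crafted spectral features
print(model(mfcc, spectral).shape)     # torch.Size([4, 2])
```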

The team at UPM recording road sounds

Read more about Siavash’s ULTRACEPT work in his blog post here.

Yair Barnatan from UBA attends XXXV Annual Meeting of the Argentinian Society for Neuroscience Research

Yair Barnatan is an ULTRACEPT PhD student at the University of Buenos Aires working in the field of neuroethology. He is currently focused on the neuronal processing of optic flow in the crustacean visual system, unravelling which neurons are involved in this process and how.

Yair attended the XXXV Annual Meeting of the Argentinian Society for Neuroscience Research, SAN 2020. Due to the global pandemic, the event was held virtually from 7th to 9th October 2020.

The congress covered a wide variety of neuroscience topics, such as sensory and motor systems, neurodegenerative diseases, and learning and memory. At the meeting, Yair presented a poster entitled “Functional evidence of the crustacean lobula plate as optic flow processing center” (Barnatan, Y., Tomsic, D. & Sztarker, J.).

Yair Barnatan from UBA attends XXXV Annual Meeting of the Argentinian Society for Neuroscience Research

Abstract

When an animal rotates, it produces wide-field image motion over its retina, termed optic flow (OF). OF blurs the image, compromising the ability to see. Image shifts are stabilized by compensatory behaviors collectively termed the optomotor response (OR). In most vertebrates and decapod crustaceans such reflex behavior mainly involves eye movements that consist of a slow tracking phase of the wide-field image motion followed by a fast resetting phase. We used the mud crab Neohelice granulata to tackle a major question in crustacean visual processing: which region of the brain is the neural substrate for processing OF? It has long been known that the dipteran lobula plate (3rd optic neuropil) is the center involved in processing OF information. Recently, a crustacean lobula plate was characterized by neuroanatomical techniques, sharing many canonical features with the dipteran neuropil. In this work we present a functional evaluation of the role of the crab’s lobula plate in the compensatory eye movements to rotational OF by performing electrolytic lesion experiments. We show that lesioning the lobula plate greatly impairs the OR while keeping intact other visually guided behaviors, such as the avoidance response upon an approaching stimulus. Even when the OR is present in some lobula-plate-lesioned animals, these show reduced speed of eye tracking. Altogether, these results present strong evidence of an evolutionarily conserved site for processing optic flow shared by crustaceans and insects.


Nikolas Andreakos Presents Paper at the 13th International Conference on Brain Informatics (BI2020)

Nikolas Andreakos is a PhD candidate at the University of Lincoln who is working on developing computational models of associative memory formation and recognition in the mammalian hippocampus.

Recently Nikolas attended the 13th International Conference on Brain Informatics (BI2020). Due to the current travel restrictions, this year’s conference, which was scheduled to take place on 19th September 2020 in Padova, Italy, was moved online.


About Brain Informatics 2020

The Brain Informatics (BI) conference series has established itself as the world’s premier research forum on Brain Informatics, which is an emerging interdisciplinary and multidisciplinary research field with joint efforts from neuroscience, cognitive science, medicine and life sciences, data science, artificial intelligence, neuroimaging technologies, and information and communication technologies.

The 13th International Conference on Brain Informatics (BI2020) provided a premier international forum to bring together researchers and practitioners from diverse fields for presentation of original research results, as well as exchange and dissemination of innovative and practical development experiences on Brain Informatics research, brain-inspired technologies and brain/mental health applications.

The theme of BI2020 was: Brain Informatics in the Virtual World.

BI2020 solicited high-quality original research and application papers (both full paper and abstract submissions). Relevant topics included, but were not limited to:

  • Track 1: Cognitive and Computational Foundations of Brain Science
  • Track 2: Human Information Processing Systems
  • Track 3: Brain Big Data Analytics, Curation and Management
  • Track 4: Informatics Paradigms for Brain and Mental Health Research
  • Track 5: Brain-Machine Intelligence and Brain-Inspired Computing

Nikolas presented his research: Andreakos N., Yue S., Cutsuridis V. (2020) Recall Performance Improvement in a Bio-Inspired Model of the Mammalian Hippocampus. In: Mahmud M., Vassanelli S., Kaiser M.S., Zhong N. (eds) Brain Informatics. BI 2020. Lecture Notes in Computer Science, vol 12241. Springer, Cham. https://doi.org/10.1007/978-3-030-59277-6_29

Nikolas Andreakos presents his paper at the 13th International Conference on Brain Informatics (BI 2020)

Abstract

The mammalian hippocampus is involved in the short-term formation of declarative memories. We employed a bio-inspired neural model of the hippocampal CA1 region consisting of a zoo of excitatory and inhibitory cells. Cells’ firing was timed to a theta oscillation paced by two distinct neuronal populations exhibiting highly regular bursting activity, one tightly coupled to the trough and the other to the peak of theta. To systematically evaluate the model’s recall performance against the number of stored patterns, overlaps and ‘active cells per pattern’, its cells were driven by a non-specific excitatory input to their dendrites. This excitatory input to model excitatory cells provided context and timing information for retrieval of previously stored memory patterns. Inhibition to excitatory cells’ dendrites acted as a non-specific global threshold machine that removed spurious activity during recall. Out of the three models tested, ‘model 1’ recall quality was excellent across all conditions. ‘Model 2’ recall was the worst. The number of ‘active cells per pattern’ had a massive effect on network recall quality regardless of how many patterns were stored in it. As ‘active cells per pattern’ decreased, the network’s memory capacity increased, interference effects between stored patterns decreased, and recall quality improved. A key finding was that an increased firing rate of an inhibitory cell inhibiting a network of excitatory cells has better success at removing spurious activity at the network level and improving recall quality than increasing the synaptic strength of the same inhibitory cell inhibiting the same network of excitatory cells, while keeping its firing rate fixed.


When asked about his experience, Nikolas said:

“I really enjoyed the conference and learned a lot. It was a valuable and absorbing experience for me. The atmosphere was friendly. I shared my research and my experience with other attendees and exchanged ideas which will help me to improve my existing work”.

Nikolas Andreakos Presents a Poster at the 9th International Conference on Biomimetic and Biohybrid Systems 2020

Nikolas Andreakos is a PhD candidate at the University of Lincoln who is working on developing computational models of associative memory formation and recognition in the mammalian hippocampus.

Recently, Nikolas attended the 9th International Conference on Biomimetic and Biohybrid Systems, Living Machines 2020 (LM2020). This year’s conference was scheduled to take place between 28th and 30th July 2020 in Freiburg, Germany, but due to the current situation around Covid-19 it was moved online.

Andreakos N., Yue S., Cutsuridis V. (2020) Improving Recall in an Associative Neural Network Model of the Hippocampus. In: Vouloutsi V., Mura A., Tauber F., Speck T., Prescott T.J., Verschure P.F.M.J. (eds) Biomimetic and Biohybrid Systems. Living Machines 2020. Lecture Notes in Computer Science, vol 12413. Springer, Cham. https://doi.org/10.1007/978-3-030-64313-3_1

About Living Machines 2020

The development of future real-world technologies will depend strongly on our understanding and harnessing of the principles underlying living systems and the flow of communication signals between living and artificial systems.

Biomimetics is the development of novel technologies through the distillation of principles from the study of biological systems. The investigation of biomimetic systems can serve two complementary goals. First, a suitably designed and configured biomimetic artefact can be used to test theories about the natural system of interest. Second, biomimetic technologies can provide useful, elegant and efficient solutions to unsolved challenges in science and engineering.

Biohybrid systems are formed by combining at least one biological component—an existing living system—and at least one artificial, newly-engineered component. By passing information in one or both directions, such a system forms a new hybrid bio-artificial entity. The theme of the conference also encompasses biomimetic methods for manufacture, repair and recycling inspired by natural processes such as reproduction, digestion, morphogenesis and metamorphosis.

The following are some examples of “Living Machines” as featured at past conferences:

  • Biomimetic robots and their component technologies (sensors, actuators, processors) that can intelligently interact with their environments.
  • Biomimetic computers: neuromimetic emulations of the physiological basis for intelligent behaviour.
  • Active biomimetic materials and structures that self-organise and self-repair or show other bio-inspired functions.
  • Nature inspired designs and manufacturing processes.
  • Biohybrid brain-machine interfaces and neural implants.
  • Artificial organs and body-parts including sensory organ-chip hybrids and intelligent prostheses.
  • Organism-level biohybrids such as robot-animal or robot-human systems.

Nikolas presented his research ‘Improving Recall in an Associative Neural Network Model of the Hippocampus’.

Living Machines 2020 presentation

Abstract

The mammalian hippocampus is involved in the auto-association and hetero-association of declarative memories. We employed a bio-inspired neural model of the hippocampal CA1 region to systematically evaluate its mean recall quality against different numbers of stored patterns, overlaps and active cells per pattern. The model consisted of excitatory (pyramidal) cells and four types of inhibitory cells: axo-axonic, basket, bistratified, and oriens lacunosum-moleculare cells. Cells were simplified compartmental models with complex ion channel dynamics. Cells’ firing was timed to a theta oscillation paced by two distinct neuronal populations exhibiting highly regular bursting activity, one tightly coupled to the trough and the other to the peak of theta. During recall, excitatory input to network excitatory cells provided context and timing information for retrieval of previously stored memory patterns. Dendritic inhibition acted as a non-specific global threshold machine that removed spurious activity during recall. Simulations showed that recall quality improved when the network’s memory capacity increased as the number of active cells per pattern decreased. Furthermore, an increased firing rate of a presynaptic inhibitory threshold machine inhibiting a network of postsynaptic excitatory cells has better success at removing spurious activity at the network level and improving recall quality than increased synaptic efficacy of the same threshold machine on the same network of excitatory cells, while keeping its firing rate fixed.

Nikolas Andreakos attending the 9th international conference on biomimetic and biohybrid systems 2020