University of Lincoln Masters researcher Mu Hua recently completed a 12-month secondment for ULTRACEPT at partner Guangzhou University in China.
During my one-year secondment at Guangzhou University, my research built on previous work on the locust’s LGMD (Lobula Giant Movement Detector) neural networks for collision perception, including LGMD1 from Prof Shigang Yue and LGMD2 from Dr Qinbing Fu. My work focused mainly on improving the LGMD networks’ ability to handle ultra-fast approaching objects.
Benefiting from millions of years of evolution, locusts are equipped with a vision system that lets them evade their natural predators in the blink of an eye. Taking inspiration from nature through computational models of the LGMDs in the locust’s visual pathways has proved effective for collision perception. However, it is still challenging for current LGMD neural networks to accurately and reliably recognise an imminent collision when the approaching object is ultra-fast (see Fig. 1). The green dashed line is the threshold we set to indicate whether a collision is happening; the blue curve shows the current LGMD1 response to an ultra-fast object. The neuron fires spikes and generates a ‘false alert’ while the approaching black ball is still far away.
The refractory period is a mechanism common to many animals’ nervous systems, and together with other mechanisms it helps stabilise a neuron’s response. We therefore introduced it into the previous LGMD neural networks for further improvement. On the left of Figure 2, we show a comparison between our newly proposed LGMD1 neural network and the previous one from Shigang Yue; on the right, we compare our proposed LGMD2 with the previous one from Qinbing Fu.
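As a rough illustration of the idea (a minimal Python sketch, not the published LGMD formulation), a thresholded neuron with a refractory period fires once per stimulus burst instead of on every threshold crossing:

```python
# Toy illustration (not the authors' model): a thresholded neuron with
# and without a refractory period, driven by a brief supra-threshold burst.

def spike_train(inputs, threshold, refractory=0):
    """Return spike times; after each spike the neuron stays silent
    for `refractory` further time steps."""
    spikes, silent_until = [], -1
    for t, x in enumerate(inputs):
        if t <= silent_until:
            continue  # still refractory: ignore the input
        if x >= threshold:
            spikes.append(t)
            silent_until = t + refractory
    return spikes

# A transient burst that crosses threshold on several consecutive steps.
inputs = [0.1, 0.9, 0.95, 0.92, 0.2, 0.1]

print(spike_train(inputs, threshold=0.8))                # fires on every crossing
print(spike_train(inputs, threshold=0.8, refractory=3))  # fires once per burst
```

The refractory window suppresses repeated firing to one sustained stimulus, which is the stabilising effect the paragraph above refers to.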
To better understand the refractoriness mechanism and justify integrating it into the LGMD neural networks, we sought guidance from Prof. Jigen Peng, Prof. Huang, and our outstanding colleagues. Their mathematical analysis supported the proposed method (see Fig. 3).
During my secondment, I gained knowledge of both bio-plausible neural networks and their implementation, and much experience in setting up experiments and analysing the results. Many thanks to the ULTRACEPT project for supporting my research at Guangzhou University, and to my host, Prof. Jigen Peng, for kindly providing me access to his well-equipped lab.
Fang Lei enrolled as a PhD Scholar at the University of Lincoln in 2019. In early 2020 she visited Guangzhou University as part of the STEP2DYNA project funded by the European Union’s Horizon 2020 Research and Innovation Programme under the Marie Skłodowska-Curie grant agreement. During this secondment Fang Lei worked on developing bio-inspired visual systems for collision detection in dim-light environments. More recently, Fang continued this work during her 12-month secondment at Guangzhou University under ULTRACEPT from May 2020 to 2021.
During the secondment at Guangzhou University, I worked on developing bio-inspired visual systems for collision detection in dim-light environments. For the autonomous navigation of vehicles or robots, detecting moving objects in extremely low-light conditions is a challenging task due to very low signal-to-noise ratios (SNRs). However, nocturnal insects possess remarkable visual abilities, perceiving motion cues and detecting moving objects in very dim environments. The many studies of the night vision of insects’ visual systems provide a great deal of inspiration for enhancing motion cues and modelling an artificial visual system that detects motion such as looming objects. Fig. 1 shows an example image of looming motion in a dim-light environment, taken from the low-light video motion (LLVM) dataset collected with the experimental devices shown in Fig. 2.
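One of the best-known strategies nocturnal insects are thought to use is temporal summation, trading temporal resolution for a better signal-to-noise ratio. The toy sketch below (an illustration of the principle only, not the model developed here) shows how averaging successive noisy frames suppresses independent noise:

```python
import random

# Toy illustration of temporal summation: averaging N frames of
# independent noise reduces the noise level by roughly sqrt(N),
# at the cost of temporal resolution.

random.seed(0)
truth = 0.2                                   # dim, constant scene luminance
frames = [[truth + random.gauss(0, 0.5) for _ in range(100)]
          for _ in range(16)]

single = frames[0]
summed = [sum(f[i] for f in frames) / len(frames) for i in range(100)]

def noise(pixels):
    """Sample standard deviation of pixel values around their mean."""
    mean = sum(pixels) / len(pixels)
    return (sum((p - mean) ** 2 for p in pixels) / len(pixels)) ** 0.5

print(noise(single), noise(summed))   # summed noise is roughly single / 4
```

With 16 frames the noise drops by about a factor of four, which is why slow summation helps so much at very low light levels.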
To develop my modelling work further, I discussed it with colleagues and Prof. Peng (see Fig. 3) and received very useful suggestions, mainly concerning the biological modelling of the LGMD1’s direction selectivity. We also organised a weekly group seminar to discuss problems encountered in our research projects, and by sharing ideas I gained a lot of valuable experience in bio-inspired modelling.
My research on collision detection in dim-light environments comprises two parts: modelling the direction selectivity of the LGMD1 neuron, and enhancing motion cues. I developed a new LGMD1 model that is effective in distinguishing looming motion from translating motion, published one conference paper, and attended the online virtual conference (IJCNN 2021, see Fig. 4). I also submitted a journal paper to IEEE Transactions on Neural Networks and Learning Systems (TNNLS), which is under review. Additionally, I finished the modelling work on motion cue enhancement and proposed a new model. Fig. 5 shows the enhancement results for dark image sequences captured during testing experiments.
During this 12-month secondment, I gained a better knowledge of bio-inspired modelling and plenty of practice in connecting theory with application. I established good friendships with my colleagues through frequent communication in our weekly group seminar, which provides a basis for future cooperation. The secondment was a very precious experience for me. Many thanks to the ULTRACEPT project for supporting my research work and providing me with the opportunity to work together with my colleagues.
Hongxin Wang received his PhD in computer science from the University of Lincoln in 2020. Following a secondment under the STEP2DYNA project, Dr Wang carried out a further secondment under the ULTRACEPT project from April 2020 to April 2021 at partner Guangzhou University. Here, he undertook research contributing to Work Packages 1 and 2. Dr Wang’s ULTRACEPT contributions have involved directing research into computational modelling of motion vision neural systems for small target motion detection.
University of Lincoln Experienced Researcher Dr Hongxin Wang recently completed a 12-month secondment at ULTRACEPT project partner Guangzhou University in China. The project is funded by the European Union’s Horizon 2020 Research and Innovation Programme under the Marie Skłodowska-Curie grant agreement. Dr Wang visited Guangzhou from April 2020 to April 2021 and contributed to Work Packages 1 and 2.
Dr Wang reflects on what he has achieved during his secondment
Monitoring moving objects against complex natural backgrounds is a huge challenge for future robotic vision systems, and even more so when attempting to detect small targets only a few pixels in size, for example an unmanned aerial vehicle (UAV) or a bird in the distance, as shown in Fig. 1. Surprisingly, insects are quite adept at searching for mates and tracking prey, which appear as small dim speckles in the visual field. The exquisite sensitivity of insects to small target motion comes from a class of specific neurons called small target motion detectors (STMDs). Building a quantitative STMD model is the first step not only towards further understanding the biological visual system, but also towards providing robust and economical small target detection for artificial vision systems.
During this twelve-month secondment, I continued my previous work on modelling insects’ visual systems for small target detection and made great progress. Specifically, we proposed an STMD-based model with time-delay feedback that achieves superior detection performance for fast-moving small targets while significantly suppressing false positives from slower-moving background features. This work has been submitted to IEEE Transactions on Neural Networks and Learning Systems and is currently under review. In addition, we developed an attention-prediction guided visual system to overcome existing models’ heavy dependence on the target’s contrast against the background, as illustrated in Fig. 2. The paper presenting this work has been completed and will be submitted to IEEE Transactions on Cybernetics.
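The intuition behind a time-delay comparison can be shown with a toy sketch (an illustration only, not the proposed STMD model): subtracting a delayed copy of the input cancels signals that change slowly over the delay window while preserving rapid transients:

```python
# Toy sketch of delay-and-subtract temporal filtering: slow background
# drift barely changes over the delay window and is cancelled, while a
# fast-moving target produces a brief, large transient that passes through.

def delay_subtract(signal, delay):
    """Response = current input minus the input `delay` steps earlier."""
    return [x - signal[max(t - delay, 0)] for t, x in enumerate(signal)]

# Luminance at one pixel over time:
slow = [0.01 * t for t in range(100)]                       # slow background drift
fast = [1.0 if 50 <= t < 53 else 0.0 for t in range(100)]   # brief target sweep

peak_slow = max(abs(v) for v in delay_subtract(slow, 5))
peak_fast = max(abs(v) for v in delay_subtract(fast, 5))
print(peak_slow, peak_fast)   # the fast transient dominates the output
```

The fast transient produces an output twenty times larger than the slow drift, which is the sense in which a delayed comparison favours fast small-target motion over background movement.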
During my 12-month secondment at Guangzhou University, I obtained inspiration and mathematical theory support from Professor Jigen Peng for designing the STMD-based visual systems. We organised a weekly seminar to discuss the latest biological findings, explore effective neural modelling methods, and develop specialised mathematical theory for bio-inspired motion detection. Significant progress was made with the help of Professor Jigen Peng.
The secondment also provided me with an opportunity to improve my mathematical ability with support from Professor Peng. Strong mathematical ability helps me better describe insects’ visual systems and build robust neural models for small target motion detection. In addition, I established a deep friendship with Professor Peng and my colleagues at Guangzhou University, which provides a basis for future research collaborations. Lastly, I introduced our research to colleagues during these discussions, which may attract their attention to our research field and ultimately boost the development of neural system modelling.
The secondment has been an excellent experience and gave me the opportunity to collaborate with my project colleagues. I am grateful for the support of the ULTRACEPT project, from which I benefited greatly.
Azreen Azman is an associate professor at Universiti Putra Malaysia in Kuala Lumpur. He has just completed a 6-month secondment at the University of Lincoln and a 6-month secondment at Visomorphic Technology Ltd as part of the ULTRACEPT project funded by the European Union’s Horizon 2020 Research and Innovation Programme under the Marie Skłodowska-Curie grant agreement. He has been involved in Work Packages 2 and 3.
Hazard perception and collision detection are important components of the safety of an autonomous car, and they become more challenging in low-light environments. During the twelve-month secondment period my focus was to investigate methods for detecting objects on the road in low-light conditions from captured images or video, in order to recognise hazards and avoid collisions.
One of the first tasks Azreen conducted in Lincoln was collecting audio-visual data in different road conditions. He had the opportunity to join his colleagues Siavash Bahrami and Assoc Prof Shyamala Doraisamy from UPM, who were also carrying out ULTRACEPT secondments at UoL, in conducting audio-visual recordings of the road at the Millbrook Proving Ground in Bedford, United Kingdom. This provided a controlled environment in addition to other recordings conducted on normal roads.
It is anticipated that the performance of deep-learning-based object detection algorithms such as the R-CNN variants and YOLO diminishes as the input images become darker, due to the reduced amount of light and increased noise in the captured images. In Azreen’s preliminary experiment, which used a Faster R-CNN model trained and tested on a collection of self-collected road images, the object detection performance was significantly reduced to almost 81% for dark and noisy images, compared with daylight images.
To overcome this problem, an image enhancement and noise reduction method was applied to the dark images before the object detection module. In his investigations, Azreen trained LLNet, a deep autoencoder-based image enhancement and noise reduction method, for dark image enhancement. As a result, Faster R-CNN was able to detect 29% more objects in the enhanced images than in the dark images. The deep-learning-based LLNet performed better than the conventional Histogram Equalisation (HE) and Retinex methods; however, its patch prediction and image reconstruction steps are computationally expensive for real-time applications.
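For reference, the conventional Histogram Equalisation baseline mentioned above can be sketched in a few lines (a minimal illustration for an 8-bit grayscale image, not Azreen’s implementation):

```python
# Minimal sketch of histogram equalisation: map each intensity through
# the normalised cumulative histogram, stretching a dark, low-contrast
# image across the full intensity range.

def equalize(pixels, levels=256):
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    # cumulative distribution function over intensity levels
    cdf, total = [], 0
    for h in hist:
        total += h
        cdf.append(total)
    cdf_min = next(c for c in cdf if c > 0)   # first non-empty bin
    n = len(pixels)
    scale = (levels - 1) / (n - cdf_min) if n > cdf_min else 0
    return [round((cdf[p] - cdf_min) * scale) for p in pixels]

dark = [10, 10, 12, 14, 14, 16]   # intensities crowded near black
print(equalize(dark))             # → [0, 0, 64, 191, 191, 255]
```

HE is cheap and needs no training, which is why it serves as the conventional baseline, but unlike LLNet it amplifies noise along with signal in very dark images.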
In August 2020, Azreen began his secondment at Visomorphic Technology Ltd, an industry partner for the ULTRACEPT project. In collaboration with the team, he continued working on the model to improve its efficiency for real-time application. His focus was to adopt the principles of the nocturnal insect vision system for image enhancement and object detection.
During Azreen’s stay in the UK, he attended and presented at the annual ULTRACEPT mid-term project meeting which was held in February 2020 and hosted in Cambridge. Azreen presented his work ‘Detection of objects on the road in low light condition using deep learning’. He also participated in ULTRACEPT Sandpit Session 1 facilitated by Qinbing Fu.
In addition, Azreen attended the first Lincoln Conference on Intelligent Robots and Systems, organised by the Lincoln Centre for Autonomous Systems (L-CAS), and a keynote session on hyper-heuristics delivered by Prof. Graham Kendall from the University of Nottingham, both held in October 2020.
‘The secondment has given me the opportunities and resources to conduct my research for the project and to improve my skills and networking through various meetings and discussions. Despite the challenges faced due to the ongoing pandemic, both of my hosts (University of Lincoln and Visomorphic Technology Ltd) have provided me with the support to work remotely while continuously engaging with other researchers virtually. I would like to thank the sponsors, including Universiti Putra Malaysia and the ULTRACEPT Marie Skłodowska-Curie secondment grant, for these opportunities.’ Azreen Azman
Dr Qinbing Fu received his PhD from the University of Lincoln in October 2018. Following a secondment under the STEP2DYNA project, Dr Fu carried out a further secondment under the ULTRACEPT project from August 2019 to August 2020 at partner Guangzhou University. Here he undertook research contributing to Work Packages 1 and 4. Dr Fu then went on to work as a postdoctoral researcher with Professor Shigang Yue until January 2021. Dr Fu’s ULTRACEPT contributions have involved directing research into computational modelling of motion vision neural systems and their applications in robotics. His research achievements and outputs for this project so far are outlined in this blog post.
In support of the ULTRACEPT project, Dr Fu has published seven research papers including five journal papers and two conference papers. He was the first author on five of the publications and co-authored the other two. His main achievements have included:
The modelling of LGMD1 and LGMD2 collision perception neural network models, with applications in robot and vehicle scenarios;
The modelling of the Drosophila motion vision neural system for decoding the direction of a foreground translating object against a moving cluttered background;
A review of the related field of research;
The integration of multiple neural system models for collision sensing.
Dr Fu’s research outputs can be found on his personal pages on Google Scholar and ResearchGate. In addition, Qinbing directed promising research on building visually dynamic walls in an arena to test on-board visual systems. These research ideas have been collated and summarised in his research papers.
Dr Fu’s research contributions have fully supported ULTRACEPT’s WP1 and WP4. This includes modelling work on collision detection visual systems with systematic experiments on vehicle scenarios and also the integration of multiple neural system models for motion perception.
Secondment at Guangzhou University, China
Dr Fu carried out his ULTRACEPT secondment at project partner GZHU in China, where he worked with Professor Jigen Peng. During this period he developed several aspects of his capability, becoming a more mature researcher in the academic community: pursuing progressive research ideas, collaborating with group members to complete research papers, coordinating teamwork, disseminating the project, communicating with global partners, and writing project proposals. Undoubtedly, the ULTRACEPT secondment has been very successful for Dr Fu.
Dr Fu has undertaken a number of dissemination activities to promote the ULTRACEPT research outcomes. On 28th July 2020, he presented his research at ULTRACEPT online Workshop 2 on the topic of “Adaptive Inhibition Matters to Robust Collision Perception in Highly Variable Environments”. At this event, he exchanged ideas with project partners.
Tian Liu enrolled as a PhD Scholar at the University of Lincoln in 2018. In 2018-2019 he visited Guangzhou University as part of the STEP2DYNA project funded by the European Union’s Horizon 2020 Research and Innovation Programme under the Marie Skłodowska-Curie grant agreement. During this secondment Tian Liu developed the ColCOSΦ experiment platform for social insect and swarm robotics research, and investigated how multiple virtual pheromones affect swarm robots. More recently, Tian completed a 12-month secondment under ULTRACEPT at Guangzhou University.
Tian Liu recently completed his second 12-month secondment at project partner Guangzhou University in China as part of the ULTRACEPT project funded by the European Union’s Horizon 2020 Research and Innovation Programme under the Marie Skłodowska-Curie grant agreement. Tian visited Guangzhou from November 2019 to November 2020 and has been involved in Work Packages 1 and 4.
Tian reflects on what he has achieved during his time in Guangzhou
Most social insects, such as ants, have only a tiny brain, yet they can complete very difficult and complex tasks through the cooperation of large numbers of individuals, such as building a large nest or collecting food along rugged routes. They are able to do this because pheromones act as an important communication medium.
During this 12-month secondment, I continued to focus my attention on swarm robots with multiple pheromones. I believe it is the interaction of multiple pheromones that enables insects to perform such demanding tasks, rather than the single-pheromone mechanism that is now so widely studied. I worked with ULTRACEPT researcher Xuelong Sun and Dr Cheng Hu to develop ColCOSΦ, which can easily support multiple-pheromone research experiments. We evaluated the effects of multiple pheromones in swarm robotics by implementing several case studies simulating ants foraging, hunting, and carrying out deployment tasks.
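The core of a virtual pheromone field can be sketched very simply (an illustration of the principle only, not the ColCOSΦ implementation): robots deposit pheromone onto a grid, and the whole field evaporates by a fixed factor each time step, so trails fade unless they are reinforced:

```python
# Toy sketch of one virtual pheromone field on a grid; a multi-pheromone
# arena would simply keep several such fields side by side.

def step(grid, deposits, evaporation=0.5):
    """Evaporate the whole field, then add new deposits (row, col, amount)."""
    for row in grid:
        for c in range(len(row)):
            row[c] *= (1 - evaporation)
    for r, c, amount in deposits:
        grid[r][c] += amount
    return grid

field = [[0.0] * 3 for _ in range(3)]
step(field, [(1, 1, 1.0)])   # a robot marks the centre cell
step(field, [])              # the trail fades when not reinforced
print(field[1][1])           # → 0.5
```

Different pheromones can then use different deposit rates and evaporation constants, which is what lets short-lived alarm signals coexist with long-lived foraging trails.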
I showcased the outcomes of this research at both ICARM2019 and ICARM2020 international conferences.
Due to its excellent scalability, we also used the platform for research experiments in related fields. For example, it can simulate traffic scenarios, allowing us to test our LGMD model (a collision detection model) on the Colias micro robot at low cost.
Besides olfaction, visual information is also a very important input for insects, so we implemented a changeable visual environment on ColCOSΦ to investigate how to make full use of both olfactory and visual information in a swarm task. The research was collated into two articles submitted to ICRA2021 with fellow ULTRACEPT researchers Xuelong Sun, Dr Qinbing Fu and Dr Cheng Hu.
The secondment has been an excellent experience and provided me with the opportunity to collaborate with my project colleagues.
Many thanks to the ULTRACEPT project for supporting my research and for allowing me to work with these outstanding research scholars.
Xuelong Sun enrolled as a PhD Scholar at the University of Lincoln in 2016. In 2017-18 he visited Tsinghua University, China as part of the STEP2DYNA project funded by the European Union’s Horizon 2020 Research and Innovation Programme under the Marie Skłodowska-Curie grant agreement. During that secondment, Xuelong revisited the classical ring attractor model and demonstrated its application to bio-plausible optimal integration of directional cues. More recently, Xuelong completed a 12-month secondment at Guangzhou University under the ULTRACEPT project.
Xuelong Sun recently completed a 12-month secondment at project partner Guangzhou University in China as part of the ULTRACEPT project funded by the European Union’s Horizon 2020 Research and Innovation Programme under the Marie Skłodowska-Curie grant agreement. Xuelong visited Guangzhou from January 2019 to March 2019, then again from July 2019 to May 2020. Xuelong has been involved in Work Packages 1 and 3.
Xuelong reflects on what he has achieved during his time in Guangzhou
Solving problems by taking inspiration from animals (so-called bio-inspired solutions) is one of the core ideas of our group, the Computational Intelligence Lab (CIL). For me, insects are my best friends because of their amazing abilities in navigation and efficient collaboration for solving complex problems.
During this secondment, I continued my previous modelling work on insect navigation systems and made great progress, not only reproducing the main observed behavioural data of real insects, but also mapping specific computations to corresponding brain regions of the insects. We are making significant contributions to the insect navigation community.
In line with my research interests, I cooperated with fellow ULTRACEPT researcher Tian Liu to develop a platform called ColCOSΦ for social insect and swarm robotics research. This platform consists of three parts: the arena (an LED screen), the monitoring camera, and the micro robots. Swarm robotics and social insect experimental scenarios can be conducted easily and flexibly on this platform. Fellow ULTRACEPT researcher Dr Cheng Hu and I presented the platform in person at the Guangdong (Foshan) Doctoral and Postdoctoral Talent Exchange and Technology Project Matchmaking Conference.
In another interesting experiment undertaken during my secondment, we investigated the performance of the LGMD collision avoidance model in the context of city traffic. Critical real-world vehicle conditions typically involve severe crashes that are impractical to replicate for experiments, so we implemented the experiment on ColCOSΦ.
I co-authored a paper presenting the interesting results of these experiments and submitted it to Frontiers in Robotics and AI in February 2020, during the secondment.
Besides this, I also attended the Convention on Exchange of Overseas Talent (OCS2020) and was interviewed by Guangzhou TV. In the interview, I discussed, as someone who obtained a PhD abroad, what kind of career I want and what kind of support the government should provide.
See Xuelong being interviewed at the 1:32 mark:
I had a really great experience with my colleagues during the secondment.
Thank you to the ULTRACEPT project for supporting my secondment, which benefited me greatly.
Jiannan Zhao enrolled as a PhD Scholar at the University of Lincoln in 2016. In 2017-18 he visited Tsinghua University as part of the STEP2DYNA project funded by the European Union’s Horizon 2020 Research and Innovation Programme under the Marie Skłodowska-Curie grant agreement. During this secondment Jiannan developed the first generation of the “locust-inspired collision detector for UAVs” and demonstrated real flight with the bio-inspired algorithm on an embedded system.
Jiannan has just completed his second 12-month secondment, at Guangzhou University in China, as part of the ULTRACEPT project funded by the European Union’s Horizon 2020 Research and Innovation Programme under the Marie Skłodowska-Curie grant agreement. He has been involved in Work Packages 1 and 4.
The ultimate objective of my PhD research has been to develop an autonomous UAV platform with a bio-inspired collision avoidance system. The aim of my secondment at Guangzhou University was to realise agile autonomous UAV flight based on an LGMD collision detector.
During my secondment I analysed the challenges posed by the UAV’s 3D movement during flight and modelled a novel neural network to overcome them.
The existing algorithms were inadequate for flight scenes; to fully achieve flexible autonomous flight, they needed to be made robust against dynamic background noise. During my secondment at Guangzhou University I therefore worked on modelling a robust and efficient locust-inspired algorithm for collision detection. Based on distributed presynaptic interconnections, I developed a novel model suited to agile UAV flight, which can easily filter out insignificant visual cues by discriminating the angular velocity of the image motion.
This model is robust for detecting near range emergent collision in dynamic backgrounds as demonstrated in the following video:
In the next phase of my research, the computational algorithm will be ported to embedded systems to achieve efficient autonomous flight.
I also joined a group of four Tsinghua University robotic students and competed in the first International Competition for Autonomous Running Intelligent Robots in Beijing. We successfully competed against 32 other teams to take first prize. Read more about the competition here.
These Marie Skłodowska-Curie secondments have provided me with access to the facilities and recording equipment needed to set up the UAV platform. Moreover, the weekly meetings with other colleagues on the project have broadened my horizons and boosted my research skills.
Huatian Wang received his BSc and MSc degrees in Applied Mathematics from Xi’an Jiaotong University in 2014 and 2017, respectively. He was awarded a Marie Curie Fellowship to join the EU FP7 project LIVCODE (295151) as a Research Assistant in 2016.
Huatian enrolled as a PhD scholar at the University of Lincoln in January 2017. During his PhD, he carried out a 12-month secondment as an Early-Stage Researcher on the European Union’s Horizon 2020 STEP2DYNA (691154) project from 2017-18 at Tsinghua University. Following this, Huatian carried out further secondments under the European Union’s Horizon 2020 ULTRACEPT (778062) project from 2019-2020: one month at Guangzhou University (GZHU), then 11 months at Xi’an Jiaotong University (XJTU). His research areas include image processing, insect vision and motion detection.
I was mainly involved in ULTRACEPT Work Package 1. The research focuses on modelling the visual processing systems of flying insects such as Drosophila and honeybees. Their extraordinary navigation ability in cluttered environments provides perfect inspiration for designing artificial neural networks, which can be used to guide the visual flight of micro air vehicles.
Although insects like flies and honeybees have tiny brains, they can deal with very complex visual flight tasks. Research has been undertaken for decades to understand how they detect visual motion. However, the neural mechanisms that explain the variety of their behaviours, including patterned tunnel centring and terrain following, are still not clear. According to honeybee behavioural experiments, the key to their excellent flight control is the estimation and regulation of angular velocity.
To address the fundamental problem of angular velocity estimation, we proposed a novel angular velocity decoding model that explains the honeybee’s tunnel centring and terrain following behaviours and reproduces the observed near-independence from the spatial frequency and contrast of the gratings in visually guided honeybee flight. The model combines both temporal and texture information to decode the angular velocity, so its estimate is little affected by spatial frequency or contrast in synthetic grating experiments. The model was also tested behaviourally in Unity with the tunnel centring and terrain following paradigms; a demo video can be found on YouTube here, in which the simulated bee flies over a textured terrain using only ventral visual information to avoid collision.
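The underlying relationship can be illustrated with a toy sketch (not the decoding model itself): for a drifting sinusoidal grating, temporal frequency equals spatial frequency times angular velocity, so dividing a measured temporal frequency by a texture-derived spatial frequency recovers the angular velocity regardless of the grating’s spatial frequency or contrast:

```python
import math

# Toy sketch: angular velocity = temporal frequency / spatial frequency.
# Temporal frequency is read from zero crossings (so contrast cancels),
# and "texture information" here simply supplies the spatial frequency.

def estimate_angular_velocity(spatial_freq, contrast, true_omega,
                              duration=10.0, dt=1e-3):
    # photoreceptor response to a drifting sinusoidal grating
    signal = [contrast * math.sin(2 * math.pi * spatial_freq
                                  * true_omega * k * dt)
              for k in range(int(duration / dt))]
    # temporal frequency from zero-crossing count
    crossings = sum(1 for x, y in zip(signal, signal[1:]) if x * y < 0)
    temporal_freq = crossings / (2 * duration)
    return temporal_freq / spatial_freq

# Same angular velocity, very different gratings:
v1 = estimate_angular_velocity(1.0, 1.0, true_omega=2.0)
v2 = estimate_angular_velocity(3.0, 0.2, true_omega=2.0)
print(v1, v2)   # both close to 2.0 despite different frequency and contrast
```

Both gratings yield an estimate near the true angular velocity of 2.0, which is the spatial-frequency and contrast independence that pure temporal-frequency detectors (such as classic correlation detectors) lack.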
During my secondment, I presented a poster on part of our work at the IJCNN 2019 conference in Budapest, which you can read about here. This gave me the opportunity to share my research with the scientific community at the conference. The picture shows my discussions with other researchers during the poster session.
These secondments have provided me with the opportunity to work with leading academics in this field of research. For example, I was able to discuss the mathematical model of elementary motion detection and signal simulation using sinusoidal gratings with Prof. Jigen Peng at GZHU, as well as the sparse reconstruction method in compressed sensing theory with Dr. Angang Cui at XJTU.
I also worked alongside fellow researchers. For example, I helped Dr Qinbing Fu build a database of collision detection in various automotive scenes. We collected videos using a dashboard camera and made suitable cuts using video editing software.
I also attended numerous seminars and guest lectures. For example, I attended a seminar on solving sparse linear systems using smooth approximation methods. These experiences helped me to develop my skills and knowledge and to further my research.
During the final two months of my secondment I had to work from my home in China since the university closed due to Covid-19. However, I was able to use this time to carry out video conference discussions with my supervisors in both Xi’an and Lincoln. I also used my desktop computer to run simulation experiments and spent time preparing academic research papers.
Thanks to the support of the ULTRACEPT project, I was able to introduce our work to other groups and attract their attention to this research field, which is helpful for improving the impact of our research.
During my one-year secondment in China, I established friendships with Prof. Peng and other colleagues at Guangzhou University and Xi’an Jiaotong University. The cooperation with colleagues at these institutions boosted the development of neural modelling for visual navigation, and I was also able to introduce the ULTRACEPT project to other researchers at GZHU and XJTU. My mathematical analysis skills improved significantly during the cooperation with Prof. Peng, and my programming skills also improved with my colleagues’ help.
Shyamala Doraisamy is an Associate Professor in the Department of Multimedia, Faculty of Computer Science and Information Technology, Universiti Putra Malaysia (UPM). UPM is a partner in the ULTRACEPT project and Dr. Doraisamy is UPM’s Partner Lead.
Dr. Doraisamy featured in the final EURAXESS ASEAN Newsletter for 2019. The article explained how UPM became involved in the ULTRACEPT project and what their role has been in the consortium.
Dr. Doraisamy received her PhD from Imperial College London in 2004, specialising in the field of Music Information Retrieval, and won an award for her music and computing innovation at the Invention and New Product Exposition (INPEX), Pittsburgh, USA in 2007. Her research interests include Multimedia Information Processing, focusing in particular on sound analysis, and she has completed several projects on music and health applications. She has been an invited speaker at various conferences and research meetings internationally.
Dr. Doraisamy is an active member of the Malaysian Information Retrieval and Knowledge Management Society and was the Chair of the 2018 IEEE International Conference on Information Retrieval and Knowledge Management (CAMP’18).
During 2019, Dr. Doraisamy was on secondment at the University of Lincoln (UoL) along with Early Stage Researcher (ESR) Siavash Bahrami and Experienced Researcher (ER) Azreen Azman.
UPM has been working on the project’s road safety theme. The tasks assigned to UPM are mainly based on work packages WP2, WP3 and WP4. The team has focused in particular on contributions to task 2.3 in WP2, ‘To develop long range hazard perception methods coping with low light conditions’.
Dr. Doraisamy’s secondment has included initial meetings with partners and the completion of proposal discussions for collaborative PhD research with Siavash Bahrami. The tasks completed have been based on this collaborative PhD research, co-supervised by UPM and UoL, which investigates the use of sound data for estimating road wetness levels to support the development of long-range hazard methods that cope with low-light conditions. You can read more about Siavash’s research here.
The team will continue to utilise audio-visual technologies towards the development of brain-inspired vision systems for long-range hazard perception (WP2).
The article in the EURAXESS ASEAN Newsletter also highlights how participation in an MSCA-RISE can be beneficial for Malaysian research groups and Dr. Doraisamy also provides advice on getting involved in future RISE consortia.
Ultra-layered perception with brain-inspired information processing for vehicle collision avoidance