Category Archives: NEWS

Development of an Angular Velocity Decoding Model Accounting for Honeybees’ Visually Guided Flights

Huatian Wang received his BSc and MSc degrees in Applied Mathematics from Xi’an Jiaotong University in 2014 and 2017, respectively. He was awarded the Marie Curie Fellowship to be involved in the EU FP7 project LIVCODE (295151) as a Research Assistant in 2016.

Huatian enrolled as a PhD scholar at the University of Lincoln in January 2017. During his PhD, he carried out a 12-month secondment as an Early-Stage Researcher for the European Union’s Horizon 2020 STEP2DYNA (691154) project from 2017-18 at Tsinghua University. Following this, Huatian carried out further secondments under the European Union’s Horizon 2020 ULTRACEPT (778062) project from 2019-2020: 1 month at Guangzhou University (GZHU), then 11 months at Xi’an Jiaotong University (XJTU). His research areas include image processing, insect vision and motion detection.

I was mainly involved in ULTRACEPT Work Package 1. My research focuses on modelling the visual processing systems of flying insects such as Drosophila and honeybees. Their extraordinary ability to navigate cluttered environments provides perfect inspiration for designing artificial neural networks, which can in turn be used to guide the visual flight of micro air vehicles.

Although insects like flies and honeybees have tiny brains, they can deal with very complex visual flight tasks. Research has been undertaken for decades to understand how they detect visual motion. However, the neural mechanisms underlying their variety of behaviours, including patterned tunnel centring and terrain following, are still not clear. Honeybee behavioural experiments indicate that the key to their excellent flight control is the estimation and regulation of angular velocity.

To solve the fundamental problem of angular velocity estimation, we proposed a novel angular velocity decoding model that explains the honeybee’s flight behaviours of tunnel centring and terrain following, and that reproduces the observed near-independence of honeybees’ visually guided flight from the spatial frequency and contrast of the gratings. The model combines temporal and texture information to decode the angular velocity, and its estimates are little affected by spatial frequency and contrast in synthetic grating experiments. The model was also tested behaviourally in Unity with the tunnel centring and terrain following paradigms; a demo video can be found on YouTube here. The simulated bee flies over a textured terrain using only ventral visual information to avoid collision.
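For readers unfamiliar with how insect-inspired models extract motion from neighbouring photoreceptor signals, the sketch below implements a classic Hassenstein-Reichardt-style elementary motion detector (EMD) in plain Python. This is an illustrative toy of my own, not the angular velocity decoding model described here: the classic EMD response is known to depend on the spatial frequency and contrast of the stimulus, which is precisely the limitation that angular velocity decoding models aim to overcome.

```python
import math

def low_pass(signal, alpha=0.3):
    """First-order low-pass filter acting as the EMD delay line."""
    out, y = [], 0.0
    for x in signal:
        y += alpha * (x - y)
        out.append(y)
    return out

def emd_response(left, right, alpha=0.3):
    """Correlate each photoreceptor signal with the delayed copy of its
    neighbour and subtract the two half-detectors; a positive mean output
    indicates motion from left to right."""
    dl = low_pass(left, alpha)
    dr = low_pass(right, alpha)
    return [dl_t * r_t - dr_t * l_t
            for dl_t, r_t, dr_t, l_t in zip(dl, right, dr, left)]

# A sinusoidal grating drifting left-to-right: the right receptor sees
# the same waveform slightly later in phase.
n, phase_lag = 200, 0.5
left = [math.sin(0.2 * t) for t in range(n)]
right = [math.sin(0.2 * t - phase_lag) for t in range(n)]

mean_response = sum(emd_response(left, right)) / n  # positive: rightward drift
```

Averaging the detector output over time recovers the direction of drift, but its magnitude varies with the grating's spatial frequency; a decoding model must compensate for this to read out true angular velocity.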

During my secondment, I presented a poster on our work at the IJCNN 2019 conference in Budapest, which you can read about here. This gave me the opportunity to share my research with the scientific community. The picture shows me discussing the work with other researchers during the poster session.

Huatian Wang attending the International Joint Conference on Neural Networks (IJCNN) July 2019

I also attended and presented my work at the ULTRACEPT mid-term meeting in February 2020, which you can read about here. Due to Covid-19 travel restrictions, I was not able to attend the event in person; instead, I attended and presented via video conference.

Huatian Wang presenting at the ULTRACEPT mid-term meeting Feb 2020

These secondments have provided me with the opportunity to work with leading academics in this field of research. For example, I was able to discuss the mathematical model of elementary motion detection and signal simulation using sinusoidal gratings with Prof. Jigen Peng at GZHU, as well as the sparse reconstruction method in compressed sensing theory with Dr. Angang Cui at XJTU.

I also worked alongside fellow researchers. For example, I helped Dr. Qinbing build up a database for collision detection in various automotive scenes. We collected videos using a dashboard camera and made suitable cuts using video editing software.

I also attended numerous seminars and guest lectures. For example, I attended a seminar on solving sparse linear systems using smooth approximation methods. These experiences helped me to develop my skills and knowledge and to further my research.

During the final two months of my secondment I had to work from my home in China, since the university was closed due to Covid-19. However, I was able to use this time to hold video conference discussions with my supervisors in both Xi’an and Lincoln. I also used my desktop computer to run simulation experiments and spent time preparing academic research papers.

Thanks to the support of the ULTRACEPT project, I was able to introduce our work to other groups and attract their attention to this research field, helping to improve the impact of our research.

During my one-year secondment in China, I established friendships with Prof. Peng and other colleagues at Guangzhou University and Xi’an Jiaotong University. The cooperation with colleagues at these institutions boosted the development of neural modelling for visual navigation, and I was able to introduce the ULTRACEPT project to other researchers at GZHU and XJTU. My mathematical analysis skills improved significantly through the cooperation with Prof. Peng, and my programming skills also improved with my colleagues’ help.

Guangdong ‘Zhongchuang Cup’ Entrepreneurship and Innovation Competition

The 2019 Guangdong “Zhongchuang Cup” Entrepreneurship and Innovation Competition was held from 18th to 19th September 2019 in Jiangmen, China.

The aim of this post-doctoral innovation competition is to transform promising scientific research into business by developing doctoral talent. There were close to 400 attendees at the event, including representatives of the Ministry of Human Resources and Social Security, the Guangdong Provincial Department of Human Resources and Social Security, relevant leaders of Jiangmen City, expert judges, and postdoctoral talents.

Cheng Hu from the University of Lincoln, who is currently on secondment at Guangzhou University, competed at the event where he showcased the Colias robot platform.

Cheng Hu at the Guangdong 'Zhongchuang Cup' Entrepreneurship and Innovation Competition

The postdoctoral innovation competition attracted 476 projects from six strategic emerging industries, including biomedicine and general health, electronic information, new energy, energy conservation and environmental protection, new materials, the Internet and mobile Internet, and advanced manufacturing.

Out of the initial 476 projects, 328 entered the preliminary round. Of these, 60 outstanding projects, including Cheng’s, advanced to the semi-finals. As a result, Cheng was invited to participate in innovation counselling training delivered by the National Postdoctoral Innovation (Jiangmen) Demonstration Center and Guangzhou Leading Human Resources Development Co. Ltd. Training included “Business Plan Writing” and “Business Project Roadshow Training”.

Award winners at the Guangdong 'Zhongchuang Cup' Entrepreneurship and Innovation Competition

From these 60 semi-finalists, 24 elite projects reached the finals. Cheng achieved an outstanding result and received the “winning award”, ranking 12th of 30 in the semi-final and 9th of 12 in the final.

Cheng Hu receiving award at the Guangdong 'Zhongchuang Cup' Entrepreneurship and Innovation Competition

Guangdong (Foshan) Doctoral and Postdoctoral Talent Exchange and Technology Project Matchmaking Conference

Whilst on secondment at ULTRACEPT partner Guangzhou University, Xuelong Sun and Dr Cheng Hu from the University of Lincoln attended the Guangdong (Foshan) Doctoral and Postdoctoral Talent Exchange and Technology Project Matchmaking Conference.

Each city of Guangdong province was provided a space in which to display its best innovation and entrepreneurship examples of 2019. As one of the city-governed universities, Guangzhou University was asked to select its most attractive and novel research to showcase at the event. Xuelong and Cheng were recommended by Guangzhou University to give a demonstration of their ColCOSP system and Colias robot platform.

The event was hosted by the Guangdong Provincial Department of Human Resources and Social Security and the Foshan Municipal People’s Government. It was held on November 18, 2019 at the Tanzhou International Convention and Exhibition Center in Foshan, China, in the Guangdong-Hong Kong-Macao Greater Bay Area. Nearly 1,200 people from all over the world gathered in Foshan to participate. The opening ceremony was chaired by Qiao Yu, deputy mayor of the Foshan Municipal Government.

Xuelong Sun and Cheng Hu at the Guangdong Doctoral and Postdoctoral Talent Exchange and Technology Project Matchmaking Conference

The conference lasted two days, with the theme of “bringing talents from all over the world and creating a talent bay area” and the key aim of “promoting exchange and cooperation of post-doctoral talents and serving the transformation of post-doctoral scientific and technological achievements”.

At the event, Xuelong and Cheng demonstrated their ColCOSP system and the Colias robot platform to the esteemed delegates and to the wider robotics and AI community. They also exchanged knowledge, ideas and interests concerning robotics and AI with researchers from related fields. Xuelong added: “It was a great chance to let more people know about our platforms and research supported by the EU H2020 project”.

 

Dr Shyamala Doraisamy Featured in the EURAXESS ASEAN Newsletter

Shyamala Doraisamy is an Associate Professor at the Department of Multimedia, Faculty of Computer Science and Information Technology, Universiti Putra Malaysia (UPM). UPM is a partner in the ULTRACEPT project and Dr. Doraisamy is UPM’s Partner Lead.

Dr. Doraisamy featured in the final EURAXESS ASEAN Newsletter for 2019. The article explained how UPM became involved in the ULTRACEPT project and what their role has been in the consortium.

Dr. Doraisamy received her PhD from Imperial College London in 2004, specializing in the field of Music Information Retrieval, and won an award for her music and computing innovation at the Invention and New Product Exposition (INPEX), Pittsburgh, USA in 2007. Her research interests include Multimedia Information Processing, focusing in particular on sound analysis, and she has completed several projects on music and health applications. She has been an invited speaker at various conferences and research meetings internationally.

Dr. Doraisamy is an active member of the Malaysian Information Retrieval and Knowledge Management Society and was the Chair of the 2018 IEEE International Conference on Information Retrieval and Knowledge Management (CAMP’18).

During 2019, Dr. Doraisamy has been on secondment at the University of Lincoln (UoL) along with Early Stage Researcher (ESR) Siavash Bahrami and Experienced Researcher (ER) Azreen Azman.

UPM secondees
Shyamala with UPM researchers Azreen Azman (L) and Siavash Bahrami (R)

UPM has been working on the road safety theme of the project. The tasks assigned to UPM are based mainly on work packages WP2, WP3 and WP4. The team has focused in particular on contributions to Task 2.3 in WP2 – ‘To develop long range hazard perception methods coping with low light conditions’.

Dr. Doraisamy’s secondment has included initial meetings with partners and the completion of proposal discussions for collaborative PhD research with Siavash Bahrami. The tasks completed have been based on this collaborative PhD research, which is co-supervised by UPM and UoL. The research investigates the use of sound data for road wetness level estimation to support the development of long-range hazard perception methods coping with low light conditions. You can read more about Siavash’s research here.

The team will continue to utilise audiovisual technologies towards the development of Brain-inspired vision systems for long-range hazard perception (WP2).

The article in the EURAXESS ASEAN Newsletter also highlights how participation in an MSCA-RISE can be beneficial for Malaysian research groups and Dr. Doraisamy also provides advice on getting involved in future RISE consortia.

 

Development of an Autonomous Flapping Wing Robot Locust – Linloc

Hamid Isakhani received a B.Eng. in Aeronautics from Visvesvaraya Technological University and an MSc by Research in Aerospace from Cranfield University in 2015 and 2017, respectively. He was an intern engineer at the Indian Space Research Organisation and Hindustan Aeronautics Limited in 2012 and 2014, respectively. He has been a PhD scholar at the School of Computer Science, University of Lincoln, since 2017, and visited Tsinghua University as a Marie Skłodowska-Curie Fellow from 2018-19. As of July 2019, he is seconded to the Huazhong University of Science and Technology to support the progress of the following tasks and work packages of the ULTRACEPT project, funded by the European Union’s Horizon 2020 research and innovation programme under Marie Skłodowska-Curie grant agreement 778062:

  • WP4 – Systems integration, miniaturization, verification and demonstration

Task 4.4: To build a demonstrator system for the collision avoidance

Ultimately, my doctoral project aims to develop a locust-inspired articulated-wing robotic platform capable of autonomous flight. This platform will serve as a demonstrator and testbed to validate the robustness of the vision-based collision avoidance systems developed in the first three work packages of the STEP2DYNA and ULTRACEPT projects. Additionally, the project should result in a fully bioinspired flapping wing micro aerial vehicle that is the first of its kind to be entirely inspired by an insect’s flight mechanics, aerodynamics, and avionics.

Significant progress was made at our partner institution, The State Key Laboratory of Digital Manufacturing Equipment and Technology, HUST. Rapid prototyping, manufacturing, and motion capture analysis were some of the vital stages of our project facilitated at this institution.

Hamid Isakhani working on additive manufacturing of locust wing prototypes

To study the flapping mechanism and gliding behaviour of swarming locusts, we set up a small locust colony at HUST, where we bred adult locusts carefully selected for physical characteristics indicative of health (strong free-flight ability, good wing condition, etc.).

Small locust colony housing approximately 300 adult Schistocerca gregaria

Specimens are anesthetized in a CO2 chamber for 5 minutes to ease the fitting of custom-made, micro-sized retroreflective markers. Each hemispherical marker weighs less than 0.1±0.05 mg and is 0.5 mm in radius, to facilitate precise tracking and centroid calculation by the cameras.

CO2 Chamber and retroreflector marked locust

Generally, due to the resolution limitations of high-speed cameras, capture volumes are highly cramped. As a result, there is very limited literature on insect free-flight kinematics and swarm behaviour: the flying insects under experimentation are usually either tethered or flown in confined flight chambers. However, with the help of a well-calibrated infrared Vicon motion capture system (Vicon, Oxford, UK), consisting of three MX-T160 and four MX-T40s cameras arranged to provide a tracking volume of 0.6×0.6×0.6 m, we successfully recorded three-dimensional position and orientation data for a micro swarm of gliding locusts.

Vicon motion capture system consisting of MX-T160 and MX-T40s cameras

The scientific input and assistance provided by the project director, Prof. Yue, and the host faculty, Prof. Xiong, played a key role in accomplishing the aforementioned tasks.
Furthermore, the measured data vectors must be post-processed to derive the kinematic information required to design an efficient bioinspired wing articulation mechanism mimicking an airborne gliding locust.

Road Pavement Condition Detection from Acoustic Signals: A Deep Learning Approach

Siavash Bahrami is undertaking his PhD at project partner Universiti Putra Malaysia (UPM). Siavash is currently on secondment to the University of Lincoln as an Early Stage Researcher. He is supporting the following work packages and tasks of the ULTRACEPT project, which is funded by the European Union’s Horizon 2020 research and innovation programme under Marie Sklodowska-Curie grant agreement 778062:

  • WP2 – Brain-inspired vision systems for long-range hazard perception

Task 2.3: To develop long range hazard perception methods coping with low light conditions

Siavash Bahrami standing by car

During my secondment at the University of Lincoln, I am working on the development of a neural network model utilizing acoustic signals. The research investigates whether sound data can be used to estimate road wetness levels, supporting the development of long-range hazard perception methods that cope with low light conditions.

Attaching microphone to vehicle

My preliminary experiment was performed on the dataset of tyre recordings released with “Detecting Road Surface Wetness from Audio with Recurrent Neural Networks”, in order to evaluate the proposed CNN architectures. Mel-frequency cepstral coefficients (MFCCs) were used as acoustic features to classify wet and dry roads, and this preliminary investigation achieved 98.7% accuracy.
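To make the feature extraction step concrete, below is a minimal pure-Python sketch of how MFCCs are typically computed from a single audio frame (Hamming window → power spectrum → triangular mel filterbank → log energies → DCT-II). It illustrates the standard pipeline only, not the code or parameters used in this study; a real system would use an FFT library rather than the naive O(N²) DFT shown here.

```python
import cmath
import math

def power_spectrum(frame):
    """Hamming-windowed frame -> one-sided power spectrum (naive DFT)."""
    n = len(frame)
    windowed = [x * (0.54 - 0.46 * math.cos(2 * math.pi * i / (n - 1)))
                for i, x in enumerate(frame)]
    return [abs(sum(w * cmath.exp(-2j * math.pi * k * i / n)
                    for i, w in enumerate(windowed))) ** 2
            for k in range(n // 2 + 1)]

def hz_to_mel(f): return 2595.0 * math.log10(1.0 + f / 700.0)
def mel_to_hz(m): return 700.0 * (10.0 ** (m / 2595.0) - 1.0)

def mel_filterbank(n_filters, n_bins, sample_rate):
    """Triangular filters with centres evenly spaced on the mel scale."""
    edges = [mel_to_hz(i * hz_to_mel(sample_rate / 2) / (n_filters + 1))
             for i in range(n_filters + 2)]
    centres = [int(round(f / (sample_rate / 2) * (n_bins - 1))) for f in edges]
    bank = []
    for j in range(1, n_filters + 1):
        lo, mid, hi = centres[j - 1], centres[j], centres[j + 1]
        filt = [0.0] * n_bins
        for b in range(lo, hi + 1):
            if b < mid:
                filt[b] = (b - lo) / max(mid - lo, 1)
            else:
                filt[b] = (hi - b) / max(hi - mid, 1)
        bank.append(filt)
    return bank

def mfcc(frame, sample_rate, n_filters=26, n_coeffs=13):
    spec = power_spectrum(frame)
    bank = mel_filterbank(n_filters, len(spec), sample_rate)
    log_energies = [math.log(sum(f * s for f, s in zip(filt, spec)) + 1e-10)
                    for filt in bank]
    # DCT-II decorrelates the log filterbank energies
    return [sum(e * math.cos(math.pi * c * (m + 0.5) / n_filters)
                for m, e in enumerate(log_energies))
            for c in range(n_coeffs)]

# e.g. a 32 ms frame of a synthetic 3 kHz tone sampled at 16 kHz
sr = 16000
frame = [math.sin(2 * math.pi * 3000 * t / sr) for t in range(512)]
coeffs = mfcc(frame, sr)
```

In a classification setting, these per-frame coefficient vectors (often stacked over time) form the input features to the CNN, which learns to separate the spectral signatures of wet and dry tyre noise.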

Microphone recording tyre sound
Microphone in vehicle recording road sound

In the next phase of my study I will develop a larger dataset; recording equipment has been purchased and data collection has begun on several roads in Lincoln.

I am also looking forward to my next Ultracept secondment with the project partner Visomorphic Technology Ltd.

This Marie Sklodowska-Curie secondment has given me access to the facilities and recording equipment needed to compile the dataset for both my PhD studies and the Ultracept project. In addition, the weekly meetings with other members of the project have given me the opportunity to discuss and broaden my knowledge of various related research fields.

Joint Tsinghua University and the University of Lincoln Team Wins Robotics Championship

Jiannan Zhao with his award at the Robotics Championship 2019

In exciting news for Tsinghua University and the University of Lincoln, their joint robotics team has taken first prize in the first International Competition for Autonomous Running Intelligent Robots. The team consisted of five students, four from Tsinghua and one from Lincoln, who were awarded a total prize of $10,000.

The student from the University of Lincoln, Jiannan Zhao, supports our Ultracept project and is currently carrying out a secondment at Guangzhou University.

Tsinghua University is one of the partners involved with our Ultracept project.

The event was held in Beijing, with 33 teams competing, representing over 20 countries and regions.

The competition was judged on the performance of the robot across three segments:

  • Standard
  • Freestyle
  • Performance

During the competition, the robots had to rely solely on their “brains”, as no operator assistance was allowed. Each robot had to autonomously analyse the track and decide how best to cross barriers and handle variable track conditions.

Watch the robots in action!

See the China Daily website for a video of the event. Watch at the 53 second mark to see the winning robot in action.

Below is a video of the event from CCTV English.