
Guangdong ‘Zhongchuang Cup’ Entrepreneurship and Innovation Competition

The 2019 Guangdong “Zhongchuang Cup” Entrepreneurship and Innovation Competition was held from 18th to 19th September 2019 in Jiangmen, China.

The aim of this postdoctoral innovation competition is to transform promising scientific research into business by developing doctoral talent. There were close to 400 attendees at the event, including representatives of the Ministry of Human Resources and Social Security, the Guangdong Provincial Department of Human Resources and Social Security, relevant leaders of Jiangmen City, expert judges, and postdoctoral talents.

Cheng Hu from the University of Lincoln, who is currently on secondment at Guangzhou University, competed at the event where he showcased the Colias robot platform.

Cheng Hu at the Guangdong ‘Zhongchuang Cup’ Entrepreneurship and Innovation Competition

The postdoctoral innovation competition attracted 476 projects from six strategic emerging industries: biomedicine and general health; electronic information; new energy, energy conservation and environmental protection; new materials; the Internet and mobile Internet; and advanced manufacturing.

Out of the initial 476 projects, 328 entered the preliminary round. Of these, 60 outstanding projects, including Cheng’s, advanced to the semi-finals. As a semi-finalist, Cheng was invited to participate in innovation counseling training delivered by the National Postdoctoral Innovation (Jiangmen) Demonstration Center and Guangzhou Leading Human Resources Development Co. Ltd. The training included “Business Plan Writing” and “Business Project Roadshow Training”.

Award winners at the Guangdong ‘Zhongchuang Cup’ Entrepreneurship and Innovation Competition

From these 60 semi-finalists, 24 elite projects reached the finals. Cheng achieved an outstanding result, receiving the “winning award” after ranking 12th of 30 in the semi-final and 9th of 12 in the final.

Cheng Hu receiving award at the Guangdong ‘Zhongchuang Cup’ Entrepreneurship and Innovation Competition

Guangdong (Foshan) Doctoral and Postdoctoral Talent Exchange and Technology Project Matchmaking Conference

Whilst on secondment at ULTRACEPT partner Guangzhou University, Xuelong Sun and Dr Cheng Hu from the University of Lincoln attended the Guangdong (Foshan) Doctoral and Postdoctoral Talent Exchange and Technology Project Matchmaking Conference.

Each city of Guangdong province is provided a space in which to display its best innovation and entrepreneurship examples of 2019. As one of the city-governed universities, Guangzhou University was asked to select its most attractive and novel research to showcase at the event. Xuelong and Cheng were recommended by Guangzhou University to give a demonstration of their ColCOSΦ system and Colias robot platform.

The event was hosted by the Guangdong Provincial Department of Human Resources and Social Security and the Foshan Municipal People’s Government. It was held on November 18, 2019 at the Tanzhou International Convention and Exhibition Center in Foshan, China, in the Guangdong-Hong Kong-Macao Greater Bay Area. Nearly 1,200 people from all over the world gathered in Foshan to participate. The opening ceremony was chaired by Qiao Yu, deputy mayor of the Foshan Municipal Government.

Xuelong Sun and Cheng Hu at the Guangdong Doctoral and Postdoctoral Talent Exchange and Technology Project Matchmaking Conference
Xuelong Sun showcasing and demonstrating the ColCOSΦ system and Colias robot platform at the Guangdong Doctoral and Postdoctoral Talent Exchange and Technology Project Matchmaking Conference

The conference lasted two days, with the theme of “bringing talents from all over the world and creating a talent bay area” and the key aim of “promoting exchange and cooperation of post-doctoral talents and serving the transformation of post-doctoral scientific and technological achievements”.

At the event, Xuelong and Cheng gave a demonstration of their ColCOSΦ system and the Colias robot platform to the esteemed delegates and to the wider robotics and AI community. They also exchanged knowledge, ideas and interests concerning robotics and AI with researchers from related fields. Xuelong added, “It was a great chance to let more people know about our platforms and research supported by the EU H2020 project”.

Xuelong Sun presenting his exhibit at the Guangdong Doctoral and Postdoctoral Talent Exchange and Technology Project Matchmaking Conference

ULTRACEPT researchers invited to speak at International Symposium on Crossmodal Learning in Humans and Robots November 2019

The International Symposium on Crossmodal Learning in Humans and Robots was held at the Universität Hamburg in Hamburg, Germany from 27 to 29 November 2019. You can access the symposium agenda here.

The Symposium included invited talks, short updates and research highlights from the CML project’s research, lab visits at the Computer Science campus, and a poster presentation with summaries from the first funding period (2016-2019). The organisers also presented the research outlook for the second funding period (2020-2023), recently approved by the DFG.

This event included invited talks from our ULTRACEPT Beneficiaries:

Wednesday, November 27, 2019

16:30-17:30 Dealing with Motion in the Dynamic World — from Insects’ Vision to Neuromorphic Sensors

  • Shigang Yue, University of Lincoln

Thursday, November 28, 2019

09:00-09:15 Transregional Collaboration Research on Crossmodal Learning in Artificial and Natural Cognitive Systems

  • Jianwei Zhang, Universität Hamburg

Friday, November 29, 2019

15:25-15:55 Torque and Visual Controlled Robot Dexterous Manipulations

  • Zhaopeng Chen, DLR/Agile Robots


Tian Liu presents at IEEE 4th International Conference on Advanced Robotics and Mechatronics (ICARM) July 2019

The IEEE 4th International Conference on Advanced Robotics and Mechatronics (ICARM) was held at the Toyonaka Campus of Osaka University, Osaka, Japan from the 3rd to 5th July 2019. You can access the conference website here.

The 2019 conference was collaboratively organized by robotics researchers from Osaka University, The University of Tokyo, Nara Institute of Science and Technology, and Ritsumeikan University, Japan. The conference provided an international forum for researchers, educators, and engineers in the general areas of mechatronics, robotics, automation and sensors to disseminate their latest research results and exchange views on future research directions in these fields.

Tian Liu presenting at the IEEE 4th International Conference on Advanced Robotics and Mechatronics (ICARM) July 2019

This conference was attended by ULTRACEPT researcher Tian Liu from the University of Lincoln. Tian presented the following research:

X. Sun, T. Liu, C. Hu, Q. Fu and S. Yue, “ColCOS Φ: A Multiple Pheromone Communication System for Swarm Robotics and Social Insects Research,” 2019 IEEE 4th International Conference on Advanced Robotics and Mechatronics (ICARM), Toyonaka, Japan, 2019, pp. 59-66, doi: 10.1109/ICARM.2019.8833989.

Abstract: In the last few decades we have witnessed how the pheromones of social insects have become a rich source of inspiration for swarm robotics. By utilising virtual pheromones in physical swarm robot systems to coordinate individuals and realise direct/indirect inter-robot communication like social insects, stigmergic behaviour has emerged. However, many studies take only a single pheromone into account when solving swarm problems, which is not the case in real insects. In the real social insect world, diverse behaviours, complex collective performances and flexible transitions from one state to another are guided by different kinds of pheromones and their interactions. Therefore, whether multiple-pheromone-based strategies can inspire swarm robotics research, and inversely how the performance of swarm robots controlled by multiple pheromones can help explain social insects’ behaviours, is an interesting question. Thus, to provide a reliable system for undertaking multiple-pheromone studies, in this paper we propose and realise a multiple pheromone communication system called ColCOSPhi. This system consists of a virtual pheromone sub-system, wherein the multiple pheromones are represented by a colour image displayed on a screen, and the Colias IV micro-robot platform designed for swarm robotics applications. Two case studies were undertaken to verify the effectiveness of this system: one on multiple-pheromone-based ant foraging, and another on the interactions of aggregation and alarm pheromones. The experimental results demonstrate the feasibility of ColCOSPhi and its great potential in directing swarm robotics and social insects research.
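
The central mechanism here, pheromone fields rendered as colour channels of a screen image that robots deposit into and sense from, can be illustrated with a short sketch. This is a hypothetical reconstruction, not the authors' code: the grid size, evaporation and diffusion rates, and channel assignments are all assumed for illustration.

```python
import numpy as np

# Minimal sketch of a multi-pheromone field rendered as an RGB image.
# Each colour channel holds one pheromone; values decay (evaporation)
# and blur (diffusion) every step. All parameters are illustrative.

H, W = 480, 640            # assumed resolution of the arena display
EVAPORATION = 0.99         # per-step retention factor (assumed)
DIFFUSION = 0.1            # fraction shared with 4-neighbours (assumed)

field = np.zeros((H, W, 3), dtype=np.float32)  # e.g. trail, alarm, aggregation

def deposit(x, y, channel, amount=1.0):
    """A robot deposits pheromone at its current grid cell."""
    field[y, x, channel] = min(field[y, x, channel] + amount, 1.0)

def step():
    """Evaporate and diffuse all pheromone channels once."""
    global field
    # simple 4-neighbour diffusion via shifted copies of the field
    neighbours = (np.roll(field, 1, 0) + np.roll(field, -1, 0) +
                  np.roll(field, 1, 1) + np.roll(field, -1, 1)) / 4.0
    field = EVAPORATION * ((1 - DIFFUSION) * field + DIFFUSION * neighbours)

def sense(x, y, channel):
    """A robot reads the local pheromone intensity beneath it."""
    return field[y, x, channel]

# Example: one robot lays a trail; after a step another robot reads it.
deposit(320, 240, channel=0)
step()
print(sense(320, 240, 0))   # decayed, diffused trail intensity
```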

Tian Liu presenting at the IEEE 4th International Conference on Advanced Robotics and Mechatronics (ICARM) July 2019

Dr Shyamala Doraisamy Featured in the EURAXESS ASEAN Newsletter

Shyamala Doraisamy is an Associate Professor at the Department of Multimedia, Faculty of Computer Science and Information Technology, University Putra Malaysia (UPM). UPM is a partner in the ULTRACEPT project and Dr. Doraisamy is UPM’s Partner Lead.

Dr. Doraisamy featured in the final EURAXESS ASEAN Newsletter for 2019. The article explained how UPM became involved in the ULTRACEPT project and what their role has been in the consortium.

Dr. Doraisamy received her PhD from Imperial College London in 2004, specializing in the field of Music Information Retrieval, and won an award for her music and computing innovation at the Invention and New Product Exposition (INPEX), Pittsburgh, USA in 2007. Her research interests include Multimedia Information Processing, focusing in particular on sound analysis, and she has completed several projects on music and health applications. She has been an invited speaker at various conferences and research meetings internationally.

Dr. Doraisamy is an active member of the Malaysian Information Retrieval and Knowledge Management Society and was the Chair of the 2018 IEEE International Conference on Information Retrieval and Knowledge Management (CAMP’18).

During 2019, Dr. Doraisamy has been on secondment at the University of Lincoln (UoL) along with Early Stage Researcher (ESR) Siavash Bahrami and Experienced Researcher (ER) Azreen Azman.

Shyamala with UPM researchers Azreen Azman (L) and Siavash Bahrami (R)

UPM has been working on the road safety theme of the project. The tasks assigned to UPM were mainly based on work packages WP2, WP3 and WP4. The team has focused in particular on contributions to Task 2.3 in WP2 – ‘To develop long range hazard perception methods coping with low light conditions’.

Dr. Doraisamy’s secondment has included initial meetings with partners and the completion of proposal discussions for collaborative PhD research with Siavash Bahrami. The tasks completed have been based on this collaborative PhD research, which is co-supervised by UPM and UoL. The research involves investigating the use of sound data for estimating road wetness levels, to support the development of long-range hazard perception methods coping with low light conditions. You can read more about Siavash’s research here.

The team will continue to utilise audiovisual technologies towards the development of Brain-inspired vision systems for long-range hazard perception (WP2).

The article in the EURAXESS ASEAN Newsletter also highlights how participation in an MSCA-RISE project can be beneficial for Malaysian research groups, and Dr. Doraisamy provides advice on getting involved in future RISE consortia.


International Joint Conference on Neural Networks (IJCNN) July 2019

The 2019 International Joint Conference on Neural Networks (IJCNN) was held at the InterContinental Budapest Hotel in Budapest, Hungary from 14 to 19 July 2019. The full program with abstracts can be found here.

This conference was attended by ULTRACEPT researchers from the University of Lincoln, Huatian Wang and Hongxin Wang.

Neural Models of Perception, Cognition and Action

Tuesday, July 16, 5:30PM-7:30PM

Hongxin Wang presented the following:

Visual Cue Integration for Small Target Motion Detection in Natural Cluttered Backgrounds [#19188]

Hongxin Wang, Jigen Peng, Qinbing Fu, Huatian Wang and Shigang Yue, University of Lincoln, United Kingdom; Guangzhou University, China.  

The robust detection of small targets against cluttered backgrounds is important for future artificial visual systems in searching and tracking applications. The insects’ visual systems have demonstrated an excellent ability to avoid predators, find prey or identify conspecifics – which always appear as small dim speckles in the visual field. Building a computational model of the insects’ visual pathways could provide effective solutions for detecting small moving targets. Although a few visual system models have been proposed, they make use only of small-field visual features for motion detection, and their detection results often contain a number of false positives. To address this issue, we develop a new visual system model for small target motion detection against cluttered moving backgrounds. In contrast to the existing models, small-field and wide-field visual features are separately extracted by two motion-sensitive neurons to detect small target motion and background motion. These two types of motion information are further integrated to filter out false positives. Extensive experiments showed that the proposed model can outperform the existing models in terms of detection rates.
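
The integration step the abstract describes, letting wide-field (background) motion suppress small-field responses to cut false positives, can be sketched as follows. The divisive-suppression form, the random response maps and the threshold are assumptions for illustration, not the published model.

```python
import numpy as np

# Illustrative sketch: combine a small-field motion response with a
# wide-field (background) motion response so that candidate small
# targets are suppressed wherever background motion is strong.

def integrate_cues(small_field, wide_field, beta=2.0, threshold=0.5):
    """small_field, wide_field: 2-D response maps in [0, 1].
    Returns a binary detection map with background-motion false
    positives attenuated. beta and threshold are illustrative."""
    # Divisive suppression: strong wide-field motion lowers the
    # effective small-target response at that location.
    integrated = small_field / (1.0 + beta * wide_field)
    return integrated > threshold

rng = np.random.default_rng(0)
small = rng.random((64, 64))      # stand-in for the small-field neuron output
wide = rng.random((64, 64))       # stand-in for the background-motion output
detections = integrate_cues(small, wide)
print(detections.sum(), "candidate target pixels after integration")
```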

Hongxin Wang presenting ‘Visual Cue Integration for Small Target Motion Detection in Natural Cluttered Backgrounds’ at the International Joint Conference on Neural Networks (IJCNN) July 2019
Plenary Poster Session POS2: Poster Session 2

Thursday, July 18, 10:00AM-11:40AM

Huatian Wang presented the following:

P333 Angular Velocity Estimation of Image Motion Mimicking the Honeybee Tunnel Centring Behaviour [#19326]

Huatian Wang, Qinbing Fu, Hongxin Wang, Jigen Peng, Paul Baxter, Cheng Hu and Shigang Yue, University of Lincoln, United Kingdom; Guangzhou University, China

Insects use visual information to estimate the angular velocity of retinal image motion, which determines a variety of flight behaviours including speed regulation, tunnel centring and visual navigation. For angular velocity estimation, honeybees show large spatial independence of the visual stimuli, whereas previous models have not achieved such an ability. To address this issue, we propose a bio-plausible model for estimating image motion velocity based on behavioural experiments of honeybees flying through patterned tunnels. The proposed model contains mainly three parts: the texture estimation layer for spatial information extraction, the delay-and-correlate layer for temporal information extraction, and the decoding layer for angular velocity estimation. This model produces responses that are largely independent of the spatial frequency in grating experiments, and it has been implemented in a virtual bee for tunnel centring simulations. The results coincide with both electrophysiological neuron spike and behavioural path recordings, which indicates that our proposed method provides a better explanation of the honeybee’s image motion detection mechanism guiding the tunnel centring behaviour.
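
The delay-and-correlate stage named above belongs to the family of Hassenstein-Reichardt correlators; a minimal sketch of that temporal operation is given below. The grating parameters, receptor spacing and delay are illustrative values, not those of the paper.

```python
import numpy as np

# Minimal delay-and-correlate (Hassenstein-Reichardt style) sketch:
# two neighbouring photoreceptors view a drifting sinusoidal grating;
# correlating one signal with a delayed copy of the other yields a
# direction-selective motion response.

dt = 1e-3                    # simulation step (s)
delay = 0.02                 # correlator delay (s), assumed
spatial_freq = 0.1           # grating: cycles per degree, assumed
angular_velocity = 100.0     # degrees per second (ground truth)
t = np.arange(0, 1.0, dt)

def photoreceptor(angle_deg):
    """Luminance seen at a fixed viewing angle as the grating drifts."""
    return np.sin(2 * np.pi * spatial_freq *
                  (angle_deg - angular_velocity * t))

left, right = photoreceptor(0.0), photoreceptor(2.0)   # receptors 2 deg apart
shift = int(delay / dt)
delayed = lambda s: np.concatenate([np.zeros(shift), s[:-shift]])

# Opponent correlation: positive for motion from left to right,
# negative for the opposite direction.
response = delayed(left) * right - left * delayed(right)
print("mean direction-selective response:", response.mean())
```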

Huatian Wang presenting his poster ‘Angular Velocity Estimation of Image Motion Mimicking the Honeybee Tunnel Centring Behaviour’ at the International Joint Conference on Neural Networks (IJCNN) July 2019

UK Neural Computation July 2019

The 2019 UK Neural Computation event was held at the University of Nottingham, United Kingdom from the 2nd to 3rd July 2019. The full programme can be viewed here.

This event was attended by ULTRACEPT researchers from the University of Lincoln: Hongxin Wang, Jiannan Zhao, Fang Lei, Hao Luan and Xuelong Sun.

As well as attending presentations at the event, the researchers attended a tutorial for early-stage PhD students, where they learnt a great deal about conducting research.

Hongxin states, “I attended the tutorial, communicated with researchers working in relevant fields such as computational neuroscience, and acquired new ideas for further improving the robustness of the STMD models and for simulating feedback mechanisms in the STMD neural pathways.”

Jiannan said that he participated in the meeting discussions, covering topics of interest in the neural computing field.

Modelling the optimal integration of navigational strategies in the insect brain

Xuelong Sun presented a poster at this event. You can view Xuelong’s poster here.

Sun X, Mangan M, Yue S 

Insects are expert navigators, capable of searching out sparse food resources over large ranges in complex habitats before relocating their often hidden nesting sites. These feats are all the more impressive given the limited sensing and processing available to individual animals. Recently, significant advances have been made in identifying the brain areas driving specific navigational behaviours, and their functioning, but an overarching computational model remains elusive. In this study, we present the first biologically constrained computational model that integrates visual homing, visual compass and path integration behaviours. Specifically, we demonstrate the challenges faced when attempting to replicate visual navigation behaviours (visual compass and visual homing) using the known mushroom body (MB) anatomy, and instead propose that the central complex (CX) neuropil may compute the visual compass. We propose that the role of the mushroom body is to modulate the weighting of the path integration and visual guidance systems depending on the current context (e.g. in a familiar or unfamiliar visual surrounding). Finally, we demonstrate that optimal integration of directional cues can be achieved using a biologically realistic ring attractor network.
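
The final claim, optimal integration of directional cues by a ring attractor, can be sketched by encoding each cue as an activity bump whose amplitude reflects its certainty; the population-vector readout of the summed activity gives the reliability-weighted heading that a ring attractor settles to. All values below are illustrative assumptions, not the poster's parameters.

```python
import numpy as np

# Sketch of directional-cue integration on a ring of heading neurons.
# Each cue (e.g. path integration vs. visual homing) is a cosine bump
# scaled by its certainty; summing bumps and reading out the peak
# recovers the reliability-weighted mean direction.

n = 64                                   # neurons around the ring (assumed)
prefs = np.linspace(0, 2 * np.pi, n, endpoint=False)

def bump(direction, gain):
    """Half-wave rectified cosine bump centred on `direction`."""
    return gain * np.cos(prefs - direction).clip(min=0)

path_integration = bump(np.deg2rad(30), gain=1.0)   # confident PI cue
visual_homing = bump(np.deg2rad(90), gain=0.5)      # weaker visual cue

activity = path_integration + visual_homing
# Population-vector readout of the integrated heading.
heading = np.arctan2((activity * np.sin(prefs)).sum(),
                     (activity * np.cos(prefs)).sum())
print("integrated heading (deg):", np.rad2deg(heading))  # between 30 and 90, nearer 30
```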

Xuelong Sun at the UK Neural Computation 2019 in Nottingham ‘Modelling the optimal integration of navigational strategies in the insect brain’



ULTRACEPT Researchers Present at 15th International Conference on Artificial Intelligence Applications and Innovations (AIAI) May 2019

The 15th International Conference on Artificial Intelligence Applications and Innovations (AIAI) was held from the 24th to 26th May 2019 in Crete, Greece at the Knossos Royal Beach Resort. The detailed program can be found here.

The conference was attended by the ULTRACEPT researchers Hongxin Wang and Jiannan Zhao from the University of Lincoln and Xingzao Ma from Lingnan Normal University.

Jiannan Zhao presenting ‘An LGMD Based Competitive Collision Avoidance Strategy for UAV’ at the 15th International Conference on Artificial Intelligence Applications and Innovations (AIAI) May 2019
AIAI Session 2: (AUV-LE) Autonomous Vehicles-Learning

Jiannan Zhao presented at this conference.

Room B: 10:30- 11:45

Jiannan Zhao, Xingzao Ma, Qinbing Fu, Cheng Hu, Shigang Yue

An LGMD Based Competitive Collision Avoidance Strategy for UAV

Abstract. Building a reliable and efficient collision avoidance system for unmanned aerial vehicles (UAVs) is still a challenging problem. This research takes inspiration from locusts, which can fly in dense swarms for hundreds of miles without collision. In the locust’s brain, a visual pathway of LGMD-DCMD (lobula giant movement detector and descending contra-lateral motion detector) has been identified as a collision perception system guiding fast collision avoidance for locusts, which is ideal for designing artificial vision systems. However, there are very few works investigating its potential in real-world UAV applications. In this paper, we present an LGMD based competitive collision avoidance method for UAV indoor navigation. Compared to previous works, we divided the UAV’s field of view into four subfields, each handled by an LGMD neuron. Therefore, four individual competitive LGMDs (C-LGMD) compete to guide the directional collision avoidance of the UAV. With more degrees of freedom compared to ground robots and vehicles, the UAV can escape from collision along four cardinal directions (e.g. an object approaching from the left side triggers a rightward shift of the UAV). Our proposed method has been validated by both simulations and real-time quadcopter arena experiments.
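
The competitive scheme can be sketched as follows: score each of four subfields with an LGMD-like excitation measure and let the winner trigger an escape in the opposite direction. The excitation measure here (summed absolute frame difference) is a deliberate stand-in for the full LGMD model, and the threshold is arbitrary.

```python
import numpy as np

# Illustrative sketch of competitive directional collision avoidance:
# the field of view is split into four subfields, each scored by an
# LGMD-like excitation measure; the most excited subfield dictates an
# escape along the opposite cardinal direction.

ESCAPE = {"left": "right", "right": "left", "up": "down", "down": "up"}

def subfields(frame_diff):
    h, w = frame_diff.shape
    return {
        "left":  frame_diff[:, : w // 2],
        "right": frame_diff[:, w // 2 :],
        "up":    frame_diff[: h // 2, :],
        "down":  frame_diff[h // 2 :, :],
    }

def escape_direction(prev_frame, frame, threshold=1000.0):
    """Return the escape command, or None if no subfield is excited."""
    diff = np.abs(frame.astype(float) - prev_frame.astype(float))
    scores = {name: region.sum() for name, region in subfields(diff).items()}
    winner = max(scores, key=scores.get)      # competition between the LGMDs
    return ESCAPE[winner] if scores[winner] > threshold else None

rng = np.random.default_rng(1)
f0 = rng.integers(0, 255, (120, 160), dtype=np.uint8)
f1 = f0.copy()
f1[:, :40] = 255                              # looming object on the left
print(escape_direction(f0, f1))               # -> "right"
```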

Jiannan Zhao presenting ‘An LGMD Based Competitive Collision Avoidance Strategy for UAV’ at the 15th International Conference on Artificial Intelligence Applications and Innovations (AIAI) May 2019
AIAI Session 10: (AG-MV) Agents-Machine Vision

Hongxin Wang presented at this conference.

Room C: 12:00-13:15

Constant Angular Velocity Regulation for Visually Guided Terrain Following

Huatian Wang, Qinbing Fu, Hongxin Wang, Jigen Peng, Shigang Yue

Insects use visual cues to control their flight behaviours. By estimating the angular velocity of the visual stimuli and regulating it to a constant value, honeybees can perform a terrain following task which keeps a certain height above the undulating ground. To mimic this behaviour in a bio-plausible computational structure, this paper presents a new angular velocity decoding model based on the honeybee’s behavioural experiments. The model consists of three parts: the texture estimation layer for spatial information extraction, the motion detection layer for temporal information extraction, and the decoding layer combining information from previous layers to estimate the angular velocity. Compared to previous methods in this field, the proposed model produces responses largely independent of the spatial frequency and contrast in grating experiments. An angular velocity based control scheme is proposed to implement the model in a bee simulated by the game engine Unity. The perfect terrain following above patterned ground and successful flight over irregular textured terrain show its potential for micro unmanned aerial vehicles’ terrain following.
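
The control idea can be sketched with a toy closed loop: for level flight the ventral image angular velocity scales as forward speed over height, so holding the estimated angular velocity at a set point makes height track the terrain. The gain, set point and the closed-form omega below are assumptions; the paper's model decodes angular velocity from vision rather than from this formula.

```python
# Toy sketch of constant angular velocity regulation for terrain
# following. All parameters are illustrative.

def terrain_following(ground, v=2.0, omega_set=1.0, gain=0.5, dt=0.05):
    """Simulate an agent holding constant ventral angular velocity.
    ground: terrain elevations sampled along the flight path (m).
    Returns the height above local terrain at each step."""
    altitude = ground[0] + v / omega_set        # start at set-point height
    heights = []
    for g in ground:
        h = max(altitude - g, 0.1)              # height above local terrain
        omega = v / h                           # ventral angular velocity (assumed form)
        # Climb when omega is too high (too close to ground), descend when low.
        altitude += gain * (omega - omega_set) * dt
        heights.append(h)
    return heights

bumpy = [0.0] * 40 + [0.5] * 40 + [0.0] * 40    # a step up then down in the terrain
clearances = terrain_following(bumpy)
# Clearance is perturbed by the step but drifts back toward v/omega_set.
print(round(clearances[0], 2), round(clearances[-1], 2))
```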

Development of an Autonomous Flapping Wing Robot Locust – Linloc

Hamid Isakhani received a B.Eng. in Aeronautics from Visvesvaraya Technological University and an MSc by Research in Aerospace from Cranfield University in 2015 and 2017, respectively. He was an intern engineer at the Indian Space Research Organisation and Hindustan Aeronautics Limited in 2012 and 2014, respectively. He has been a PhD scholar at the School of Computer Science, University of Lincoln since 2017, and visited Tsinghua University as a Marie Skłodowska-Curie Fellow from 2018-19. As of July 2019, he is seconded to the Huazhong University of Science and Technology (HUST) to support the progress of the following tasks and work packages of the project Ultracept, funded by the European Union’s Horizon 2020 research and innovation programme under Marie Skłodowska-Curie grant agreement 778062:

  • WP4 – Systems integration, miniaturization, verification and demonstration

Task 4.4: To build a demonstrator system for the collision avoidance

Ultimately, my doctoral project aims to develop a locust-inspired articulated-wing robotic platform capable of performing autonomous flight. This platform shall serve as a demonstrator/testbed to validate the robustness of the vision-based collision avoidance system developed as part of the first three work packages of the projects Step2Dyna and Ultracept. Additionally, completion of this project shall result in the development of a fully bioinspired flapping wing micro aerial vehicle that is the first of its kind in being entirely inspired by an insect’s flight mechanics, aerodynamics, and avionics.

Significant progress was made at our partner institution, The State Key Laboratory of Digital Manufacturing Equipment and Technology, HUST. Rapid prototyping, manufacturing, and motion capture analysis were some of the vital stages of our project facilitated at this institution.

Hamid Isakhani working on additive manufacturing of locust wing prototypes

To study the flapping mechanism and gliding behaviour of swarming locusts, we set up a small locust colony at HUST, where we bred adult locusts carefully selected for physical characteristics indicative of health (strong free-flight ability, good wing condition, etc.).

Small locust colony housing approximately 300 adult Schistocerca gregaria

Specimens are anesthetized in a CO2 chamber for 5 minutes to ease the inlaying of custom-made micro-sized retroreflective markers. Each hemispherical marker weighs less than 0.1±0.05 mg and is 0.5 mm in radius, to facilitate precise tracking and centroid calculation by the cameras.

CO2 Chamber and retroreflector marked locust

Generally, due to the resolution limitations of high-speed cameras, capture volumes are highly cramped. As a result, there is very limited literature on the study of insect free-flight kinematics and swarm behaviour: the flying insects under experimentation are either tethered or flown in confined flight chambers. However, with the help of a well-calibrated infrared Vicon motion capture system (Vicon, Oxford, UK), consisting of three MX-T160 and four MX-T40s cameras arranged to provide a tracking volume of 0.6×0.6×0.6 m, the three-dimensional position and orientation data of a micro swarm of gliding locusts were successfully recorded.

Vicon motion capture system consisting of MX-T160 and MX-T40s cameras

The scientific input and assistance provided by the project director, Prof. Yue, and the host faculty, Prof. Xiong, played a key role in accomplishing the aforementioned tasks.

Furthermore, the measured data vectors must be post-processed to derive the kinematic information required to design an efficient bioinspired wing articulation mechanism that mimics an airborne gliding locust.
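
Such post-processing typically amounts to smoothing the marker trajectories and differentiating them numerically; a minimal sketch under assumed parameters (frame rate, smoothing window, synthetic trajectory) follows.

```python
import numpy as np

# Sketch of the post-processing step: turning raw motion-capture marker
# trajectories (3-D positions per frame) into kinematic quantities by
# smoothing and numerical differentiation. The 100 Hz frame rate and
# the synthetic trajectory are assumptions for illustration.

FPS = 100.0                                   # capture frame rate (assumed)

def kinematics(positions):
    """positions: (n_frames, 3) marker trajectory in metres.
    Returns per-frame velocity (m/s) and speed."""
    # Light moving-average smoothing to suppress marker jitter.
    kernel = np.ones(5) / 5.0
    smoothed = np.column_stack(
        [np.convolve(positions[:, i], kernel, mode="same") for i in range(3)])
    velocity = np.gradient(smoothed, 1.0 / FPS, axis=0)
    speed = np.linalg.norm(velocity, axis=1)
    return velocity, speed

# Synthetic gliding trajectory: forward at 3 m/s with a slow descent.
t = np.arange(0, 1.0, 1.0 / FPS)
track = np.column_stack([3.0 * t, np.zeros_like(t), 0.6 - 0.5 * t])
_, speed = kinematics(track)
# Median is robust to the smoothing artefacts at the clip edges.
print("glide speed ~", round(float(np.median(speed)), 2), "m/s")
```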

Road pavement condition detection from Acoustic signals: A Deep Learning approach

Siavash Bahrami is undertaking his PhD at project partner University Putra Malaysia (UPM). Siavash is currently on secondment at the University of Lincoln as an Early Stage Researcher. He is supporting the following work packages and tasks as part of the Ultracept project, which is funded by the European Union’s Horizon 2020 research and innovation programme under Marie Sklodowska-Curie grant agreement 778062:

  • WP2 – Brain-inspired vision systems for long-range hazard perception

Task 2.3: To develop long range hazard perception methods coping with low light conditions

Siavash Bahrami standing by car

During my secondment at the University of Lincoln, I am working on the development of a neural network model utilizing acoustic signals. The research investigates whether sound data can be used to estimate road wetness levels, supporting the development of long-range hazard perception methods that cope with low light conditions.

Attaching microphone to vehicle

My preliminary experiment was performed on the dataset of tyre recordings released with “Detecting Road Surface Wetness from Audio with Recurrent Neural Networks” in order to evaluate the proposed CNN architectures. MFCCs were used as acoustic signal features to classify wet and dry roads in this preliminary investigation, which resulted in 98.7% accuracy.
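
A pipeline of this general shape, MFCC features from short tyre-sound clips feeding a small CNN for wet/dry classification, could look like the sketch below. The architecture, sample rate, clip length and feature sizes are assumptions for illustration; the study's actual CNN configurations are not described here.

```python
import numpy as np
import librosa
import tensorflow as tf

# Hypothetical sketch: MFCC "images" from tyre-sound clips feeding a
# small CNN binary classifier (wet vs. dry road surface).

SR = 44100          # sample rate (assumed)
N_MFCC = 13         # MFCC coefficients per frame (assumed)

def features(clip):
    """clip: 1-D audio array -> (N_MFCC, frames, 1) MFCC feature map."""
    mfcc = librosa.feature.mfcc(y=clip, sr=SR, n_mfcc=N_MFCC)
    return mfcc[..., np.newaxis]

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, (3, 3), activation="relu",
                           input_shape=(N_MFCC, 87, 1)),  # ~1 s clip at hop 512
    tf.keras.layers.MaxPooling2D((2, 2)),
    tf.keras.layers.Conv2D(32, (3, 3), activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),       # wet vs. dry
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])

# Usage: model.fit(X_train, y_train, ...) on labelled wet/dry MFCC batches.
```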

Microphone recording tyre sound
Microphone in vehicle recording road sound

In the next phase of my study, I am going to develop a larger dataset; recording equipment has been purchased and data collection has just begun on several roads in Lincoln.

I am also looking forward to my next Ultracept secondment with the project partner Visomorphic Technology Ltd.

This Marie Sklodowska-Curie secondment has given me access to the facilities and recording equipment needed to compile the dataset required for both my PhD studies and the Ultracept project. In addition, the weekly meetings with other members of the project have given me the opportunity to discuss and broaden my knowledge of various related research fields.