Surveillance and Inspection

This use case presents the developments of the COMP4DRONES project in the field of automatic and autonomous inspection in harsh environments. It involves enhanced sensory capabilities and novel control strategies augmented by real-time data-analytics algorithms.

The main application areas for such drones are industrial inspections and rescue operations in disaster situations. The common denominator of both cases is that they take place in areas that are very difficult to reach. Moreover, the novel strategies for perception, planning, and control require efficient computational approaches that cope with the limited computational resources available on board.

In this use case two demonstrators will be implemented, each addressing the specific boundary conditions of its application area.

The first demonstrator deals with the inspection of the structure of offshore wind turbines. This is realised by means of hyperspectral technology carried by manually flown drones with safety features. The focus here is on the addition of a new sensor (a hyperspectral camera) to improve the quality of observations. In terms of flying, the specific challenge addressed is achieving a safe flight while avoiding environmental effects, such as windmill effects. The targeted innovations are: a hyperspectral camera combined with a drone, data analytics on the images to detect imperfections such as corrosion or deterioration of paint (in real time or offline), and robust collision avoidance in harsh environments (strong wind, keeping a safe distance from the jacket). This requires novel algorithms that can deal with varying (outdoor) lighting conditions.
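To illustrate the kind of image analytics involved, the following is a minimal sketch of flagging corrosion-like pixels in a hyperspectral cube with a simple normalised-difference spectral index. The band indices, the threshold, and the function name are hypothetical; real band choices depend on the camera's wavelength calibration, and a deployed pipeline would also have to compensate for the varying outdoor lighting mentioned above.

```python
import numpy as np

def corrosion_mask(cube, band_red, band_nir, threshold=0.2):
    """Flag pixels whose spectral signature suggests corrosion.

    cube: (H, W, B) hyperspectral reflectance cube.
    band_red / band_nir: band indices (hypothetical choices here).
    """
    red = cube[:, :, band_red].astype(float)
    nir = cube[:, :, band_nir].astype(float)
    # Normalised difference index: rust reflects relatively more in
    # the red region than intact paint, so a high index is suspect.
    index = (red - nir) / (red + nir + 1e-9)
    return index > threshold

# Synthetic 4x4 cube with 10 bands: one "corroded" corner pixel.
cube = np.full((4, 4, 10), 0.5)
cube[0, 0, 3] = 0.9   # strong red reflectance
cube[0, 0, 7] = 0.1   # weak near-infrared reflectance
mask = corrosion_mask(cube, band_red=3, band_nir=7)
print(int(mask.sum()))  # → 1 flagged pixel
```

Thresholding a spectral index is the simplest possible detector; the project's real-time variant would more plausibly use a trained classifier, but the index captures why hyperspectral data helps: corrosion and paint differ in bands an RGB camera cannot separate.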

The second demonstrator consists of a fleet of robots navigating and mapping an unknown environment. Imagine a large, partially damaged building in a disaster area: a shopping mall, a hospital, or an industrial complex. Before rescue workers enter the building, a fleet of drones has mapped the area, monitored hazardous gases, found safe passageways, and identified human victims to be rescued, providing the rescue workers with indispensable information. A challenge here is that in this indoor environment GPS is unavailable or unreliable. The fleet consists of small, lightweight drones; larger drones with more processing power; and wheeled rovers. The focus here is on multi-drone collaboration in a GPS-denied environment, where collaborating drones create a common model of the environment, including automatic detection of points of interest. The multi-drone collaboration includes collaboration between drones with various processing constraints. Because this collaboration relies on the exchange of data via communication links, special attention is paid to robust communications.
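The core of building a common model from several drones can be sketched as fusing per-drone occupancy grids. The sketch below assumes the local grids have already been registered into a shared frame (in practice this registration, e.g. via inter-drone loop closures, is the hard part in a GPS-denied setting); the function name and grid values are illustrative, not the project's actual data structures. Summing log-odds is the standard way to combine independent occupancy evidence.

```python
import numpy as np

def merge_grids(local_grids):
    """Fuse per-drone occupancy grids (log-odds) into a shared map.

    Each grid is assumed already registered into a common frame.
    Summing log-odds combines independent evidence: cells several
    drones agree on become more certain in the merged map.
    """
    merged = np.zeros_like(local_grids[0], dtype=float)
    for grid in local_grids:
        merged += grid
    return 1.0 / (1.0 + np.exp(-merged))  # back to probabilities

# Two drones observe overlapping 3x3 patches of a building.
a = np.array([[2.0, 0.0, 0.0],
              [0.0, 0.0, 0.0],
              [0.0, 0.0, -2.0]])  # drone A: obstacle top-left, free cell bottom-right
b = np.array([[2.0, 0.0, 0.0],
              [0.0, 0.0, 0.0],
              [0.0, 0.0, 0.0]])   # drone B independently confirms the obstacle
prob = merge_grids([a, b])
print(round(prob[0, 0], 2))  # → 0.98: agreement strengthens the estimate
```

This also shows why the heterogeneous fleet matters: lightweight drones can ship small log-odds patches over the communication links, while the larger drones or rovers carry out the merging, matching each platform's processing constraints.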