Problem
- How to effectively use aerial robots to support rescue forces?
- How to achieve good flight characteristics and long flight times?
- How to enable simple and intuitive control?
- How to efficiently record image data of the environment?
- How to generate flight and image data for rescue forces?
Implementation:
The flying robot was designed in Autodesk Fusion 360. In order to achieve high stability as well as low weight, the frame was milled from carbon fiber. Mounts, such as those for the GPS and the 360° camera, were 3D printed. A special feature is that the flying robot is not visible in the panoramic view of the 360° camera. The flight controller of the robot was set up using ArduPilot. Communication with the robot is done via MAVLink (UDP). To support different platforms, the software was realized as a web application. The front end was created using HTML, CSS and JavaScript.
The back end is based on Flask-SocketIO (Python). For the intelligent recognition of motor vehicles, a microcontroller with an integrated camera is used. For the post-processing of flight and video data, a pipeline was implemented for automation.
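As a rough illustration of this architecture, the following is a minimal sketch (not the project's actual code) of a Flask-SocketIO back end that receives MAVLink telemetry over UDP with pymavlink and pushes positions to the browser; the port, event name and field selection are assumptions.

```python
# Minimal sketch (assumption, not the project's back end): receive MAVLink
# telemetry over UDP with pymavlink and forward it to browsers via Flask-SocketIO.
from flask import Flask
from flask_socketio import SocketIO
from pymavlink import mavutil

app = Flask(__name__)
socketio = SocketIO(app)

# Listen on a typical MAVLink UDP port (14550 is an assumption).
mav = mavutil.mavlink_connection("udpin:0.0.0.0:14550")

def telemetry_loop():
    mav.wait_heartbeat()  # wait until the UAV is seen on the link
    while True:
        msg = mav.recv_match(type="GLOBAL_POSITION_INT", blocking=True)
        socketio.emit("position", {            # hypothetical event name
            "lat": msg.lat / 1e7,               # degrees
            "lon": msg.lon / 1e7,               # degrees
            "alt": msg.relative_alt / 1000.0,   # metres above takeoff
        })

if __name__ == "__main__":
    socketio.start_background_task(telemetry_loop)
    socketio.run(app, host="0.0.0.0", port=5000)
```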
The video shows a very high-resolution 3D point cloud of the outdoor area of the German Rescue Robotics Center. For the recording, a 25-second POI flight was performed with a Mavic 3. From the 4K video footage captured during this flight, 77 frames were extracted and localized within 4 minutes using COLMAP, then processed using Neural Radiance Fields (NeRF). The nerfacto model of Nerfstudio was trained on an Nvidia RTX 4090 for 8 minutes. In summary, a high-quality 3D model is available to task forces after about 13 minutes. The calculation is performed locally on site by the RobLW of the DRZ. The video shows a free camera path rendered at 60 Hz (Full HD).
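The frame extraction, COLMAP localization and nerfacto training can be scripted; the following is a minimal sketch using Nerfstudio's command-line tools via subprocess, with hypothetical file names and assumed flag values.

```python
# Sketch of the video-to-NeRF pipeline described above (paths and flag values
# are assumptions): extract and register frames with COLMAP via Nerfstudio's
# ns-process-data, then train the nerfacto model.
import subprocess

# 1) Extract ~77 frames from the 4K POI video and register them with COLMAP.
subprocess.run([
    "ns-process-data", "video",
    "--data", "poi_flight_4k.mp4",        # hypothetical input file
    "--output-dir", "drz_outdoor",
    "--num-frames-target", "77",
], check=True)

# 2) Train nerfacto on the registered images (about 8 min on an RTX 4090).
subprocess.run([
    "ns-train", "nerfacto",
    "--data", "drz_outdoor",
], check=True)
```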
Nerf(acto) for the 3D modeling of the Computer Science building of Westfälische Hochschule GE
(2023)
The video shows a very high-resolution 3D point cloud of the computer science building of the University of Applied Sciences Gelsenkirchen. For the recording, a 3-minute flight with an M30T was performed. The 105 images taken by the wide-angle camera during this flight were localized within 3 minutes using COLMAP and processed using Neural Radiance Fields (NeRF). The nerfacto model of Nerfstudio was trained on an Nvidia RTX 4090 for 8 minutes. Thus, a high-quality 3D model is available after about 15 minutes.
The video shows a free camera path rendered at 60 Hz (Full HD).
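A camera-path render like the one in the video can be produced with Nerfstudio's ns-render tool; this sketch uses assumed paths, and the 60 Hz / Full HD settings are taken from the camera path exported in the viewer.

```python
# Sketch (assumed paths, exact ns-render arguments depend on the Nerfstudio
# version): render the free camera path exported from the viewer as a video.
import subprocess

subprocess.run([
    "ns-render", "camera-path",
    "--load-config", "outputs/informatics_building/nerfacto/config.yml",  # trained model
    "--camera-path-filename", "camera_path_60hz_fullhd.json",             # path from the viewer
    "--output-path", "renders/flythrough.mp4",
], check=True)
```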
From the 360° images of the previous video ("German rescue robotic center captured...") we now generate the 3D point cloud. The UAV needs 3 minutes to capture the outdoor scenario and the hall from inside and outside. The 3D point cloud generation runs about five times slower than the video, i.e. roughly 15 minutes of computation for the 3-minute recording. A VSLAM algorithm localizes the keyframes (green), and from every three keyframes a 360° PatchMatch algorithm implemented on an NVIDIA graphics card (CUDA) computes the dense point clouds. The hall is about 70 x 20 meters.
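The project's 360° PatchMatch runs on the GPU (CUDA); purely as an illustration of the final back-projection step, here is a small NumPy sketch that turns an equirectangular depth map of one keyframe into a dense point cloud (conventions and resolution are assumptions).

```python
# Sketch (not the project's CUDA implementation): back-project an
# equirectangular depth map from a 360° keyframe into a dense 3D point cloud.
import numpy as np

def equirect_depth_to_points(depth):
    """depth: (H, W) array of metric distances, one per 360° pixel."""
    h, w = depth.shape
    # Longitude (-pi..pi) and latitude (-pi/2..pi/2) for every pixel centre.
    lon = (np.arange(w) + 0.5) / w * 2.0 * np.pi - np.pi
    lat = np.pi / 2.0 - (np.arange(h) + 0.5) / h * np.pi
    lon, lat = np.meshgrid(lon, lat)
    # Spherical -> Cartesian ray directions, scaled by the measured depth.
    x = np.cos(lat) * np.sin(lon) * depth
    y = np.sin(lat) * depth
    z = np.cos(lat) * np.cos(lon) * depth
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)

# Example: a dummy 512x1024 depth map yields H*W points in camera coordinates.
points = equirect_depth_to_points(np.full((512, 1024), 5.0))
```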
The video shows the first test of a small spherical UAV (35 cm) with 4 rotors for missions in complex environments such as buildings, caves or tunnels. The spherical design protects the vehicle's internal components and allows the UAV to roll over the ground when the environment allows. The drone can land and take off in any position, can come into contact with objects without endangering the propellers, and can restart even after crashes.
Spherical UAV: Crash Test with 1/2 liter bottle from 2 meters
Gaussian Splatting: 3D Reconstruction of a Chemical Company After a Tank Explosion in Kempen 8/2023
(2023)
The video showcases a 3D model of a chemical company following a tank explosion that occurred on August 17, 2023, in Kempen, computed with the Gaussian Splatting algorithm. Captured by a compact mini drone measuring 18 cm x 18 cm and equipped with a 360° camera, these images offer an intricate perspective of the aftermath. The computation took 29 minutes and used 2770 images (derived from ~350 equirectangular images; a possible cropping step is sketched below). After a comprehensive aerial survey and inspection of the 360° images taken within the facility, authorities confirmed that it was safe for the evacuated residents to return to their homes. See also:
https://www1.wdr.de/fernsehen/aktuelle-stunde/alle-videos/video-grosser-chemieunfall-in-kempen-100.html
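One common way (an assumption here, not a statement about the exact pipeline used) to feed ~350 equirectangular frames into a perspective reconstruction such as Gaussian Splatting is to cut several pinhole views out of each panorama, roughly eight crops per frame; a sketch with OpenCV and NumPy:

```python
# Sketch (assumed geometry): render pinhole views from an equirectangular image
# so a perspective pipeline (COLMAP + Gaussian Splatting) can consume them.
import cv2
import numpy as np

def perspective_crop(equi, yaw_deg, pitch_deg=0.0, fov_deg=90.0, size=800):
    """Render one size x size pinhole view with the given FOV from an equirectangular image."""
    h, w = equi.shape[:2]
    f = 0.5 * size / np.tan(np.radians(fov_deg) / 2.0)
    # Pixel grid of the virtual pinhole camera, expressed as unit rays.
    u, v = np.meshgrid(np.arange(size) - size / 2.0, np.arange(size) - size / 2.0)
    dirs = np.stack([u, v, np.full_like(u, f)], axis=-1)
    dirs /= np.linalg.norm(dirs, axis=-1, keepdims=True)
    # Rotate the rays by yaw (around y) and pitch (around x).
    yaw, pitch = np.radians(yaw_deg), np.radians(pitch_deg)
    Ry = np.array([[np.cos(yaw), 0, np.sin(yaw)], [0, 1, 0], [-np.sin(yaw), 0, np.cos(yaw)]])
    Rx = np.array([[1, 0, 0], [0, np.cos(pitch), -np.sin(pitch)], [0, np.sin(pitch), np.cos(pitch)]])
    dirs = dirs @ (Ry @ Rx).T
    # Ray direction -> equirectangular pixel coordinates.
    lon = np.arctan2(dirs[..., 0], dirs[..., 2])
    lat = np.arcsin(np.clip(dirs[..., 1], -1.0, 1.0))
    map_x = ((lon / np.pi + 1.0) * 0.5 * w).astype(np.float32)
    map_y = ((lat / (np.pi / 2.0) + 1.0) * 0.5 * h).astype(np.float32)
    return cv2.remap(equi, map_x, map_y, cv2.INTER_LINEAR)

# Example: 8 views looking around the horizon from one panorama.
# pano = cv2.imread("pano.jpg"); views = [perspective_crop(pano, yaw) for yaw in range(0, 360, 45)]
```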
The video showcases a 3D model of a chemical company following a tank explosion that occurred on August 17, 2023, in Kempen, computed with Neural Radiance Fields (NeRF). Captured by a compact mini drone measuring 18 cm x 18 cm and equipped with a 360° camera, these images offer an intricate perspective of the aftermath. After a comprehensive aerial survey and inspection of the 360° images taken within the facility, authorities confirmed that it was safe for the evacuated residents to return to their homes. See also:
https://www1.wdr.de/fernsehen/aktuelle-stunde/alle-videos/video-grosser-chemieunfall-in-kempen-100.html
ARGUS is a tool for the systematic acquisition, documentation and evaluation of drone flights in rescue operations. In addition to the very fast generation of RGB and IR orthophotos, a trained AI can automatically detect fire, people and cars in the images captured by the drones (a detection sketch follows below). The video gives a short introduction to the Aerial Rescue and Geospatial Utility System -- ARGUS.
Check out our GitHub repository at
https://github.com/RoblabWh/argus/
You can find the dataset on Kaggle at
https://www.kaggle.com/datasets/julienmeine/rescue-object-detection
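ARGUS ships its own trained detector; purely as an illustration of the detection step, here is a hedged sketch using the ultralytics YOLO API with hypothetical weights and folder names (not the actual ARGUS model or code).

```python
# Sketch only: run an object detector over drone images and collect
# fire/person/vehicle hits. Weights and paths are hypothetical.
from pathlib import Path
from ultralytics import YOLO

model = YOLO("argus_detector.pt")  # hypothetical weights, e.g. trained on the Kaggle dataset

detections = []
for image_path in Path("mission_42/images").glob("*.jpg"):  # hypothetical flight folder
    result = model(image_path)[0]
    for box in result.boxes:
        detections.append({
            "image": image_path.name,
            "class": result.names[int(box.cls)],  # e.g. "fire", "person", "car"
            "confidence": float(box.conf),
            "xyxy": box.xyxy[0].tolist(),         # pixel bounding box
        })
```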