360° and IR-Camera Drone Flight Test: Superimposition of two data sources for Post-Fire Inspection
(2023)
This video highlights a recent flight test carried out in our cutting-edge robotics lab, unveiling the capabilities of our meticulously crafted thermal and 360° camera drone! We've ingeniously upgraded a DJI Avata with a bespoke thermal and 360° camera system. Compact yet powerful, measuring just 18 x 18 x 17 cm, this drone is strategically engineered to effortlessly navigate and deliver crucial thermal and 360° insights concurrently in post-fire or post-explosion environments.
The integration of a specialized thermal and 360° camera system enables the simultaneous capture of both data sources during a single flight. This groundbreaking approach not only reduces inspection time by half but also facilitates the seamless superimposition of thermal and 360° videos for comprehensive analysis and interpretation.
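The superimposition of the two streams can be illustrated with a minimal sketch. This is not the actual on-board implementation; it assumes both frames are already spatially aligned and share the same resolution, and the colormap and blend weight are illustrative:

```python
import numpy as np

def superimpose(rgb_frame: np.ndarray, thermal_frame: np.ndarray,
                alpha: float = 0.4) -> np.ndarray:
    """Alpha-blend a single-channel thermal frame onto an RGB frame.

    Assumes both frames are already aligned and have the same height/width.
    """
    # Normalize the thermal data to 0..255
    t = thermal_frame.astype(np.float32)
    t = (t - t.min()) / max(t.max() - t.min(), 1e-6) * 255.0
    # Map to a simple false-color image: red = hot, blue = cold
    t_rgb = np.stack([t, np.zeros_like(t), 255.0 - t], axis=-1)

    # Weighted blend of the two sources
    blended = (1.0 - alpha) * rgb_frame.astype(np.float32) + alpha * t_rgb
    return blended.clip(0, 255).astype(np.uint8)
```

In a real pipeline the thermal frame would first be warped into the 360° camera's coordinates before blending; that registration step is omitted here.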
From the 360° images of the previous video ("German rescue robotic center captured...") we now generate the 3D point cloud. The UAV needs 3 minutes to capture the outdoor scenario and the hall from inside and outside. The 3D point cloud generation is 5x slower than the video. It uses a VSLAM algorithm to localize the keyframes (green) and, with 3 keyframes, a 360° PatchMatch algorithm implemented on an NVIDIA graphics card (CUDA) to calculate the dense point clouds. The hall is about 70 x 20 meters.
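The dense reconstruction relies on 360° camera geometry: each equirectangular pixel maps to a viewing ray on the unit sphere, and a depth estimate along that ray yields a 3D point. A minimal sketch of this unprojection, independent of the actual PatchMatch/CUDA implementation (the axis convention and values are illustrative):

```python
import math

def equirect_pixel_to_point(u: int, v: int, width: int, height: int,
                            depth: float) -> tuple:
    """Unproject an equirectangular pixel (u, v) with a depth value to a
    3D point in the camera frame (x forward, z up; convention illustrative)."""
    # Longitude in [-pi, pi), latitude in [-pi/2, pi/2]
    lon = (u / width) * 2.0 * math.pi - math.pi
    lat = math.pi / 2.0 - (v / height) * math.pi
    # Ray direction on the unit sphere, scaled by the estimated depth
    x = depth * math.cos(lat) * math.cos(lon)
    y = depth * math.cos(lat) * math.sin(lon)
    z = depth * math.sin(lat)
    return (x, y, z)
```

Running this over every pixel of a depth map produced by the PatchMatch step would yield one dense point cloud per keyframe.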
The concept of “Internationalisation at Home“ has gained momentum with the increasing digitalization of education and limitations on mobility. Collaborative Online International Learning (COIL) is an innovative, cost-effective instructional method that promotes intercultural learning through online collaboration between faculty and students from different countries or locations. The benefits of using COIL courses have been widely recognized, with learners developing intercultural competencies, digital skills, international education experience, and global awareness.
However, multicultural communication in project environments can be complex and demands awareness of cultural variations. The creation and development of effective cross-cultural collectivism, trust, communication, and empathy in leadership is an important ingredient for the success of remote project collaborations. This is an area that has been least explored in research on communication in virtual teams.
The GIPE projects are mainly carried out as so-called Collaborative Online International Learning (COIL) events. However, to gain a “real world“ experience abroad in an intercultural team, students from all partner universities can participate in the Spring School held for two weeks in Germany, and the German students present and hand over the results in the country of the partner university. The main objective of this research was to examine the experiences of students participating in the GIPE project and to evaluate the effectiveness of the project in enhancing intercultural competencies and fostering collaboration among students from different continents. This paper will also explore the implications of the GIPE project for Education 2.0 considering the COVID-19 pandemic and the future of education delivery and administration transformation.
ARGUS is a tool for the systematic acquisition, documentation, and evaluation of drone flights in rescue operations. In addition to the very fast generation of RGB and IR orthophotos, a trained AI can automatically detect fire, people, and cars in the images captured by the drones. The video gives a short introduction to the Aerial Rescue and Geospatial Utility System (ARGUS).
Check out our GitHub repository at
https://github.com/RoblabWh/argus/
You can find the dataset on Kaggle at
https://www.kaggle.com/datasets/julienmeine/rescue-object-detection
Biomechanical investigations of the opening mechanism of woody fruits of the genus Hakea
(2023)
The species H. sericea and H. salicifolia (Proteaceae) are native to Australia. Their natural habitat is dry and nutrient-poor, and they are regularly exposed to bushfires. Through loss of moisture the fruit “shrinks“ and opens, releasing two seeds. This thesis compares the opening behavior of manipulated fruits, the shrinkage, the opening force, the elastic modulus, and the compressive strength of the two species, and investigates the influence of different tissues on the opening. It is found that the mesocarp is mainly responsible for the anisotropic shrinkage behavior.
This paper presents a pragmatic approach for stepwise introduction of peer assessment elements in undergraduate programming classes, discusses some lessons learned so far and directions for further work. Students are invited to challenge their peers with their own programming exercises to be submitted through Moodle and evaluated by other students according to a predefined rubric and supervised by teaching assistants. Preliminary results show an increased activation and motivation of students leading to a better performance in the final programming exams.
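A minimal sketch of how peer scores against such a predefined rubric could be aggregated per submission. The criterion names, weights, and averaging scheme are illustrative assumptions, and the Moodle submission workflow itself is not shown:

```python
from statistics import mean

# Hypothetical rubric: criterion -> weight (weights sum to 1.0)
RUBRIC = {"correctness": 0.5, "style": 0.2, "tests": 0.3}

def aggregate_peer_scores(reviews: list[dict]) -> float:
    """Average each criterion over all peer reviews, then apply the
    rubric weights to obtain a single score for the submission."""
    return sum(
        weight * mean(review[criterion] for review in reviews)
        for criterion, weight in RUBRIC.items()
    )
```

Averaging per criterion before weighting keeps a single harsh or generous reviewer from dominating any one dimension of the score.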
The dataset is used for 3D environment modeling, i.e. for generating dense 3D point clouds and 3D models with a PatchMatch algorithm and neural networks. Reflections from rain, water, and snow, as well as windows and vehicle surfaces, are difficult for the modeling algorithms. In addition, lighting conditions are constantly changing.
Digital transformation gives municipalities new development opportunities in the field of the smart city. This thesis provides an overview of how IoT, big data, databases, digital twins, and other technologies can enable microclimate analysis and control.
This video features a flight test conducted in our robotics lab, showcasing a custom-built thermal camera drone. We've enhanced a DJI Avata with a specialized thermal camera system. With its compact dimensions measuring 18 x 18 x 17 cm, this drone is designed to navigate and provide critical thermal information within post-fire or post-explosion environments. For more insights, be sure to check out our previous videos on this channel.
Problem:
- How to effectively use aerial robots to support rescue forces?
- How to achieve good flight characteristics and long flight times?
- How to enable simple and intuitive control?
- How to efficiently record image data of the environment?
- How to generate flight and image data for rescue forces?
Implementation:
The flying robot was designed in Autodesk Fusion 360. In order to achieve high stability as well as low weight, the frame was milled from carbon. Mounts, such as those for the GPS and the 360° camera, were 3D printed. A special feature is that the flying robot is not visible in the panoramic view of the 360° camera. The flight controller of the robot was set up using Ardupilot. The communication with the robot is done via MAVLink (UDP). To support different platforms, the software was realized as a web application. The front end was created using HTML, CSS, and JavaScript.
The back end is based on Flask-SocketIO (Python). For the intelligent recognition of motor vehicles, a microcontroller with an integrated camera is used. For the post-processing of flight and video data, a pipeline was implemented for automation.
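The MAVLink-over-UDP link can be illustrated with a minimal standard-library sketch. Real MAVLink framing would use a library such as pymavlink; the JSON payload here is only a placeholder standing in for a MAVLink frame, so this shows the UDP transport pattern rather than the actual protocol:

```python
import json
import socket

def send_command(sock: socket.socket, addr: tuple, command: dict) -> None:
    """Send one JSON-encoded command datagram (placeholder for a MAVLink frame)."""
    sock.sendto(json.dumps(command).encode(), addr)

def receive_message(sock: socket.socket) -> dict:
    """Receive one datagram and decode its JSON payload."""
    data, _ = sock.recvfrom(4096)
    return json.loads(data.decode())
```

UDP is connectionless, which suits telemetry: a lost heartbeat or attitude packet is simply superseded by the next one rather than blocking the link.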
Both in online and in brick-and-mortar retail, numerous innovative immersive applications have already emerged that offer new cognitive and affective possibilities for interaction and information. In the areas of art, real estate, architecture, gaming, fashion, and urban planning and guided tours, more and more AR/VR applications can likewise be found. In this paper, after a review of selected immersive projects, a concept is presented for using AR and VR for vacant premises in a formerly attractive shopping street in Gelsenkirchen.
Gaussian Splatting: 3D Reconstruction of a Chemical Company After a Tank Explosion in Kempen 8/2023
(2023)
The video showcases a 3D model of a chemical company following a tank explosion that occurred on August 17, 2023, in Kempen, computed with the Gaussian splatting algorithm. Captured by a compact mini drone measuring 18 cm x 18 cm and equipped with a 360° camera, these images offer an intricate perspective of the aftermath. The computation took 29 minutes and used 2770 images (~350 equirectangular images). After a comprehensive aerial survey and inspection of the 360° images taken within the facility, authorities confirmed that it was safe for the evacuated residents to return to their homes. See also:
https://www1.wdr.de/fernsehen/aktuelle-stunde/alle-videos/video-grosser-chemieunfall-in-kempen-100.html
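Gaussian splatting pipelines are typically trained on perspective (pinhole) images, which is presumably why ~350 equirectangular frames were expanded into 2770 training views. A minimal sketch of extracting one perspective view from an equirectangular panorama. This is not the actual pipeline; the camera convention, field of view, and nearest-neighbour sampling are illustrative:

```python
import numpy as np

def equirect_to_perspective(pano: np.ndarray, yaw_deg: float,
                            fov_deg: float = 90.0, out_size: int = 64) -> np.ndarray:
    """Extract one perspective view from an equirectangular panorama
    by sampling along the rays of a pinhole camera rotated by yaw_deg."""
    h, w = pano.shape[:2]
    f = (out_size / 2.0) / np.tan(np.radians(fov_deg) / 2.0)  # focal length, pixels
    j, i = np.meshgrid(np.arange(out_size), np.arange(out_size), indexing="xy")
    # Camera-frame ray directions (z forward, x right, y down)
    x = (j - out_size / 2.0) / f
    y = (i - out_size / 2.0) / f
    z = np.ones_like(x)
    # Rotate the rays around the vertical axis by the requested yaw
    yaw = np.radians(yaw_deg)
    xr = x * np.cos(yaw) + z * np.sin(yaw)
    zr = -x * np.sin(yaw) + z * np.cos(yaw)
    # Ray direction -> spherical angles -> equirectangular pixel coordinates
    lon = np.arctan2(xr, zr)                # [-pi, pi]
    lat = np.arctan2(y, np.hypot(xr, zr))   # [-pi/2, pi/2]
    u = ((lon / np.pi + 1.0) / 2.0 * (w - 1)).astype(int)
    v = ((lat / (np.pi / 2) + 1.0) / 2.0 * (h - 1)).astype(int)
    return pano[v.clip(0, h - 1), u.clip(0, w - 1)]
```

Sweeping the yaw (and pitch) over the sphere yields a handful of pinhole views per panorama, which matches the roughly 8:1 ratio of training views to equirectangular frames reported above.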
At the integration sprint of the E-DRZ consortium in March 2023, we improved the information captured by the human spotter (of the fire brigade) by extending them with a 360° drone, i.e. the DJI Avata with an Insta360 on top of it. The UAV needs 3 minutes to capture the outdoor scenario and the hall from inside and outside. The hall is about 70 x 20 meters. Once the drone has landed, we have all information in 360° at 5.7K, as you can see in the video. Furthermore, it is a perfect documentation of the deployment scenario. In the next video we will show how to spatially localize the 360° video and how to generate a 3D point cloud from it.