Visual Cues: Integration of object pose recognition with an augmented reality system as a means to support visual perception in human-robot control

  • Autonomy and self-determination are fundamental aspects of living in our society. Supporting people for whom this freedom is limited by physical impairments is the central goal of this thesis. For people who are paralyzed in particular, even working at a desk job is often not feasible. Therefore, this thesis presents a prototype of a robotic assembly workstation that uses a modern Augmented Reality (AR) head-mounted display (HMD) to control a robotic arm. Through object pose recognition, the objects in the working environment are detected, and this information is used to display various visual cues at the robotic arm or in its vicinity, providing users with additional depth information and helping them determine spatial relations between objects, which are often hard to discern from a fixed perspective. To achieve this, a hands-free AR-based robot-control scheme was developed that uses speech and head movement for interaction. In addition, several advanced visual cues were designed that exploit object pose detection for spatial-visual support. The pose recognition system is adapted from state-of-the-art computer-vision research to allow the detection of arbitrary objects regardless of texture or shape. Two evaluations were performed. First, a small user study, which excluded the object recognition, confirmed the general usability of the system and gave an impression of its performance; participants were able to complete difficult pick-and-place tasks with a high success rate. Second, a technical evaluation of the object recognition system revealed adequate prediction precision, but the system remains too unreliable for real-world scenarios, as prediction quality varies strongly with object orientation and occlusion.

Metadata
Author: Tim Dierks
URN: urn:nbn:de:hbz:1010-opus4-37602
Referees: Jens Gerken, Gregor Lux
Document Type: Master's Thesis
Language: English
Date of Publication (online): 2020/08/11
Year of first Publication: 2020
Publishing Institution: Westfälische Hochschule Gelsenkirchen Bocholt Recklinghausen
Granting Institution: Westfälische Hochschule Gelsenkirchen Bocholt Recklinghausen
Date of final exam: 2020/07/13
Release Date: 2020/08/11
Tags: Augmented Reality; Hands-free Interaction; Human-Robot Interaction; Object Recognition
GND Keyword: Erweiterte Realität <Informatik>
Page numbers: III, 92
Departments / faculties: Computer Science and Communication
Licence: Creative Commons - Attribution (CC BY)
