Object-Oriented Modeling and Coordination of Mobile Robots

Nowadays, industrial robots play an important role in automating recurring manufacturing tasks. New trends towards the Smart Factory and Industry 4.0, however, take a more product-driven approach and demand more flexibility from robotic systems. When a varying order of processing steps is required, intra-factory logistics has to cope with new challenges. To achieve this flexibility, mobile robots can be used for transporting goods, or even mobile manipulators, consisting of a mobile platform and a robot arm, for independently grasping work pieces and manipulating them while in motion. Working with mobile robots, however, poses challenges that do not occur for stationary industrial manipulators: First, mobile robots have a greater position inaccuracy and typically work in environments that are not fully structured, requiring them to interpret sensor data and to react more often to events in the environment. Furthermore, independent mobile robots introduce the aspect of distribution. For mobile manipulators, an additional challenge arises from the combination of platform and arm, where not only the platform and the arm but also the sensors have to be coordinated to achieve the desired behavior.

The main contribution of this work is an approach for the object-oriented modeling and coordination of mobile robots that supports the cooperation of mobile manipulators. Within a mobile manipulator, the approach makes it possible to define real-time reactions to sensor data and to synchronize the different actuators and sensors present, allowing sensor-aware combinations of motions for platform and arm. Moreover, the approach facilitates easy programming, provides means to handle kinematic restrictions and redundancy, and supports advanced capabilities such as impedance control to mitigate position uncertainty. When working with multiple independent mobile robots, each robot has different knowledge about its environment, depending on its available sensors. These different views are modeled explicitly, allowing the consistent coordination of robots in applications using the data available on each robot. To cope with geometric uncertainty, sensors are modeled and the relationship between their measurements and geometric aspects is defined. Based on these definitions and incoming sensor data, position estimates are automatically derived. Additionally, the more dynamic environment leads to different possible outcomes of task execution. These outcomes are explicitly modeled and can be used to define reactive behavior. The approach was successfully evaluated in two application examples, ranging from physical interaction between two mobile manipulators handing over a work piece to gesture control of a quadcopter for carrying goods.
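To illustrate the idea of explicitly modeled task outcomes driving reactive behavior, the following is a minimal, hypothetical sketch; all class and method names are illustrative assumptions and are not taken from the thesis or its framework.

```python
from enum import Enum


class Outcome(Enum):
    """Explicitly modeled outcomes of a motion task."""
    COMPLETED = "completed"
    OBSTACLE_DETECTED = "obstacle_detected"


class DistanceSensor:
    """Illustrative stand-in for a range sensor on the platform."""
    def __init__(self, reading=float("inf")):
        self.reading = reading

    def measure(self):
        return self.reading


class MobilePlatform:
    """Illustrative mobile platform that reacts to sensor data while moving."""
    def __init__(self, sensor):
        self.sensor = sensor
        self.position = 0.0

    def move_to(self, target, safety_distance=0.5, step=0.1):
        # Move stepwise towards the target; abort with an explicit
        # outcome if an obstacle comes closer than the safety distance.
        while self.position < target:
            if self.sensor.measure() < safety_distance:
                return Outcome.OBSTACLE_DETECTED
            self.position += step
        return Outcome.COMPLETED
```

An application could then attach a handler to each outcome (e.g. replanning on `OBSTACLE_DETECTED`), which is one simple way of turning explicitly modeled outcomes into reactive behavior.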

published 08.02.2017 in: Augsburg Publikationsserver OPUS der Universitätsbibliothek Augsburg

