QBIIK

Features


Autonomous Navigation

QBIIK navigates autonomously in indoor environments


Autonomous gripping with tactile proximity sensors

QBIIK grasps goods collision-free using innovative tactile proximity sensors


Human-Machine Interface


A virtual reality remote control station allows the operator to take direct control


Learning


QBIIK learns from the situations it cannot yet handle on its own

With QBIIK, a system is to be developed and tested that usefully combines the technology of autonomous systems with human capabilities: the decentrally controlled vehicle orients itself in its surroundings, navigates autonomously to the target, and grasps the required goods.


During the gripping process, the robot's autonomy is supported by various sensors: in addition to a 3D camera, tactile proximity sensors explore the manipulator's surroundings and ensure collision-free gripping.

If the system is unable to recognize or grasp the goods due to unknown circumstances, it requests human assistance. Via a VR human-machine interface, the operator can take control for a short period of time and carry out the recognition and gripping process. QBIIK learns from this remote assistance and will carry out these steps autonomously in the future.
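The interplay of autonomous operation and remote assistance described above can be summarised as a simple decision loop. The following Python sketch only illustrates that loop under assumed interfaces; names such as handle_pick_task, recognize_and_grasp and request_operator are hypothetical and not part of the actual QBIIK software.

def handle_pick_task(robot, task, remote_assistance, learner):
    """Try to pick autonomously; fall back to a human operator on failure."""
    # Autonomous navigation to the storage location of the requested goods.
    robot.navigate_to(task.target_location)

    # Autonomous recognition and grasping with 3D camera and tactile skin.
    result = robot.recognize_and_grasp(task.article)
    if result.success:
        return result

    # Unknown circumstances: hand recognition and grasping over to a human
    # operator via the VR human-machine interface for a short period of time.
    session = remote_assistance.request_operator(task, sensor_feed=robot.sensors)
    result = session.teleoperate_grasp()

    # Record the operator's demonstration so that the same situation can be
    # handled autonomously in the future (see the Learning section).
    learner.record(task.article, session.demonstration)
    return result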

Partners

BÄR Automation GmbH

STILL GmbH

AUDI Sport GmbH

Karlsruher Institut für Technologie (KIT)

Institut für Fördertechnik und Logistiksysteme (IFL)

Institut für Intelligente Prozessautomation und Robotik (IPR)

This research and development project is funded by the Federal Ministry for Economic Affairs and Energy (BMWi) within the technology programme "PAICE Digital Technologies for Business" and is supported by the project management organisation "Information Technologies / Electromobility" at the German Aerospace Center, Cologne.

Autonomous Navigation

QBIIK is intended to enable simple and cost-effective commissioning and integration by the user. To this end, localization and navigation methods are being developed that rely on natural landmarks as far as possible; data captured by 3D sensors is used to create a map. Instead of lengthy commissioning phases carried out by the manufacturer, QBIIK is to be put into operation, and adapted if necessary, by the users themselves.
The aim is to develop localization and navigation algorithms that allow the use of low-cost imaging sensors. QBIIK operates as an autonomous logistics unit that registers changes in its environment on its own. Exchanging map data between vehicles enables collaborative planning and dynamic reactions to such changes.
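As a rough illustration of the map exchange mentioned above, the sketch below maintains a simple landmark map and merges updates received from another vehicle. The data structure, the "newest observation wins" merge rule, and all names are assumptions made for this example, not the project's actual algorithms.

from dataclasses import dataclass

@dataclass
class Landmark:
    landmark_id: str
    x: float
    y: float
    last_seen: float  # timestamp of the most recent observation

class LandmarkMap:
    def __init__(self):
        self.landmarks: dict[str, Landmark] = {}

    def update(self, observation: Landmark) -> None:
        """Insert or refresh a landmark observed by this vehicle's 3D sensors."""
        known = self.landmarks.get(observation.landmark_id)
        if known is None or observation.last_seen > known.last_seen:
            self.landmarks[observation.landmark_id] = observation

    def merge(self, other: "LandmarkMap") -> None:
        """Merge map material received from another vehicle (newest wins)."""
        for landmark in other.landmarks.values():
            self.update(landmark)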

Autonomous grasping with tactile proximity sensors

In QBIIK, a multimodal sensor is being developed that can detect both force-resolved contact and the approach of objects. Several of these sensors together form an artificial skin that covers the entire surface of the manipulator and gripper. Scenario-specific layouts are designed and built especially for the gripper. Industrial suitability, in terms of mechanical robustness and noise immunity, is a focal point of the development.
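The combination of the two modalities can be pictured as follows: each skin element reports a proximity value and a contact force, and the gripper stops or re-plans as soon as any element signals a risk. The Python sketch below uses assumed field names and threshold values purely for illustration.

from dataclasses import dataclass

PROXIMITY_STOP_M = 0.02  # assumed: stop if an object is closer than 2 cm
CONTACT_FORCE_N = 0.5    # assumed: treat readings above 0.5 N as contact

@dataclass
class TaxelReading:
    proximity_m: float  # estimated distance to the nearest object
    force_n: float      # measured normal force at this skin element

def collision_risk(skin: list[TaxelReading]) -> bool:
    """True if any skin element reports contact or a dangerously close object."""
    return any(
        t.force_n > CONTACT_FORCE_N or t.proximity_m < PROXIMITY_STOP_M
        for t in skin
    )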

Human-Machine Interface

In contrast to conventional assistance systems, in which the machine provides decision-relevant information to the human, in the QBIIK project the role of the assistant falls to the human. The aim is to develop a remote assistance interface that distributes tailored tasks to operators depending on the hardware they use. QBIIK investigates both the use of established technologies such as tablets and PC workstations and the industrial application of new technologies from the Augmented Reality (AR) and Virtual Reality (VR) domain.
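One way to distribute tasks according to the operator's hardware is a simple capability mapping, as in the sketch below. The device categories, task types and assignment rule are assumptions for illustration only, not the project's actual interface design.

from dataclasses import dataclass

# Assumed mapping of device classes to the task types they can handle.
DEVICE_CAPABILITIES = {
    "tablet": {"confirm_object", "label_object"},
    "pc_workstation": {"confirm_object", "label_object", "select_grasp_point"},
    "vr_station": {"confirm_object", "label_object", "select_grasp_point",
                   "teleoperate_grasp"},
}

@dataclass
class Operator:
    name: str
    device: str  # "tablet", "pc_workstation" or "vr_station"

def assign_task(task_type: str, operators: list[Operator]):
    """Return the first operator whose hardware supports the requested task."""
    for operator in operators:
        if task_type in DEVICE_CAPABILITIES.get(operator.device, set()):
            return operator
    return None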

Learning

Autonomous systems face a multitude of problems, such as object recognition or grasp point determination, for which classifiers have to be trained with a large amount of example data. Two problems arise during operation: unclassified objects, i.e. unknown articles, and incorrectly classified objects, i.e. articles that are not recognized correctly. Both are resolved using remote assistance. Ideally, though, the problem would not only be solved but also prevented. For this purpose, the data gathered during remote assistance (what to grasp, how to grasp it, and so on) is used to improve existing classifiers and to create new ones. This makes the system more robust and allows it to be expanded continuously without interrupting operations. The system thus adapts automatically to a constantly changing range of articles.
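A minimal sketch of this feedback loop is given below: each remote-assistance intervention yields a labelled sample, and the classifier is retrained once enough new samples have accumulated. The sample structure, the retraining trigger and the fit(X, y) interface are assumptions for illustration, not the project's actual learning pipeline.

from dataclasses import dataclass, field

@dataclass
class AssistanceSample:
    features: list       # e.g. descriptors computed from the 3D camera image
    label: str           # article class confirmed by the operator
    grasp_point: tuple   # grasp pose demonstrated by the operator

@dataclass
class ContinualLearner:
    classifier: object                    # any model exposing fit(X, y)
    samples: list = field(default_factory=list)
    retrain_every: int = 50               # assumed: retrain after 50 new samples

    def record(self, sample: AssistanceSample) -> None:
        """Store a remote-assistance sample and retrain when enough have arrived."""
        self.samples.append(sample)
        if len(self.samples) % self.retrain_every == 0:
            X = [s.features for s in self.samples]
            y = [s.label for s in self.samples]
            # In practice retraining would run in the background so that
            # operation is not interrupted.
            self.classifier.fit(X, y)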