Collaborative welding robots with gesture-based human interaction


The Cobots' Relational Interface with Neuromorphic Networks and Events (CORINNE) project aims to build robots that can recognize and respond to gestures, both known and unknown, in order to collaborate with humans on welding tasks. This technology lets users interact intuitively and spontaneously with digital interfaces or devices simply by performing gestures, which the software or device recognizes and reacts to immediately. fortiss uses a neuromorphic chip and an event-based HD camera.

Project description

Collaborative robots, or cobots, are a major topic in robotics, with impressive predicted growth rates. However, conventional cobots can only perform pre-programmed actions and cannot react flexibly to unexpected events. fortiss is therefore pursuing the goal of teaching cobots to recognize gestures and hand movements in real time. This capability will then be applied in the field of welding work, where previously learned knowledge is transferred rather than taught anew.

fortiss aims to realize a complex interactive welding process together with a human operator. To this end, algorithms are being developed for the MAiRA robot from NEURA Robotics. These algorithms use a new type of neuromorphic sensor to recognize human gestures, enabling the robot to intelligently recognize complex patterns and movements, interact naturally with the operator, and react to the operator's gestures in order to optimize the welding process. Regarding the sensor, fortiss can already draw on prior project experience: an event-based camera in HD resolution delivers visual events (pixel changes) to a pre-trained spiking neural network running on a neuromorphic chip (Intel's Loihi research chip).
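To make the sensing pipeline concrete, the following is a minimal, purely illustrative sketch of how sparse camera events might drive a spiking neuron. The `Event` structure, the leaky integrate-and-fire model, and all constants are assumptions for illustration, not the project's actual implementation or the Loihi programming model.

```python
import math
from dataclasses import dataclass

# An event-based camera emits sparse "events" (pixel changes) instead of
# full frames. Each event carries pixel coordinates, a timestamp, and a
# polarity (brightness up or down).
@dataclass
class Event:
    x: int
    y: int
    t: float       # timestamp in seconds
    polarity: int  # +1 brightness increase, -1 decrease

class LIFNeuron:
    """Leaky integrate-and-fire neuron: the membrane potential decays
    over time and the neuron spikes when incoming events push it past
    a threshold."""
    def __init__(self, threshold=1.0, tau=0.05):
        self.threshold = threshold
        self.tau = tau       # membrane time constant (s)
        self.v = 0.0         # membrane potential
        self.last_t = 0.0

    def integrate(self, event, weight=0.4):
        # Apply exponential leak since the last event, then add input.
        self.v *= math.exp(-(event.t - self.last_t) / self.tau)
        self.last_t = event.t
        self.v += weight
        if self.v >= self.threshold:
            self.v = 0.0     # reset after spiking
            return True      # output spike
        return False

# A dense burst of events (fast motion) drives the neuron to spike,
# while sparse events would simply leak away.
neuron = LIFNeuron()
burst = [Event(10, 10, 0.001 * i, +1) for i in range(5)]
spikes = sum(neuron.integrate(e) for e in burst)
print(spikes)  # 1
```

Because computation happens only when events arrive, such a pipeline stays idle between pixel changes, which is what makes the event-based approach attractive for low-latency, low-power gesture recognition.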

The following questions are to be researched and addressed in the project:

  • New gestures should be learned at runtime in an unsupervised fashion, in a so-called continual learning paradigm. This requires a system that constantly refines its knowledge of learned gestures and autonomously recognizes and learns unknown ones.
  • The use case envisaged in this project comprises several cobots that have learned a wide variety of gestures over time. The central research question is how the knowledge of a newly learned gesture can be transferred from one cobot to the others without them also having to be trained for it.
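The continual-learning idea in the first point can be sketched as prototype-based novelty detection: an observed gesture either refines the nearest known prototype or, if it is far from all of them, founds a new gesture class. All names and thresholds below are illustrative assumptions, not the project's actual algorithm.

```python
# Hypothetical sketch: gestures are summarized as feature vectors, and
# learning is unsupervised -- no labels, no offline training phase.

def dist(a, b):
    """Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

class GestureMemory:
    def __init__(self, novelty_threshold=1.0):
        self.prototypes = []  # list of (mean feature vector, sample count)
        self.threshold = novelty_threshold

    def observe(self, features):
        """Return the index of the matched gesture class, creating a
        new class when the observation is novel."""
        if self.prototypes:
            i, (proto, n) = min(
                enumerate(self.prototypes),
                key=lambda p: dist(features, p[1][0]),
            )
            if dist(features, proto) < self.threshold:
                # Refine the known gesture with an online running mean.
                updated = [(m * n + f) / (n + 1)
                           for m, f in zip(proto, features)]
                self.prototypes[i] = (updated, n + 1)
                return i
        # Novel gesture: start a new class from this single example.
        self.prototypes.append((list(features), 1))
        return len(self.prototypes) - 1

memory = GestureMemory()
a = memory.observe([0.0, 0.0])  # first gesture -> class 0
b = memory.observe([0.1, 0.0])  # close to class 0 -> class 0, refined
c = memory.observe([5.0, 5.0])  # far from everything -> new class 1
print(a, b, c)  # 0 0 1
```

The same structure also suggests how the second point could work: since each gesture class is just a prototype and a count, that compact summary is what would be exchanged between cobots rather than raw training data.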


Copyright © NEURA Robotics GmbH

Research contribution

In practical operation, the recognition and online learning of human gestures via a neuromorphic visual sensor will fulfill the following requirements in welding applications:

  • Adaptation to the user takes place without the need for extensive models and training data, increasing practical suitability and strengthening data sovereignty, as there is no dependence on external data sources.
  • The AI algorithms developed make it possible to learn new gestures and combinations of gestures on-the-fly, i.e., in real time, without static training and with very few repetitions.
  • The new knowledge of a single cobot can be directly shared with other robots through federated learning techniques, developed by the project partner from Technische Universität Chemnitz.


Federal Ministry of Education and Research BMBF
Artificial Intelligence, KI4KMU, online identification: 100655754

Project duration

01.04.2024 – 31.03.2026



Your contact

Jules Lecomte

+49 89 3603522 188

Project partner