Internet of Skills

How can we advance training technologies?

The Internet has already democratized access to information worldwide, allowing almost anyone to obtain almost any information. But what about learning skills? Instructional videos on specific skills help to a certain extent, but an expert demonstrating something in person and giving feedback provides the best learning results. What if the expert doesn’t live close enough to take lessons with ease, or even lives on another continent? This is where CeTI aims to provide a research-based answer, so that skills can be learned even over extremely long distances.

Purple and magenta graphic illustration depicting two people engaging in different activities

Music

This scenario addresses learning to play the piano as a proof of concept, whereby an expert musician trains others remotely through the Internet of Skills in real time. To achieve this remote training, novel wearable actuators in the form of smart eGloves with haptic features and low latency are used. The piano teacher, wearing an eGlove with self-sensing capabilities or using a digital piano, would control each finger of the student in real time, instead of explaining each step verbally or demonstrating the movement visually. The student wears an eGlove with an integrated haptic force-feedback system. This enables the student to experience the correct finger movements, making it easier to reproduce the correct behaviour later on.
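As a rough illustration of the data flow, the following Python sketch streams finger joint angles from a teacher glove to a student glove over a plain UDP socket. The sensor readout, the actuator commands, and the endpoint address are placeholder assumptions, not CeTI’s actual eGlove interfaces.

# Minimal sketch (not the CeTI implementation): streaming finger joint angles
# from a teacher glove to a student glove over UDP with low per-frame latency.
# Sensor readout and actuator control are stubbed; real eGloves would replace them.
import json
import socket
import time

STUDENT_ADDR = ("127.0.0.1", 9000)   # hypothetical endpoint of the student's glove

def read_teacher_fingers():
    """Stub: return joint angles (degrees) for five fingers."""
    return {"thumb": 12.0, "index": 45.0, "middle": 50.0, "ring": 40.0, "little": 30.0}

def drive_student_actuators(angles):
    """Stub: command the haptic force-feedback actuators of the student glove."""
    print("target finger angles:", angles)

def teacher_loop(rate_hz=100):
    """Sample the teacher glove and send each frame immediately."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    while True:
        frame = {"t": time.time(), "fingers": read_teacher_fingers()}
        sock.sendto(json.dumps(frame).encode(), STUDENT_ADDR)
        time.sleep(1.0 / rate_hz)

def student_loop():
    """Receive frames and render them on the student's glove."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(STUDENT_ADDR)
    while True:
        data, _ = sock.recvfrom(4096)
        frame = json.loads(data)
        drive_student_actuators(frame["fingers"])

Sending each sampled frame immediately, rather than buffering, is one simple way to keep the end-to-end delay low enough for haptic guidance.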

A person playing a grand piano while being connected to a haptic force-feedback system
A person playing a grand piano while being connected to a device. Behind the piano is a projection of the digital avatar mirroring the person playing the piano.

Demonstrator Sound and Science

The demonstrator shows that it is possible to transmit the movements of the piano player to a digital avatar in real time. Accurate capture of expert movements enables the further steps, in which these movements are transmitted to the learner through a haptic force-feedback system, thus facilitating reproduction. The demonstrator was shown at the “Sound and Science” event as part of the Dresden Music Festival.


Sports

In this scenario, a dynamic multimodal set-up in the domain of sports, specifically surfing, is chosen. Because the relevant motion sequences (mainly balancing on the board in the water) are dynamic and complex, no supervised human haptic interaction is possible. Therefore, the input data has to be recorded, processed using machine learning methods, and stored prior to the learning procedure: an expert trainer wears an eBodySuit system equipped with tactile and position-sensing features to generate the complex movement specifications as a reference. For the learning procedure, the human trainee wears a multimodal eBodySuit (with auditory, visual, and haptic feedback) linked to the Internet of Skills, which teaches the learner the precise, dynamic, and complex motion sequences.
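To make the idea of a stored reference more concrete, here is a minimal Python sketch that compares a trainee’s joint angles against a pre-recorded expert reference and maps the deviation to a coarse feedback cue. The joint set, reference values, and tolerance are illustrative assumptions, not the eBodySuit’s real data format or the project’s learning pipeline.

# Minimal sketch (assumptions only): frame-by-frame comparison of a trainee's
# pose against a pre-recorded expert reference, reduced here to three joint angles.
from math import sqrt

# Hypothetical reference: expert joint angles (degrees) per frame.
EXPERT_REFERENCE = [
    {"knee": 140.0, "hip": 120.0, "ankle": 95.0},
    {"knee": 135.0, "hip": 118.0, "ankle": 96.0},
]

def pose_error(trainee_pose, expert_pose):
    """Root-mean-square deviation over the shared joints."""
    diffs = [(trainee_pose[j] - expert_pose[j]) ** 2 for j in expert_pose]
    return sqrt(sum(diffs) / len(diffs))

def feedback_cue(error_deg, tolerance_deg=10.0):
    """Map the deviation to a coarse multimodal cue (visual/auditory/haptic)."""
    return "ok" if error_deg <= tolerance_deg else "correct_posture"

def evaluate(trainee_frames):
    """Compare frame by frame, assuming both recordings are time-aligned."""
    return [feedback_cue(pose_error(t, e))
            for t, e in zip(trainee_frames, EXPERT_REFERENCE)]

print(evaluate([{"knee": 160.0, "hip": 110.0, "ankle": 95.0},
                {"knee": 136.0, "hip": 119.0, "ankle": 97.0}]))
# -> ['correct_posture', 'ok']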

Surfboard Demonstrator

The demonstrator captures the movements of the person on the surfboard and transfers them to a digital avatar, which copies the movements in quasi-real time. On the screen, surfers get immediate feedback through green and red colored areas showing whether they are in the right position on the board. The surfboard is built so that even the slightest movements cause it to wobble, which is meant to resemble the real surfing experience. The demonstrator was showcased on various occasions, e.g. at the Juniordoktor event.
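The green/red feedback could, in principle, be driven by a check as simple as the following sketch; the board coordinate system and zone limits are made up for illustration and are not taken from the demonstrator.

# Minimal sketch (hypothetical geometry): colouring the on-screen area green or
# red depending on whether the surfer's foot lies inside an assumed target zone.
TARGET_ZONE = {"x": (-0.3, 0.3), "y": (0.8, 1.6)}   # metres, assumed board coordinates

def zone_colour(foot_x, foot_y):
    """Green if the foot lies inside the target zone, red otherwise."""
    in_x = TARGET_ZONE["x"][0] <= foot_x <= TARGET_ZONE["x"][1]
    in_y = TARGET_ZONE["y"][0] <= foot_y <= TARGET_ZONE["y"][1]
    return "green" if in_x and in_y else "red"

print(zone_colour(0.1, 1.2))   # green: inside the zone
print(zone_colour(0.5, 0.4))   # red: too far to the side and too far back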

Collaborative Research Projects

Teaching and learning skills over long distances requires a range of technologies, which are tested in research projects that complement and build on each other.

Textile Sensors and Actuators

Textile technology at CeTI deals, for example, with the functionalization of textiles and with knitting technology, which offers a wide and constantly expanding range of possibilities to enhance textiles for different applications. Research also focuses on textile-based sensors and actuators and their conductive behavior. These sensors and actuators bring many advantageous properties for wearables. Wearables, smart textiles, and devices are the hardware basis for software development and psychological examinations. The CeTI glove and the Smart Kinesiotape are the result of research in which textile-physical and electromechanical investigations come together.
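As a toy example of how the conductive behavior of such a sensor can be exploited, the sketch below converts the resistance of a knitted strain sensor into an estimated strain using an assumed linear gauge-factor model; the resting resistance and gauge factor are invented values, not measured CeTI data.

# Minimal sketch (assumed linear calibration): estimating strain from the
# resistance of a conductive knitted strain sensor.
R0 = 1200.0            # ohms at rest (hypothetical)
GAUGE_FACTOR = 2.5     # relative resistance change per unit strain (hypothetical)

def strain_from_resistance(r_measured):
    """strain = (delta_R / R0) / gauge factor, assuming linear sensor behaviour."""
    return ((r_measured - R0) / R0) / GAUGE_FACTOR

print(strain_from_resistance(1260.0))   # 5 % resistance increase -> 2 % strain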

Human Motion Capturing

For exact capture of human motion, the precision and latency of current 3D acquisition and tracking approaches are still insufficient. Immersive modelling is a promising way to label, segment, and extract semantic properties of 3D objects in VR/AR. The goal is to bridge the physical and the virtual world and to provide AR/VR environments for designing and implementing natural interaction between humans and machines in VR. This also requires a deep understanding of how humans perceive and recognize individuals by their body motions. In addition, the research examines whether humans and machines recognize human motion in the same way or whether they look for different features. For this purpose, motion data is collected via IMU motion capture suits and AR/VR headsets. The anonymization of the collected behavioral biometric data is an important issue: by understanding the privacy risks of data collection, privacy-friendly pose estimation can be established.
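One simple, hypothetical way to reduce the identifiability of such motion data, sketched below, is to perturb the captured joint positions with small random noise before storage; this only illustrates the privacy concern and is not the anonymization method developed in the project.

# Minimal sketch (an assumption, not the CeTI anonymization method): blurring
# fine-grained, identity-revealing motion details by adding small noise to every
# joint coordinate while keeping the overall movement usable for pose estimation.
import random

def anonymize_frame(joints, sigma_m=0.01):
    """Add zero-mean Gaussian noise (default 1 cm) to every joint coordinate."""
    return {name: tuple(c + random.gauss(0.0, sigma_m) for c in xyz)
            for name, xyz in joints.items()}

frame = {"head": (0.0, 1.7, 0.0), "left_hand": (-0.4, 1.1, 0.2)}
print(anonymize_frame(frame))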

Tactile Feedback

Feedback from an external source (augmented feedback) is a potent tool for promoting motor skill learning. Feedback can direct a learner towards efficient and task-relevant movement patterns and solutions. An example is lifting a heavy box, where a healthy execution with a straight back and bent legs is important. In this case, passive tactile and auditory feedback can reduce spine flexion if the movement is performed incorrectly. Prescriptive feedback on movement processes is superior to feedback on movement outcome.
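A minimal sketch of how such feedback could be triggered is given below: a posture sample is checked against a spine-flexion threshold and, if it is exceeded, tactile and auditory cues are fired. The threshold values and cue names are illustrative assumptions, not clinically validated parameters.

# Minimal sketch (illustrative thresholds): triggering passive tactile and
# auditory feedback when the spine is flexed too far during a lift.
SPINE_FLEXION_LIMIT_DEG = 20.0   # assumed threshold for "keep your back straight"

def check_lift(spine_flexion_deg, knee_flexion_deg):
    """Return the feedback cues to fire for the current posture sample."""
    cues = []
    if spine_flexion_deg > SPINE_FLEXION_LIMIT_DEG:
        cues.append("vibrate_lower_back")   # tactile cue at the flexed spine
        cues.append("audio_warning")        # auditory cue
    if knee_flexion_deg < 30.0:             # legs barely bent -> lifting from the back
        cues.append("prompt_bend_knees")
    return cues

print(check_lift(spine_flexion_deg=35.0, knee_flexion_deg=15.0))
print(check_lift(spine_flexion_deg=10.0, knee_flexion_deg=70.0))

Note that the cues address the movement process (back and knee posture) rather than the outcome of the lift, in line with the point above about prescriptive feedback.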

Room Leaders

Prof. Dr. Chokri Cherif, TU Dresden, Chair of Textile Technology

Prof. Dr. Susanne Narciss, TU Dresden, Chair of Psychology of Learning and Instruction

Prof. Dr. Thorsten Strufe, TU Dresden, Chair of IT Security, KIT / Chair of Network Security and Privacy