Modeling of human-robot interaction and use of humanoid robots for rehabilitation training


Research Area:

Biomedical and Biologically Motivated Technical Applications

Researchers:

Jindrich Kodl; Albert Mukovskiy; Nick Taubert; Tjeerd Dijkstra; Winfried Ilg; Martin A. Giese; Alessandro Salatiello

Proposed start date:

2015-02-02

Proposed end date:

2019-05-31

Project page:

http://cogimon.eu/
  

Description:

Human motor control is compliant, i.e. the musculoskeletal system is elastic, and this elasticity is exploited during control. Compliant control remains a complex problem for present humanoid robots, yet it is critical for many actions of high relevance for everyday life, such as soft catching, soft reception when falling, or sliding and pushing large objects, as well as for joint actions performed in teams, such as the manipulation of large-scale objects by multiple humans. The overarching objective of the CogIMon project is to advance key technologies that lead to a step change in cognitive compliant interaction in human-robot teams, integrating physical human-robot interaction, visually guided manipulation, and safety-integrated design in a systematic way.

In this project, our group focuses on deciphering human interpersonal sensorimotor strategies and how they accomplish predictive control in interactive tasks. This requires a better understanding of the cognitive aspects that are essential for the control of compliant motion, i.e. the extraction and prediction of task-relevant information from the motion (kinematics) of the interaction partner, and the generation of predictive models of the partner’s motions from kinematic motion-tracking data.

For this purpose, we extensively study throwing-catching scenarios and their adaptation to rehabilitation training, especially for ataxia patients. Exploiting models for the online synthesis of full-body motion and integrating them with the balance control of humanoid robots, we are implementing a throwing-catching game for the humanoid robot CoMan. A minimal sketch of the kind of synthesis model involved is given below.
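To illustrate the basic idea of online full-body motion synthesis, the following minimal sketch generates joint trajectories as a weighted superposition of temporal movement primitives. It is not the project's implementation: the primitives, mixing weights, joint set, and all numerical values are hypothetical placeholders, whereas in the real system such parameters would be learned from human motion-capture data.

// Minimal sketch (placeholder values, not the project's learned model):
// online synthesis of joint trajectories as a weighted superposition of
// temporal movement primitives, q_j(t) = offset_j + sum_i W[j][i] * p_i(t).
#include <cmath>
#include <cstdio>

constexpr int kNumJoints = 4;      // illustrative subset of full-body joints
constexpr int kNumPrimitives = 3;  // number of temporal primitives

// Hypothetical primitives: smooth bell-shaped time courses over phase t in [0, 1].
double primitive(int i, double t) {
    const double centers[kNumPrimitives] = {0.2, 0.5, 0.8};
    const double width = 0.15;
    const double d = (t - centers[i]) / width;
    return std::exp(-0.5 * d * d);
}

int main() {
    // Hypothetical mixing weights (joint x primitive) and joint offsets.
    const double W[kNumJoints][kNumPrimitives] = {
        {0.6, -0.2, 0.1}, {0.3, 0.4, -0.3}, {-0.5, 0.2, 0.6}, {0.1, -0.4, 0.5}};
    const double offset[kNumJoints] = {0.0, 0.2, -0.1, 0.3};

    // Online synthesis: at each control tick, evaluate the superposition for
    // the current movement phase and stream the joint targets to the robot.
    for (int k = 0; k <= 10; ++k) {
        const double t = 0.1 * k;
        std::printf("t=%.1f ", t);
        for (int j = 0; j < kNumJoints; ++j) {
            double q = offset[j];
            for (int i = 0; i < kNumPrimitives; ++i)
                q += W[j][i] * primitive(i, t);
            std::printf(" q%d=%+.3f", j, q);
        }
        std::printf("\n");
    }
    return 0;
}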

As an intermediate step, we have integrated the CoMan robot simulator into a UNITY-based virtual environment. This allows us to test and optimize controllers, including a full simulation of the robot's physics, in a virtual environment before testing the real system. This is critical because, in rehabilitation scenarios, patients do not tolerate frequent failures of the technical system. In addition, it allows us to evaluate the benefits of interactive robotics-based training in part before the real robot system becomes available.

Video: https://www.youtube.com/watch?v=tTtS92e1q4c

System architecture: In collaboration with the Technical University of Braunschweig, the overall system was implemented in a unified software framework that integrates real-time control (OROCOS), the UNITY game engine, and the Gazebo physics simulator of the COMAN robot. A virtual reality (VR) environment was implemented using the HTC Vive system, which combines a head-mounted display (HMD) with two motion trackers sampled at 90 Hz. The system update time via the RSB communication middleware was about 1 ms, and the controller ran with an update time of 1 ms.
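The sketch below illustrates only the timing structure implied by the numbers above: a fixed-rate control loop running at 1 kHz, i.e. with a 1 ms update time. It is plain standard C++, not the project's OROCOS/RSB code; the read, compute, and write steps are placeholders for reading tracker and robot state, evaluating the whole-body controller, and publishing commands.

// Minimal sketch of a fixed-rate 1 kHz control loop (timing structure only;
// not the project's OROCOS/RSB implementation).
#include <chrono>
#include <cstdio>
#include <thread>

int main() {
    using clock = std::chrono::steady_clock;
    constexpr auto period = std::chrono::milliseconds(1);  // 1 ms update time
    constexpr int kTicks = 5000;                            // run for 5 s

    auto next_tick = clock::now();
    for (int tick = 0; tick < kTicks; ++tick) {
        // read():    fetch the latest HTC Vive tracker poses and robot state (placeholder)
        // compute(): evaluate the whole-body / balance controller (placeholder)
        // write():   send joint commands to the simulator or robot (placeholder);
        //            in the real system these steps are routed through OROCOS and RSB

        next_tick += period;
        std::this_thread::sleep_until(next_tick);  // keep the 1 kHz rate
    }
    std::printf("finished %d control ticks\n", kTicks);
    return 0;
}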


Figure: System architecture, embedded in the unifying OROCOS real-time software environment.

 

Training scenario: Cerebellar ataxia patients benefit substantially from coordinative motor training in everyday-like situations. Administering such training with sufficient intensity is a substantial problem for cost-efficient rehabilitation. One effective exercise for coordination training is throwing and catching balls (juggling between two people). In our scenario, one person was replaced by an avatar, and participants had to catch balls thrown by the robot. Throwing style and variability were adjusted to the skill level of the individual patients (see the sketch below).
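As an illustration of how throw parameters can be adapted to a patient's skill level, the following sketch computes a ballistic release velocity so that the ball reaches the catcher's hand after a chosen flight time, and scales the throw-to-throw target variability with skill. The parametrization, variable names, and numbers are illustrative assumptions, not the project's actual tuning.

// Minimal sketch (assumed parametrization, not the project's code):
// ballistic release velocity toward the catcher's hand, with target scatter
// and flight time scaled by a hypothetical skill level in [0, 1].
#include <cstdio>
#include <random>

struct Vec3 { double x, y, z; };

// Point-mass ball under gravity: target = release + v0*T - 0.5*g*T^2 * e_z,
// solved for the release velocity v0.
Vec3 releaseVelocity(const Vec3& release, const Vec3& target, double T) {
    constexpr double g = 9.81;  // gravitational acceleration [m/s^2]
    return {(target.x - release.x) / T,
            (target.y - release.y) / T,
            (target.z - release.z) / T + 0.5 * g * T};
}

int main() {
    std::mt19937 rng(42);

    const double skill = 0.3;                        // example patient level in [0, 1]
    const double sigma = 0.05 + 0.25 * skill;        // target scatter [m]: harder for higher skill
    const double flightTime = 1.2 - 0.4 * skill;     // flight time [s]: faster for higher skill
    std::normal_distribution<double> scatter(0.0, sigma);

    const Vec3 release = {0.0, 0.0, 1.4};            // robot hand position [m]
    const Vec3 hand    = {2.0, 0.0, 1.2};            // patient hand position [m]

    for (int throwIdx = 0; throwIdx < 3; ++throwIdx) {
        const Vec3 target = {hand.x, hand.y + scatter(rng), hand.z + scatter(rng)};
        const Vec3 v0 = releaseVelocity(release, target, flightTime);
        std::printf("throw %d: v0 = (%.2f, %.2f, %.2f) m/s\n",
                    throwIdx, v0.x, v0.y, v0.z);
    }
    return 0;
}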

 

Video: VR training for rehabilitation. Ball throwing-catching game.


Figure: Learning of the task across sessions, plotted for two cerebellar ataxia patients (DS and OA).

 

Video: Ball throwing-catching game in VR. The general view and the patient view (inset).

Publications

Chiovetto, E., Salatiello, A., D'Avella, A. & Giese, M. A. (2022). Toward a unifying framework for the modeling and identification of motor primitives. Frontiers in Computational Neuroscience, 16, 926345.
Mohammadi, P., Hoffman, E. M., Dehio, N., Malekzadeh, M. S., Giese, M. A., Tsagarakis, N. G. et al. (2019). Compliant humanoids moving toward rehabilitation applications: Transparent integration of real-time control, whole-body motion generation, and virtual reality. IEEE Robotics & Automation Magazine, 26(4), 83-93.
Kodl, J., Yu, C. C., Dijkstra, T. & Giese, M. A. (2019). Sensorimotor adaptation to an environment with non-standard physics. Progress in Motor Control XII: Movement Improvement conference (PMC 2019), Amsterdam, The Netherlands, July 7-10, 2019.
Kodl, J., Mukovskiy, A., Mohammadi, P., Malekzadeh, M., Taubert, N., Christensen, A. et al. (2019). Online planning and control of ball throwing by the humanoid robot COMAN and validation exploiting VR in rehabilitation scenarios with ataxia patients. Oral presentation and extended abstract in Proc. of the CYBATHLON Symposium on Assistive and Wearable Robotics (AsWeR 2019), Karlsruhe, May 16-17, 2019.
Dijkstra, T., Kodl, J., Li, C.-Y. & Giese, M. A. (2019). Manipulation of internal representations of physics through VR training in an unnatural physical environment. ECVP 2019, Perception, 48(2S), 104.