Modeling of human robot interaction and use of humanoid robots for rehabilitation training

Research Area:

Biomedical and Biologically Motivated Technical Applications

Researchers:

Jindrich Kodl; Albert Mukovskiy; Nick Taubert; Tjeerd Dijkstra; Winfried Ilg; Martin A. Giese

Proposed start date:

2015-02-02

Proposed end date:

2019-05-31

Project page:

http://cogimon.eu/

Description:

Human motor control is compliant, i.e. the musculoskeletal system is elastic, and this elasticity is exploited during control. Compliant control is a complex problem for present humanoid robots, yet it is critical for many actions highly relevant to everyday life, such as soft catching, soft landing when falling, or sliding and pushing large objects, as well as for joint actions performed in teams, such as the manipulation of large-scale objects by multiple humans. The overarching objective of the CogIMon project is to advance key technologies that lead to a step change in cognitive compliant interaction in human-robot teams, integrating physical human-robot interaction, visually guided manipulation, and safety-integrated design in a systematic way.

In this project, our group focuses on deciphering human interpersonal sensorimotor strategies and how they accomplish predictive control in interactive tasks. This requires a better understanding of the cognitive aspects that are essential for the control of compliant motion, i.e. the extraction and prediction of task-relevant information from the motion (kinematics) of the interaction partner, and the generation of predictive models of the partner's motions using kinematic motion-tracking data.

For this purpose, we study throwing-and-catching scenarios extensively, and their adaptation to rehabilitation training, especially for ataxia patients. Exploiting models for the online synthesis of full-body motion, and integrating them with the balance control of humanoid robots, we are implementing a throwing-and-catching game for the humanoid robot COMAN.
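The predictive element of such a throwing-and-catching interaction can be illustrated with a simple ballistic model: from the ball's release position and velocity, the catcher estimates where and when the ball will cross a given catch plane. A minimal sketch, assuming point-mass dynamics without air drag (the function name and interface are illustrative, not taken from the project code):

```python
import numpy as np

GRAVITY = 9.81  # m/s^2, acting along -z

def predict_catch(p0, v0, catch_height):
    """Predict time and position at which a ball released at p0 (m) with
    velocity v0 (m/s) descends through the plane z = catch_height.
    p0, v0: array-like [x, y, z]. Returns (t, position) or None."""
    p0, v0 = np.asarray(p0, float), np.asarray(v0, float)
    # Vertical motion: z(t) = z0 + vz*t - 0.5*g*t^2 = catch_height.
    # Quadratic a*t^2 + b*t + c = 0 with:
    a, b, c = -0.5 * GRAVITY, v0[2], p0[2] - catch_height
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # ball never reaches the catch plane
    t = (-b - np.sqrt(disc)) / (2 * a)  # later root: descending crossing
    pos = p0 + v0 * t + 0.5 * np.array([0.0, 0.0, -GRAVITY]) * t * t
    return t, pos
```

In an online setting, such a prediction would be refit continuously from tracked ball positions, so that the catcher's motion can be planned ahead of the ball's arrival.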

As an intermediate step, we have integrated the COMAN robot simulator into a UNITY-based virtual environment. This allows us to test and optimize controllers, including a full simulation of the physics of the robot, in a virtual environment before testing the real system. This is critical because, in rehabilitation scenarios, patients do not tolerate frequent failures of the technical system. In addition, this allows us to evaluate part of the benefits of interactive robotics-based training even before the real robot system becomes available.

 

Video: https://www.youtube.com/watch?v=tTtS92e1q4c

 

System architecture: In collaboration with Techn. Univ. Braunschweig, the overall system was implemented in a unified software framework that integrates real-time control (OROCOS), the UNITY game engine, and the Gazebo physics simulator of the COMAN robot. A Virtual Reality (VR) environment was implemented using the HTC Vive system, combining a Head-Mounted Display (HMD) with two motion trackers sampling at 90 Hz. Communication via the RSB middleware had an update time of about 1 ms, and the controller ran with an update cycle of 1 ms.
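A 1 ms controller cycle of this kind relies on a fixed-rate loop with absolute deadlines, so that timing jitter in one cycle does not accumulate into the next. A minimal sketch of this scheduling pattern (illustrative Python; the actual system uses OROCOS periodic real-time activities in C++):

```python
import time

def run_fixed_rate(step, period_s, n_cycles):
    """Call step(i) at a fixed rate using absolute deadlines.

    The next deadline is advanced by the nominal period rather than
    measured from the end of the previous cycle, so per-cycle jitter
    does not drift the overall rate."""
    next_deadline = time.monotonic()
    for i in range(n_cycles):
        step(i)
        next_deadline += period_s
        remaining = next_deadline - time.monotonic()
        if remaining > 0:
            time.sleep(remaining)
```

For example, `run_fixed_rate(controller_step, 0.001, n)` would approximate a 1 kHz control loop; hard real-time guarantees of course require a real-time framework and operating system rather than plain Python.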


Figure: System architecture, integrated in the unifying OROCOS real-time software environment.

 

Training scenario: Cerebellar ataxia patients benefit substantially from coordinative motor training in everyday-related situations. Administering such training with sufficient intensity is a substantial problem for cost-efficient rehabilitation. One effective exercise for coordination training is the throwing and catching of balls (juggling between two people). In our scenario, one person was replaced by an avatar, and participants had to catch balls thrown by the robot. Throwing style and variability were adjusted to suit the skill level of the individual patients.
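Such skill-dependent adjustment can be realized with a simple staircase rule: throw variability is raised slightly after a successful catch and lowered more strongly after a miss, keeping each patient near a challenging but achievable success rate. A minimal sketch (the rule and step sizes are illustrative assumptions, not the exact scheme used in the study):

```python
def adapt_difficulty(level, success, step_up=0.05, step_down=0.10,
                     lo=0.0, hi=1.0):
    """One-up/one-down staircase for throw variability in [lo, hi].

    After a successful catch the difficulty rises by step_up; after a
    miss it drops by the larger step_down, which biases the patient
    toward a success rate somewhat above 50%."""
    level = level + step_up if success else level - step_down
    return min(hi, max(lo, level))  # clamp to the allowed range
```

Asymmetric step sizes are a common design choice in adaptive staircases: the ratio of up-step to down-step determines the success rate at which the difficulty level converges.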

 

Video: VR training for rehabilitation. Ball throwing-catching game.


Figure: Learning of the task across sessions, plotted for two cerebellar ataxia patients (DS and OA).

 

Video: Ball throwing-catching game in VR. General view and patient view (inset).

Publications

Kodl, J., Mukovskiy, A., Mohammadi, P., Malekzadeh, M., Taubert, N., Christensen, A. et al. (2019). Online planning and control of ball throwing by the humanoid robot COMAN and validation exploiting VR in rehabilitation scenarios with ataxia patients. Oral presentation and extended abstract in Proc. of CYBATHLON Symposium on Assistive and Wearable Robotics (AsWeR 2019), 16–17 May 2019, Karlsruhe.
Mohammadi, P., Malekzadeh, M. S., Kodl, J., Mukovskiy, A., Wigand, D., Giese, M. A. et al. (2018). Real-Time Control of Whole-Body Robot Motion and Trajectory Generation for Physiotherapeutic Juggling in VR. Proc. IROS 2018, 270-277.
Chiovetto, E., Curio, C., Endres, D. & Giese, M. A. (2018). Perceptual integration of kinematic components in the recognition of emotional facial expressions. Journal of Vision, 18(4):13, 1-19.
Chiovetto, E., Huber, M., Sternad, D. & Giese, M. A. (2018). Low-dimensional organization of angular momentum during walking on a narrow beam. Sci Rep., 8(1), 95.
Kodl, J., Mukovskiy, A., Dijkstra, T., Brötz, D., Ludolph, N., Taubert, N. et al. (2017). Ball Throwing Games in Virtual Reality for Motor Rehabilitation. IX Iberoamerican Congress in Assistive Technology, Iberdiscap, Bogota, Colombia, ISSN 2619-6433.