Control of Dynamically Complex Objects: Stability and Predictability
Extending capabilities with tools has been fundamental to human evolution, and the advance of robotic devices presents the next stage in human augmentation. However, successfully incorporating an artificial limb into functional multi-degree-of-freedom actions remains a challenge for both the individual and the scientist. How can humans exploit the functionality of external devices, both active and passive? To date, there is still surprisingly little understanding of how humans control their interaction with the vast variety of objects ubiquitous in everyday life. This research studies the problem using the example of guiding a cup of coffee to one's mouth: the hand applies forces on the cup and the coffee inside, and the sloshing coffee simultaneously exerts forces on the hand. To avoid spilling, subtle control is required to predict and compensate for those complex interaction forces. Using a virtual set-up with a robotic interface, a simple cart-and-pendulum model mimicked the task: participants moved the robotic manipulandum to translate the 2D cart along a horizontal line; the pendulum bob represented the liquid moving inside a cup, defined by the bob's semicircular path. Despite its vast simplification of the real 3D fluid-dynamic problem, the model task retained the core challenges: it is nonlinear, with potentially chaotic behavior, and presents interaction forces that are difficult to predict. A series of studies revealed that humans developed strategies that prioritized stability and predictability. In fast point-to-point movements, humans developed strategies with sufficient energy margins to preempt the risk of "spilling the coffee". When a small perturbation was presented at a fixed location along the path, subjects learned to stabilize their trajectories and attenuate the perturbations by moving through contraction regions of the free, unforced system.
When moving the cup back and forth in a continuous rhythmic fashion, the nonlinear interactive dynamics became even more complex and unpredictable. Results showed that humans made the interactions more predictable, sacrificing solutions that would have economized on the expended force. When subjects were allowed to choose their own frequency, they exploited the resonance structure of the coupled human-cart-pendulum system, thereby increasing the predictability and stability of their interactions. These findings demonstrate that humans are sensitive to the stability and predictability of the task and exploit these properties to enable safe interaction with dynamically complex objects. These insights can serve as a basis to better understand and develop successful interaction of humans with both passive and active robotic devices.
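The cart-and-pendulum task described above can be sketched in a short simulation. The following is a minimal illustration, not the study's actual set-up: the pendulum length, damping term, 60° "rim" angle for spilling, and the two acceleration profiles are all assumptions chosen for illustration.

```python
import math

G = 9.81  # gravity (m/s^2)
L = 0.05  # pendulum length, roughly a cup radius (m) -- assumed value
B = 0.5   # damping coefficient (1/s) -- assumed value

def pendulum_step(theta, omega, cart_acc, dt):
    """One Euler step of the bob angle, driven by the cart's acceleration:
    theta'' = -(G/L)*sin(theta) - (cart_acc/L)*cos(theta) - B*theta'."""
    alpha = -(G / L) * math.sin(theta) - (cart_acc / L) * math.cos(theta) - B * omega
    return theta + dt * omega, omega + dt * alpha

def simulate(acc_profile, dt=1e-3, rim_angle=math.radians(60)):
    """Integrate the bob angle for a given cart-acceleration profile.
    Returns the peak |theta| and whether the 'coffee spilled', i.e. the
    bob passed the rim of its semicircular path (assumed at 60 degrees)."""
    theta, omega, peak = 0.0, 0.0, 0.0
    for a in acc_profile:
        theta, omega = pendulum_step(theta, omega, a, dt)
        peak = max(peak, abs(theta))
        if peak > rim_angle:
            return peak, True
    return peak, False

# A gentle versus an abrupt point-to-point movement: accelerate, then brake.
n = 1000  # 1 s of simulated movement at dt = 1 ms
gentle = [0.5] * (n // 2) + [-0.5] * (n // 2)  # 0.5 m/s^2
abrupt = [8.0] * (n // 2) + [-8.0] * (n // 2)  # 8 m/s^2
```

With these assumed parameters the gentle profile keeps the bob well below the rim, while the abrupt one drives it past 60° and "spills" — a toy version of the energy-margin trade-off the studies found.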
Dagmar Sternad received her BS in Movement Science and Linguistics from the Technical University and Ludwig Maximilians University of Munich and her PhD in Experimental Psychology from the University of Connecticut. From 1995 until 2008, she was Assistant, Associate, and Full Professor at the Pennsylvania State University in Integrative Biosciences and Kinesiology. Since 2008, she has held an interdisciplinary appointment as full professor in the departments of Biology, Electrical and Computer Engineering, and Physics at Northeastern University in Boston. She is a member of the Center for Interdisciplinary Research on Complex Systems at Northeastern. Her research is documented in over 100 peer-reviewed publications, book chapters, and several books. She has held editorial appointments at several scientific journals and was a regular member of an NIH study section. Her research has been continuously supported by the National Institutes of Health, the National Science Foundation, the American Heart Association, the Office of Naval Research, and others.
Understanding Human Movement to Design Physical Human-Robot Interactions
Rehabilitative and assistive robots are often designed to mimic how patients are physically guided and aided by a therapist. However, we actually know little about how therapists interact physically with patients, what aspects of physical interaction are important, and how those interactions actually help patients recover mobility. Few studies have measured forces between two individuals during cooperative or assistive physical interactions. Nor do we know how those receiving guidance or assistance use physical interaction to modify and adapt their movements, or how their motor capabilities and constraints affect the interactions. Without fundamental principles of physical human-human interaction (pHHI), we cannot establish appropriate design criteria for physical human-robot interaction (pHRI). I will present examples of our work examining physical interaction at the hands during gait in both pHRI and pHHI. We leveraged our experience with rehabilitative partner dance as a model for understanding pHHI because partner dance offers several advantages: motor goals are explicit and can be quantitatively evaluated, pHHI can occur without vision, and the pHHI is assistive and rehabilitative. We show that surprisingly low interaction forces can convey information about motor intent, online motor performance, and the level of skill of the partner. Moreover, improvements in coordination between partners were more likely to occur when partners were of similar skill level. Our work begins to establish an experimental framework for establishing criteria for successful and intuitive pHRI for gait assistance and rehabilitation, based on understanding how humans sense, move, and interact physically.
Dr. Ting studied mechanical engineering at the University of California at Berkeley (BS) and at Stanford University (MSE, PhD). Her postdoctoral training was in neurophysiology at the University of Paris V and Oregon Health & Science University. Her research in neuromechanics focuses on complex, whole-body movements such as walking and balance in healthy and neurologically impaired individuals, as well as skilled movements involved in dance and sport. By drawing from neuroscience, biomechanics, rehabilitation, computation, robotics, and physiology, her lab has discovered exciting new principles of human movement. Her work has revealed principles of sensorimotor control for gait and balance and how they change in stroke, spinal cord injury, Parkinson's disease, and with rehabilitation and training. Her work forms a foundation that researchers around the world are using to understand normal and impaired movement control in humans and animals, as well as to develop better robotic devices that interact with people. She is currently Professor in the W. H. Coulter Department of Biomedical Engineering at Emory University and Georgia Institute of Technology and the Department of Rehabilitation Medicine, Division of Physical Therapy, at Emory University, and Co-Director of the Georgia Tech Neural Engineering Center.
Control Strategies and Neural Mechanisms Underlying Human Physical Interaction
In the past few decades, research in human motor neuroscience has made significant progress by providing experimental evidence and theoretical frameworks about mechanisms underlying movement planning, execution, and adaptation. However, it is only recently that motor neuroscience research has started to investigate the mechanisms underlying physical interaction between two human agents. Mechanisms underlying motor control by single and dual human agents share the same biomechanical and neural features. However, physical interaction between two agents poses challenges that are not normally encountered when one agent plans and executes movements of his/her own body segments: an agent cannot predict the actions/reactions of the other agent to the same extent to which an agent can predict the consequences of motor commands on the movement of his/her own limb. In my talk I will give an overview of insights generated by human motor control research, discuss how these can be leveraged to study and understand human physical interaction, and present recent evidence about neural mechanisms mediating joint actions.
Marco Santello received a Bachelor's degree in Kinesiology from the University of L'Aquila, Italy, in 1990 and a Doctoral degree in Sport and Exercise Science from the University of Birmingham (U.K.) in 1995. After a post-doctoral fellowship at the Department of Physiology (now Neuroscience) at the University of Minnesota, he joined the Department of Kinesiology at Arizona State University (ASU) (1999-2010). He is currently Professor of Biomedical Engineering, Director, and Harrington Endowed Chair at the School of Biological and Health Systems Engineering. His main research interests are motor control, learning, haptics, and multisensory integration. His Neural Control of Movement laboratory uses complementary research approaches, ranging from non-invasive neuromodulation (transcranial magnetic stimulation) to motion tracking, electroencephalography, and virtual reality environments. His work (100+ publications) has been published in neuroscience and engineering journals, and has been supported by the National Institutes of Health, the National Science Foundation, DARPA, the Whitaker Foundation, the Mayo Clinic, and Google. He has served as a grant reviewer for US and European funding agencies, and as a member of the Editorial Board of IEEE Transactions on Haptics and The Journal of Assistive, Rehabilitative and Therapeutic Technologies. He is a member of the Society for Neuroscience, the Society for the Neural Control of Movement, and IEEE.
Human Sensorimotor Adaptation During Physical Human-Robot Collaboration
Jan Babic is the PI of the Laboratory of Neuromechanics and Biorobotics at the "Jožef Stefan" Institute, Slovenia, and an Associate Professor at the Jožef Stefan International Postgraduate School, Slovenia. His research is particularly concerned with understanding how the human brain controls movement of the body and with using this knowledge to design biologically plausible solutions for a broad spectrum of robotic systems such as industrial robots, humanoids, exoskeletons, and rehabilitation devices. He is currently involved in two large European projects: H2020 SPEXOR, as coordinator, and H2020 AnDy, as principal investigator. He is also a member of the Management Committee and a Work Group leader of the COST Action on Wearable Robots for Augmentation, Assistance or Substitution of Human Motor Functions.
Realistic Human Modeling for Exoskeleton Design in the SPEXOR Project
In this talk I will discuss how good models of human movement, which take elementary physical principles as well as human motion capture recordings into account, can be very helpful in designing better exoskeletons. In the European project SPEXOR, the goal is to design passive and active spinal exoskeletons to support all kinds of motion that are challenging for the lower back. At the same time, the exoskeleton should not obstruct any everyday motions and should be comfortable to wear. We combine human and exoskeleton models so that the interaction between human and robot can also be analyzed and improved. I will present the latest contributions of my group to the design of the active exoskeleton.
Dr. Katja Mombaur is a professor at the University of Heidelberg and head of the Optimization in Robotics & Biomechanics (ORB) group of the Interdisciplinary Center for Scientific Computing (IWR), as well as of the IWR Robotics Lab. Her research interests include model-based optimization for studying movements of humans in medical applications (exoskeletons, orthoses and prostheses, as well as functional electrical stimulation), sports, and the cognitive sciences, as well as of humanoid robots and other types of dynamic robots. On the mathematical side, she is interested in optimal control, inverse optimal control, non-smooth optimization, and multibody system modelling algorithms. She holds a diploma degree in Aerospace Engineering from the University of Stuttgart and a Ph.D. degree in Mathematics from Heidelberg University. She was a postdoctoral researcher in the Robotics Lab at Seoul National University, South Korea. She also spent two years as a visiting researcher in the Robotics department of LAAS-CNRS in Toulouse. Katja Mombaur is founding chair of the IEEE RAS technical committee on Model-Based Optimization for Robotics. She is currently coordinator of the EU project KoroiBot and PI in the EU project MOBOT. She led the EU project ECHORD – GOP and was PI in the Heika-Exo project. In addition, she is PI in the Graduate School HGS MathComp at IWR.
Atalante: A Balanced Dynamic Walking Exoskeleton for Paraplegic Patients
Wandercraft has developed Atalante, the first autonomous exoskeleton with dynamic walking capabilities. Atalante allows spinal cord injury (SCI) patients to stand up and walk without the use of crutches, getting them closer to leading an ordinary life. As Wandercraft gets ready to market Atalante to rehabilitation centers in Europe, we will showcase its capabilities and how they can help physicians, physiotherapists, and foremost patients. We will also discuss the strong link between the latest advances in the humanoid robotics field, and our current and future work.
Stanislas Brossette received the BS degree in Mechanical Engineering from the University of Pierre and Marie Curie, Paris 6, in 2009, and the MS degree in Computational Mechanics from the École Normale Supérieure de Cachan, France, in 2011. He then obtained the Ph.D. degree in Robotics in 2016 from the Université de Montpellier, France, after spending four years with the Laboratory of Informatics, Robotics, and Microelectronics, Montpellier, France (LIRMM) and the CNRS-AIST Joint Robotics Laboratory (JRL), UMI3218/CRT, Tsukuba, Japan. He then spent one year working as a post-doctoral fellow on the topic of bipedal walking at INRIA Grenoble, France. He currently works at Wandercraft in Paris as a Control-Command R&D Engineer, where he develops walking algorithms for the Atalante autonomous exoskeleton. His research interests include multi-contact whole-body posture generation, numerical optimization, and perception for robotics.
Estimating, Modeling and Predicting Human Motion During Human-Robot Collaboration
Improved understanding and modeling of human movement can be used to teach robots to perform tasks, to allow robots to safely and intuitively interact with humans, and to provide assessment and appropriate assistance to restore and facilitate movement. In this talk, I will describe tools for human motion measurement and analysis suitable for on-line applications. In the first part of the talk, I will describe a method for on-line pose estimation that exploits the geometry of the skeletal structure and motion space and can be applied to positional or inertial sensors. In the second part of the talk, I will introduce an inverse optimal control approach for motion modeling. The proposed method creates a generative model of the motion that can be used for motion segmentation and prediction. The approach will be demonstrated in applications focusing on physical human-robot collaboration in manufacturing settings.
Dana Kulić received the combined B.A.Sc. and M.Eng. degrees in electromechanical engineering, and the Ph.D. degree in mechanical engineering from the University of British Columbia, Canada, in 1998 and 2005, respectively. From 2006 to 2009, she was a JSPS Postdoctoral Fellow and a Project Assistant Professor at the Nakamura Laboratory at the University of Tokyo. She is currently an Associate Professor at the Electrical and Computer Engineering Department at the University of Waterloo, Canada. In 2014, she was awarded Ontario’s Early Researcher award for her work on rehabilitation and human-robot interaction. Her research interests include human motion analysis, robot learning, humanoid robots, and human-machine interaction.
Machine Learning for Human-in-the-Loop Optimization of Soft Wearable Robots
Wearable robotic devices have been shown to reduce the energy expenditure of human walking. However, response variance between participants for fixed control strategies can be high, leading to the hypothesis that individualized controllers could further improve walking economy. We developed a method that can rapidly identify optimal control parameters in a multi-dimensional space to minimize the metabolic cost of walking. The method uses Bayesian optimization, a sample-efficient and noise-tolerant global optimization strategy that is well suited for finding the extrema of objective functions that are noisy and expensive to evaluate. When this human-in-the-loop (HIL) Bayesian optimization was used to identify individualized parameters for a hip-only exosuit, participants reduced metabolic cost by 17.4 ± 3.2% compared with walking without the device (mean ± SEM). We further improved the method by coupling the metabolic estimation and parameter selection processes through an estimator stopping process. Using this method, a preliminary experimental study (N=2) shows that near-optimal parameters for a hip-and-ankle exosuit reduced metabolic cost by 36.0 ± 4.2% compared with walking without the device. These results support the hypothesis that individualized assistance can improve walking economy. In addition, HIL Bayesian optimization with the estimator stopping process makes more efficient use of experimental time.
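The core loop of such human-in-the-loop Bayesian optimization can be sketched as follows. This is a generic illustration, not the authors' implementation: the one-dimensional "assistance parameter", the synthetic noisy metabolic-cost function, the RBF kernel length-scale, and the lower-confidence-bound acquisition rule are all assumptions chosen to keep the sketch short.

```python
import numpy as np

rng = np.random.default_rng(0)

def metabolic_cost(p, noise=0.1):
    """Synthetic stand-in for a measured metabolic cost as a function of one
    assistance parameter p in [0, 1]; the (hidden) optimum is near p = 0.6."""
    return 10.0 * (p - 0.6) ** 2 + 3.0 + noise * rng.standard_normal()

def rbf(a, b, ls=0.2):
    """Squared-exponential kernel between two 1-D arrays of parameter values."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ls) ** 2)

def gp_posterior(X, y, Xs, noise_var=0.01):
    """Gaussian-process posterior mean and std at test points Xs,
    given noisy observations (X, y); observations are centered."""
    m = y.mean()
    K = rbf(X, X) + noise_var * np.eye(len(X))
    Ks = rbf(X, Xs)
    mu = m + Ks.T @ np.linalg.solve(K, y - m)
    cov = rbf(Xs, Xs) - Ks.T @ np.linalg.solve(K, Ks)
    return mu, np.sqrt(np.clip(np.diag(cov), 1e-12, None))

def bayes_opt(n_iter=15):
    """Minimize the noisy cost with a lower-confidence-bound acquisition:
    evaluate where the surrogate mean is low or the uncertainty is high."""
    X = np.array([0.1, 0.9])  # initial conditions to evaluate
    y = np.array([metabolic_cost(p) for p in X])
    grid = np.linspace(0.0, 1.0, 200)
    for _ in range(n_iter):
        mu, sd = gp_posterior(X, y, grid)
        p_next = grid[np.argmin(mu - sd)]
        X = np.append(X, p_next)
        y = np.append(y, metabolic_cost(p_next))
    return X[np.argmin(y)]
```

Each "evaluation" here stands in for a condition the participant actually walks in; the surrogate model lets the experiment converge on near-optimal parameters in far fewer conditions than a grid sweep would need.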
Myunghee Kim joined the Department of Mechanical and Industrial Engineering at the University of Illinois at Chicago (UIC) as an assistant professor. Her primary research focus is the development of assistive robotic devices for improving mobility and quality of life through integrative approaches combining numerical dynamic models, machine learning techniques, experimental testbeds, and controlled human-subject experiments. She received her M.S. degrees from the Korea Advanced Institute of Science and Technology (KAIST) and the Massachusetts Institute of Technology (MIT), and a Ph.D. degree from Carnegie Mellon University, and held a post-doctoral appointment at Harvard University. She was a control engineer in humanoid robotics at Samsung. She received the Best Paper Award in the Medical Devices category at ICRA 2015.
Ergonomic Control of Human-Robot Co-Manipulation
The talk will present a control approach to human-robot co-manipulation that accounts for human ergonomics and physical fatigue. To achieve adaptive and context-aware robot behaviour in physical interaction with humans and the environment, we developed a control framework that includes a hybrid interaction controller and a multi-modal human-robot interface. The robot uses this lower-level control framework in conjunction with the proposed higher-level methods that can estimate and anticipate human states related to ergonomics and physical fatigue. To this end, real-time measurement systems and dynamical models are used to monitor the human online while the task is collaboratively performed with the robot. Finally, the robot uses the proposed methods to control its own behaviour so as to offload the excessive effort of the human and ensure ergonomic working conditions.
Luka Peternel received a Ph.D. in robotics from the University of Ljubljana, Slovenia, in 2015. He conducted his Ph.D. studies at the Department of Automation, Biocybernetics and Robotics, Jožef Stefan Institute in Ljubljana from 2011 to 2015, and at the Department of Brain-Robot Interface, ATR Computational Neuroscience Laboratories in Kyoto, Japan, in 2013 and 2014. He was with the Human-Robot Interfaces and Physical Interaction Lab, Advanced Robotics, Italian Institute of Technology in Genoa, Italy, from 2015 to 2018, and is now an Assistant Professor in the Department of Cognitive Robotics at TU Delft. His research areas include human-in-the-loop robot control, physical human-robot interaction, exoskeleton control, human motor control, and computational neuroscience.
Understanding Intention in Motion for Human-Robot Collaboration
Professor Croft joined Monash in January 2018 as Dean of Engineering. Previously, she was with the University of British Columbia (UBC), where she was Senior Associate Dean, Faculty of Applied Science, Professor, Department of Mechanical Engineering, and Marshall Bauder Professor in Engineering Economics. She has a PhD in Mechanical Engineering from the University of Toronto and a Master of Applied Science from the University of Waterloo in Canada. Professor Croft is recognised internationally as an expert in the field of human-robot interaction. As principal investigator for a world-class robotics lab at UBC, she has successfully led large-scale collaborative research projects utilising robots alongside people in manufacturing, and guided multidisciplinary initiatives with General Motors, the DLR (German Aerospace Centre), and other industry partners. Professor Croft has an exceptional record of advancing women's representation and participation in engineering. Most recently, as the Natural Sciences and Engineering Research Council Chair for Women in Science and Engineering, she worked with partners in funding agencies, industry, academe, and the education system on comprehensive strategies to improve women's participation and retention in the STEM disciplines at all levels. Her outstanding contributions to education and research have earned Professor Croft considerable acclaim, including the NSERC Accelerator Award from the Natural Sciences and Engineering Research Council of Canada (2007-10), the Alan Blizzard Award from the Society for Teaching and Learning in Higher Education in 2008, the Women of Distinction Award in Education, Training and Development from the Vancouver YWCA in 2013, and a place on WXN's Top 100 Most Powerful Women in Canada list in 2014. She is a fellow of the ASME, Engineers Canada, and the Canadian Academy of Engineering.