
Dr. Jose Pons Receives National Science Foundation Award

Source: National Science Foundation

Congratulations, Dr. Pons!


The National Science Foundation (NSF) has awarded Dr. Jose Pons a National Robotics Initiative (NRI) 2.0 grant starting September 2020. Dr. Pons's award totals $1.4 million and will fund his proposal “Co-Robot Controllers for Human-Like Physical Interaction and Improved Motor Learning.”

The vision of this work is to understand the underlying mechanisms of human dyadic physical interaction that lead to improvements in motor performance and learning rates, and to integrate this knowledge into the development of new robotic controllers. The controller infrastructure and findings will be shared as open source, and educational programs will be developed to support the advancement of the field.

Award Abstract


Human-like robots, which perform shared tasks with humans in common workspaces, are increasingly applied in home and office services, in rehabilitation and skill training, and in capability enhancement. Digital technologies have given these ubiquitous robots easier and faster human-like audio-visual interaction. However, physical interaction is essential in applications that require complex physical exchanges during shared tasks, and its absence has posed a significant barrier to human-robot shared tasks. A great number of studies on physical interaction between human dyads, or pairs, have been conducted during the past decade. Several studies showed that dyadic physical interaction improves the performance of two individuals working together on shared tasks compared to working alone. Remarkably, results showed that human dyads not only perform better, but also learn new tasks faster. Therefore, understanding how human dyads physically interact can provide insight for developing human-like robots that mimic human behavior when performing shared tasks with other humans or robots. The vision of this work is to understand the underlying mechanisms of human dyadic physical interaction that lead to improvements in motor performance and learning rates, and to integrate this knowledge into the development of new robotic controllers. Additionally, the controller infrastructure and findings will be shared as open source, and educational programs will be developed to support the advancement of the field.

There are many factors that define the physical interaction between dyads, such as the interactive behavior (i.e., collaboration, competition, cooperation), the haptic connection (i.e., impedance levels), and the skill levels of the dyads (i.e., novice-novice and expert-novice). Therefore, a systematic approach that can quantitatively compare each condition is crucial. To realize the vision of this work, a novel exoskeleton-based dyadic interaction infrastructure will be implemented to study physical dyadic interaction with multiple degrees of freedom (DOF) and multiple contact points by providing virtual connections of varying and controllable impedance between the exoskeleton systems. This infrastructure will be used to reveal how the task performance and motor learning of peers in dyadic haptic interaction are affected by 1) physical interactive behaviors, 2) the impedance of the multi-joint virtual connection, and 3) the skill level of peers. Then, new human-like controllers, namely co³-robot controllers, where co³ refers to robots endowed with collaborative, competitive, and cooperative human-like interactive motor behaviors, will be synthesized based on a force-impedance adaptation model and a neural network feedback error learning model of interacting peers. Finally, the co³-robot controller will be implemented on a lower-limb exoskeleton and validated by passing a haptic Turing test, showing that the controller is indistinguishable from a human partner during dyadic physical interaction. This work has the potential to enhance existing tools and devices with a haptic communication modality, thus supporting joint physical action between humans and robots.
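To give a sense of what a "virtual connection of controllable impedance" between two exoskeletons might look like, the sketch below models the connection as a per-joint spring-damper coupling whose stiffness and damping can be raised or lowered to change how strongly the partners feel each other. This is a minimal illustrative sketch, not the project's actual controller; the function name, two-joint layout, and gain values are assumptions made for the example.

```python
import numpy as np

def virtual_connection_torques(q_a, qd_a, q_b, qd_b, K, B):
    """Torques from a hypothetical virtual spring-damper coupling two exoskeletons.

    q_a, qd_a : joint angles and velocities of partner A (one entry per DOF)
    q_b, qd_b : joint angles and velocities of partner B
    K, B      : per-DOF stiffness and damping of the virtual connection;
                larger values make the haptic link stiffer, smaller values softer.
    Returns the torque applied to A; B receives the equal and opposite torque.
    """
    tau_on_a = K * (q_b - q_a) + B * (qd_b - qd_a)
    return tau_on_a, -tau_on_a

# Example: soft vs. stiff two-DOF (hip, knee) connections between partners
q_a, qd_a = np.array([0.30, 0.10]), np.array([0.0, 0.0])
q_b, qd_b = np.array([0.35, 0.05]), np.array([0.1, -0.1])

soft = virtual_connection_torques(q_a, qd_a, q_b, qd_b,
                                  K=np.array([10.0, 10.0]), B=np.array([1.0, 1.0]))
stiff = virtual_connection_torques(q_a, qd_a, q_b, qd_b,
                                   K=np.array([200.0, 200.0]), B=np.array([10.0, 10.0]))
print("soft coupling torque on A:", soft[0])
print("stiff coupling torque on A:", stiff[0])
```

In a study like the one described above, sweeping the coupling gains would be one way to test how the impedance of the multi-joint virtual connection affects task performance and motor learning.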

Source: NSF Awards