Intelligent Electronic Skin for Human Machine Interactions

Project: Research


Description

Robots have gradually been taking on an important role in modern society, from advanced manufacturing to medical care and daily living assistance. The trend for next-generation robots is the integration of multiple smart sensing systems into one system to form multifunctional, intelligent robots. The human-machine interface (HMI) governs the interaction between individual users and robots and therefore serves as a key enabler for the development of robotics. Integrating sensing and control systems into the HMI would allow robots to execute tasks according to users' intentions, enabling applications in a wide range of fields. However, the development of such intelligent HMIs is still in its infancy. For instance, most HMIs for robot control are based on manipulation systems in which users steer robots via joysticks with the assistance of visual information. Such HMIs have two major limitations. First, the manipulation system is bulky and is operated through a joystick or similar device, which is completely different from how we perform tasks with our own bodies in real situations. Second, current HMIs lack feedback information; most of the time users can perceive only very limited feedback through the control sticks. Therefore, developing an intelligent HMI with smart sensing and feedback systems in a thin, soft, wireless and wearable format is key to realizing future intelligent interaction between humans and robots.

Here, we propose a new concept of an electronic-skin HMI built on materials and devices for thin, soft, wearable and wireless electronics, incorporating stretchable sensors and skin-integrated haptic actuators for real-time, closed-loop robotic control and feedback. The HMI system will use multi-channel sensors as manipulators to wirelessly sense activity information at various body locations (e.g., the joints) and send commands to control robots, while external physical stimuli acting on the robots (e.g., visual and tactile information) will be wirelessly fed back to the user. Such an HMI can therefore serve a broad range of applications, ranging from medical robots for contactless collection of test samples to rescue tasks in extreme conditions, among many others.
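To make the closed-loop concept above concrete, the sketch below illustrates, in simplified Python, one possible sense-command-feedback-haptics loop: multi-channel joint sensors are read, mapped to a robot command, "transmitted" over a placeholder wireless link, and the returned tactile feedback drives skin-integrated haptic actuators. All names, channel counts, and the simulated link are illustrative assumptions for exposition only, not details of the proposed system.

# Conceptual sketch of a closed-loop e-skin HMI pipeline (hypothetical interfaces).
# Real hardware drivers (stretchable strain sensors, low-power wireless link,
# skin-integrated haptic actuators) would replace the placeholder functions.

import random
import time
from dataclasses import dataclass

@dataclass
class JointReading:
    channel: int           # sensor channel, e.g. one per monitored joint
    bend_angle_deg: float  # estimated joint bend angle

def read_joint_sensors(num_channels: int = 5) -> list[JointReading]:
    """Simulate reading multi-channel stretchable strain sensors on the skin."""
    return [JointReading(ch, random.uniform(0.0, 90.0)) for ch in range(num_channels)]

def angles_to_command(readings: list[JointReading]) -> dict:
    """Map joint bend angles to a normalized robot actuation command (placeholder mapping)."""
    return {f"servo_{r.channel}": r.bend_angle_deg / 90.0 for r in readings}

def send_command_wireless(command: dict) -> dict:
    """Pretend to transmit the command and receive tactile feedback from the robot."""
    # A real system would use a low-power wireless link; here we fake a pressure reading.
    contact_pressure = sum(command.values()) / max(len(command), 1)
    return {"pressure": contact_pressure}

def drive_haptic_actuators(feedback: dict) -> None:
    """Convert robot-side tactile feedback into haptic actuation on the user's skin."""
    intensity = min(1.0, feedback["pressure"])
    print(f"haptic intensity: {intensity:.2f}")

if __name__ == "__main__":
    # A few iterations of the sense -> command -> feedback -> haptics loop.
    for _ in range(3):
        readings = read_joint_sensors()
        command = angles_to_command(readings)
        feedback = send_command_wireless(command)
        drive_haptic_actuators(feedback)
        time.sleep(0.1)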

Detail(s)

Project number: 9043160
Grant type: GRF
Status: Active
Effective start/end date: 1/01/22 → …