Soft Origami Continuum Robot Capable of Precise Motion through Machine Learning

Jian Tao, Tianheng Li, Qiqiang Hu*, Erbao Dong*

*Corresponding author for this work

Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review

Abstract

Soft origami continuum robots show great potential compared with traditional rigid robots because of their hyper-redundant deformation. However, motion control of these robots remains challenging because of their nonlinear kinematics. This paper presents a method based on the multilayer perceptron (MLP) neural network to learn the inverse kinematics of a soft origami continuum robot and make the robot follow the desired motion trajectories. The high compressibility of the origami continuum robot allows the robot to work on different surfaces with thickness variation. The data set comprises 30,240 pairs of valid data (tendon length and tip position). Validation experiments are performed based on static points and typical trajectories (circle, square, eight-shaped curve, lines, and heptagonal spatial curve). Results show that the soft origami robot can achieve precise motion control through the MLP neural network without any sensory feedback. Additionally, the study shows the generalization ability of the developed MLP neural network to move in the workspace outside the data set. The robot has an average position error of approximately 3 mm (1.75% relative to the robot's length) over the workspace. © 2024 IEEE.
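The record does not include code, but the approach described, supervised learning of inverse kinematics from (tendon length, tip position) pairs, can be sketched in a few lines. The snippet below is a minimal illustration assuming a PyTorch-style MLP; the tendon count, layer sizes, and training hyperparameters are illustrative assumptions, not the architecture reported in the paper.

```python
# Minimal sketch (not the authors' code): an MLP that maps a desired tip
# position (x, y, z) to tendon lengths, trained by supervised regression.
# Tendon count, layer widths, and hyperparameters are assumptions.
import torch
import torch.nn as nn

N_TENDONS = 3          # assumed number of driving tendons
HIDDEN = 128           # assumed hidden-layer width


class InverseKinematicsMLP(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(3, HIDDEN), nn.ReLU(),
            nn.Linear(HIDDEN, HIDDEN), nn.ReLU(),
            nn.Linear(HIDDEN, N_TENDONS),   # predicted tendon lengths
        )

    def forward(self, tip_position):
        return self.net(tip_position)


def train(model, tip_positions, tendon_lengths, epochs=200, lr=1e-3):
    """Fit the inverse-kinematics map: tip position -> tendon lengths."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(tip_positions), tendon_lengths)
        loss.backward()
        opt.step()
    return model


if __name__ == "__main__":
    # Stand-in random data with the same shape as the paper's dataset of
    # 30,240 (tendon length, tip position) pairs; real samples would come
    # from sweeping the robot's actuation space and recording tip positions.
    n_samples = 30240
    tips = torch.rand(n_samples, 3)
    tendons = torch.rand(n_samples, N_TENDONS)
    model = train(InverseKinematicsMLP(), tips, tendons)
    # Query: tendon lengths predicted for a desired tip position
    print(model(torch.tensor([[0.5, 0.5, 0.5]])))
```

In use, the trained network would be queried at each waypoint of a desired trajectory (circle, square, eight-shaped curve, etc.) to produce tendon-length commands open loop, which matches the paper's claim of precise motion without sensory feedback.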
Original language: English
Pages (from-to): 1034-1041
Journal: IEEE Robotics and Automation Letters
Volume: 10
Issue number: 2
Online published: 13 Dec 2024
DOIs
Publication status: Published - Feb 2025

Research Keywords

  • inverse kinematics
  • neural network
  • origami robots
  • soft robotics
