
Synthetic X‑ray‑driven tracking and control of miniature medical devices

Chunxiang Wang, Wenbin Kang*, Mengmeng Sun, Hongchuan Zhang, Chong Hong, Sinan Ozgun Demir, Halim Ugurlu, Kun Hao, Zemin Liu, Tianlu Wang*, Metin Sitti*

*Corresponding author for this work

Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review

Abstract

The clinical translation of miniature medical devices (MMDs) for minimally invasive surgery promises transformative advances in biomedical engineering, offering enhanced precision, reduced patient trauma and faster recovery times. However, their effective deployment in complex anatomies under real-time X-ray guidance—a widely used surgical imaging modality—presents challenges such as low imaging quality and difficulties in spatial control of MMDs. Manual identification and operation are labour intensive and error prone. Meanwhile, deep learning-based automation is limited by the scarcity of annotated X-ray datasets of MMDs owing to costly data collection, laborious annotation and privacy constraints. Here we introduce MicroSyn-X, a framework for training computer vision models to enable robotic teleoperation of MMDs using synthesized high-fidelity, pixel-accurate, auto-labelled and domain-randomized X-ray images, eliminating manual data curation. Integrating MicroSyn-X into a teleoperated robotic system enables real-time localization and navigation of magnetic soft and magnetic liquid MMDs within both ex vivo and dynamic in vivo environments, demonstrating robustness under challenging imaging conditions of low contrast, high noise and occlusion. We also open-source the X-ray MMD dataset to enable benchmarking. By addressing data scarcity and enabling real-time robotic navigation, this work advances MMD-assisted minimally invasive surgery towards next-generation precision interventions. © The Author(s) 2026.
Original language: English
Pages (from-to): 276-291
Number of pages: 16
Journal: Nature Machine Intelligence
Volume: 8
Issue number: 2
Online published: 23 Feb 2026
Publication status: Published - Feb 2026

Funding

We gratefully acknowledge B. Demirhan, Z. Fouladivanda, G. Michailidis, V. Theodori and D.E.T. Şanlı for their contributions in providing clinical expert annotations. Funding: C.W., C.H., S.O.D., Z.L., T.W., H.Z. and M. Sitti received funding from the Max Planck Society, European Research Council Advanced Grant SoMMoR project (grant number 834531) and the European Research Council Proof of Concept STENTBOT project (grant number 101100727). W.K. received funding from the European Union’s Horizon 2022 research and innovation program under the Marie Skłodowska-Curie Postdoctoral Fellowship (grant agreement number 101109050) and start-up funding (9610735) from the City University of Hong Kong. M. Sun received start-up funding (A-0010108-00-00) from the National University of Singapore. H.U. received funding from the Zentrum für Radiologie Heilbronn. K.H. received funding from Shandong University. T.W. received start-up funding from the University of Hawaiʻi at Mānoa. M. Sitti received funding from the German Research Foundation Soft Material Robotic Systems (SPP 2100) Program (grant number 2197/5-1). C.W., Z.L. and M. Sitti received funding from the Max Planck Queensland Center for the Materials Science of Extracellular Matrices.

Publisher's Copyright Statement

  • This full text is made available under CC-BY 4.0. https://creativecommons.org/licenses/by/4.0/
