In this paper, we introduce Haptic-ACT, a robotic system for pseudo-oocyte manipulation that integrates multimodal information with Action Chunking with Transformers (ACT). Traditional automation methods for oocyte transfer rely heavily on visual perception and often require human supervision because of biological variability and environmental disturbances. Haptic-ACT extends ACT with haptic feedback, enabling real-time grasp-failure detection and adaptive correction. We also introduce a 3D-printed soft gripper made of thermoplastic polyurethane (TPU) to facilitate delicate manipulation. Experimental results demonstrate that Haptic-ACT improves task success rate, robustness, and adaptability over conventional ACT, particularly in dynamic environments. These findings highlight the potential of multimodal learning for biomedical robotic automation.
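As a rough illustration of the grasp-failure detection idea described above, the hypothetical sketch below flags a failure when gripper force readings stay below a threshold for several consecutive samples. This is not the paper's implementation; the function name, threshold, and window size are all invented for illustration.

```python
# Hypothetical sketch (not the paper's method): flag a grasp failure when the
# last `window` force readings (in newtons) all fall below `min_force`,
# suggesting the object has slipped out of the gripper. Values are invented.

def detect_grasp_failure(forces, min_force=0.05, window=5):
    """Return True if the most recent `window` readings indicate a slip."""
    if len(forces) < window:
        return False  # not enough history to decide yet
    return all(f < min_force for f in forces[-window:])

# A stable grasp keeps the force above the threshold...
print(detect_grasp_failure([0.30, 0.31, 0.29, 0.30, 0.28]))        # False
# ...while a sustained drop signals a failed grasp.
print(detect_grasp_failure([0.30, 0.02, 0.01, 0.00, 0.01, 0.02]))  # True
```

In a system like the one described, such a signal could trigger an adaptive corrective action (e.g., reopening the gripper and re-grasping) rather than continuing the planned trajectory.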
@inproceedings{uriguen2025hapticact,
author={Uriguen Eljuri, Pedro Miguel and Shibata, Hironobu and Maeyama, Katsuyoshi and Jia, Yuanyuan and Taniguchi, Tadahiro},
title={Haptic-ACT - Pseudo Oocyte Manipulation by a Robot Using Multimodal Information and Action Chunking with Transformers},
year={2025},
note={Under review}
}
This work was supported by the Japan Science and Technology Agency (JST) Moonshot Research and Development Program, Grant Number JPMJMS2033.