Taccel: Scaling Up Vision-based Tactile Robotics via High-performance GPU Simulation
Yuyang Li 1,3,4,5,*, Wenxin Du 2,*, Chang Yu 2,*, Puhao Li 3,4, Zihang Zhao 1,4,5, Tengyu Liu 3,4, Chenfanfu Jiang 2,†, Yixin Zhu 1,4,5,6,†, Siyuan Huang 3,4,†
1 Peking University
2 University of California, Los Angeles
3 Beijing Institute for General AI
4 State Key Lab of General AI
5 Beijing Key Laboratory of Behavior and Mental Health, Peking University
6 Embodied Intelligence Lab, PKU-Wuhan Institute for Artificial Intelligence
* Equal Contribution † Corresponding Author
Taccel is a high-performance simulation platform for vision-based tactile sensors and robots.

Taccel integrates Incremental Potential Contact (IPC) and Affine Body Dynamics (ABD) to model robots, tactile sensors, and objects with both accuracy and unprecedented speed. It simulates precise physics and realistic tactile signals, and supports flexible robot-sensor configurations through user-friendly APIs.
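As a rough illustration of the workflow such APIs enable, the sketch below pairs a robot and a tactile sensor with an object, steps the coupled physics, and reads back tactile images at each step. Every name in it (the taccel module, Scene, add_robot, attach_sensor, add_object, step, get_tactile_image, the asset paths, and the "gelslim" sensor type) is a hypothetical placeholder rather than the actual Taccel API; see the Taccel Examples and Taccel References sections for the real interface.

# NOTE: Illustrative sketch only. The module name "taccel" and every class
# and method below are hypothetical placeholders, not the documented Taccel
# API; consult Taccel Examples and Taccel References for the real interface.
import taccel  # hypothetical import

# Build a scene: a robot hand carrying a vision-based tactile sensor on a
# fingertip, plus an object to grasp. Soft sensor gel is resolved with IPC
# for penetration-free contact, while near-rigid bodies use ABD.
scene = taccel.Scene(device="cuda:0")
hand = scene.add_robot("assets/hand.urdf")
sensor = scene.attach_sensor(hand, link="fingertip", sensor_type="gelslim")
mug = scene.add_object("assets/mug.obj", dynamics="abd")

# Roll out the simulation and read back tactile signals at every step.
for t in range(200):
    scene.step()                          # advance the coupled IPC/ABD physics
    tactile = sensor.get_tactile_image()  # rendered tactile frame for learning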
Examples
Our release comes with several examples. See Taccel Examples for more details.
Contributing
We welcome contributions from the community to improve Taccel and make it even more useful for everyone. You can submit pull requests, report issues, or suggest features via our GitHub repo.
For more details and instructions, please follow the contributing guide.
Citing
If you use Taccel in your research, please use the following citation:
@article{li2025taccel,
  title={Taccel: Scaling Up Vision-based Tactile Robotics via High-performance GPU Simulation},
  author={Li, Yuyang and Du, Wenxin and Yu, Chang and Li, Puhao and Zhao, Zihang and Liu, Tengyu and Jiang, Chenfanfu and Zhu, Yixin and Huang, Siyuan},
  journal={arXiv preprint arXiv:2504.12908},
  year={2025}
}
Index
Get Started
Warp-IPC References
Taccel References
Development