Taccel: Scaling Up Vision-based Tactile Robotics via High-performance GPU Simulation

Yuyang Li1,2 *, Wenxin Du3 *, Chang Yu3 *, Puhao Li2, Zihang Zhao1, Tengyu Liu2, Chenfanfu Jiang3 †, Yixin Zhu1 †, Siyuan Huang2 †

1 Institute for AI, PKU 2 State Key Lab of General AI, BIGAI 3 AIVC Lab, UCLA

* Equal contribution † Corresponding authors

[📄 Paper] [📘 Docs] [🛠️ Code] [📄 Dataset (Coming soon)]

Taccel is a high-performance simulation platform for vision-based tactile sensors and robots.

Taccel Infrastructure

Taccel integrates Incremental Potential Contact (IPC) and Affine Body Dynamics (ABD) to model robots, tactile sensors, and objects with both high accuracy and unprecedented speed. Through user-friendly APIs, it simulates precise physics, realistic tactile signals, and flexible robot-sensor configurations.

Examples

Our release comes with several examples. See Taccel Examples for more details.

Contributing

We welcome contributions from the community to improve Taccel and make it even more useful for everyone. You can submit pull requests, report issues, or suggest features via our GitHub repo.

For more details and instructions, please follow the contributing guide.

Citing

If you use Taccel in your research, please cite it as follows:

@article{li2025taccel,
  title={Taccel: Scaling Up Vision-based Tactile Robotics via High-performance GPU Simulation},
  author={Li, Yuyang and Du, Wenxin and Yu, Chang and Li, Puhao and Zhao, Zihang and Liu, Tengyu and Jiang, Chenfanfu and Zhu, Yixin and Huang, Siyuan},
  journal={arXiv preprint arXiv:2504.12908},
  year={2025}
}
