SynAgent: Generalizable Cooperative Humanoid Manipulation via Solo-to-Cooperative Agent Synergy

Wei Yao¹*, Haohan Ma²*, Hongwen Zhang³, Yunlian Sun¹, Liangjun Xing⁴, Zhile Yang², Yuanjun Guo², Yebin Liu⁴, Jinhui Tang⁵
* Equal Contribution (Co-first Authors)
¹Nanjing University of Science and Technology  ²Shenzhen Institutes of Advanced Technology, CAS  ³Beijing Normal University  ⁴Tsinghua University  ⁵Nanjing Forestry University

SynAgent generalizes across diverse object geometries and supports cooperative manipulation via Solo-to-Cooperative Agent Synergy.

Abstract

Controllable cooperative humanoid manipulation is a fundamental yet challenging problem for embodied intelligence, owing to severe data scarcity, the complexity of multi-agent coordination, and limited generalization across objects. In this paper, we present SynAgent, a unified framework that enables scalable and physically plausible cooperative manipulation by leveraging Solo-to-Cooperative Agent Synergy to transfer skills from single-agent human-object interaction to multi-agent human-object-human scenarios. To preserve semantic integrity during motion transfer, we introduce an interaction-preserving retargeting method based on an Interact Mesh constructed via Delaunay tetrahedralization, which faithfully maintains spatial relationships between humans and objects. Building on this refined data, we propose a single-agent pretraining and adaptation paradigm that bootstraps synergistic collaborative behaviors from abundant single-human data through decentralized training and multi-agent PPO. Finally, we develop a trajectory-conditioned generative policy based on a conditional VAE, trained via multi-teacher distillation from motion-imitation priors to achieve stable and controllable object-level trajectory execution. Extensive experiments demonstrate that SynAgent significantly outperforms existing baselines in both cooperative imitation and trajectory-conditioned control, while generalizing across diverse object geometries. Code and data will be released after publication.
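To give a concrete sense of the Interact Mesh idea, the sketch below tetrahedralizes the combined vertex sets of a human and an object with `scipy.spatial.Delaunay`. This is an illustrative toy example, not the authors' implementation: the point clouds, counts, and the human/object split are placeholder assumptions. Tetrahedra whose vertices span both bodies are the ones that encode human-object spatial relationships.

```python
import numpy as np
from scipy.spatial import Delaunay

# Stand-in vertex sets (placeholders, not real mesh data):
# a "human" point cloud and a nearby "object" point cloud in 3D.
rng = np.random.default_rng(0)
human_verts = rng.normal(loc=[0.0, 0.0, 1.0], scale=0.3, size=(50, 3))
object_verts = rng.normal(loc=[0.5, 0.0, 1.0], scale=0.1, size=(20, 3))

# Tetrahedralize the union of both vertex sets. In 3D, Delaunay
# returns simplices of 4 vertex indices each (tetrahedra).
points = np.vstack([human_verts, object_verts])
tet = Delaunay(points)

# Mark which global vertex indices belong to the object, then find
# "interaction" tetrahedra that mix human and object vertices --
# these are the simplices that tie the two bodies together spatially.
is_object = np.arange(len(points)) >= len(human_verts)
labels = is_object[tet.simplices]                     # (n_tets, 4) booleans
mixed = labels.any(axis=1) & (~labels).any(axis=1)    # spans both sets

print(f"{int(mixed.sum())} interaction tetrahedra out of {len(tet.simplices)}")
```

Preserving the shape of these mixed tetrahedra during retargeting is one plausible way such a structure could constrain relative human-object placement, which is the role the abstract attributes to the Interact Mesh.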


Method Overview

Demo Video

BibTeX


Coming soon.