AgileMaster - Humanoid Teleoperation Solution

Robot-Ready Demonstration Data for Cobot & Humanoid VLA Training

Built for teleoperation and demonstration data collection, the j-mex IMU MoCap Platform delivers structured, high-fidelity human motion data for Isaac GR00T workflows and VLA training pipelines.


Unlock the Potential of Embodied AI and Humanoid Robots

The j-mex AgileMaster humanoid teleoperation solution combines full-body motion capture with parameterized robot joint mapping, offering a streamlined way to rapidly enable robot teleoperation, control simulation, bionic control, VLA data augmentation, and digital twin applications.

j-mex is committed to working with our partners to shape the next generation of human-robot collaboration and unlock the limitless potential of Embodied AI.


Rapidly Collect High-Quality Human Demonstration Data

High-quality human demonstration data is the cornerstone of Embodied AI and Imitation Learning. j-mex AgileMaster is purpose-built to solve the three major pain points of traditional methods: data quality, conversion efficiency, and development speed.


Real-Time High-Fidelity Data Source

The Problem:
Most traditional mocap systems only track end-effector poses, resulting in incomplete data that falls short for complex Embodied AI tasks.
j-mex Data Advantage:
j-mex provides full-body, multi-joint tracking, capturing precise and complete motion data. This ensures your demonstration data has extremely high fidelity, providing a solid foundation for advanced AI training.

Parameterized Joint Structure Mapping

The Problem:
There’s a significant physical and structural gap between human motion data (Avatar Motion) and robot motion control (Robot Motion). Traditional workflows consume extensive time on manual conversion or custom development.
j-mex Solution:
Our proprietary Human-to-Robot Mapping algorithm solves the complex conversion challenge from freely-moving human joints to physically-constrained robot joints. For arm control in particular, you can quickly switch between end-effector mode and humanoid-arm mode, with coordinates and motion data output directly via the ROS2 protocol — ready for robot control.
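The actual Human-to-Robot Mapping algorithm is proprietary, but the core constraint it resolves can be sketched in a few lines: a human joint moves freely, while each robot joint has hard physical limits that any commanded angle must respect. The joint names and limit values below are illustrative assumptions, not a real robot's specification.

```python
import math

# Hypothetical joint-limit table (radians) for an illustrative 3-DoF arm;
# in practice the limits come from the target robot's URDF.
ROBOT_LIMITS = {
    "shoulder_pitch": (-math.pi / 2, math.pi / 2),
    "elbow_flex": (0.0, 2.4),
    "wrist_roll": (-math.pi, math.pi),
}

def retarget(human_angles: dict[str, float]) -> dict[str, float]:
    """Clamp freely-moving human joint angles into the robot's physical limits."""
    out = {}
    for joint, angle in human_angles.items():
        lo, hi = ROBOT_LIMITS[joint]
        out[joint] = min(max(angle, lo), hi)
    return out

# A human elbow flexed to 2.9 rad exceeds the robot's 2.4 rad limit and is clamped.
command = retarget({"shoulder_pitch": 0.3, "elbow_flex": 2.9, "wrist_roll": -4.0})
```

A production mapping also accounts for axis conventions, link-length differences, and velocity limits; this sketch shows only the position-limit step.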

Simplified Cobot & Manipulator Setup

The Problem:
Traditional setup requires programming or graphical editing, followed by observing robot arm test runs, then iterating on adjustments and path optimization.
j-mex Efficiency:
By combining Teleoperation with simulation control in Isaac Sim, you can rapidly converge on optimal operation paths, then convert them into automated routines — dramatically simplifying the complexity of robot arm automation setup.

Three Core Capabilities to Accelerate Development

To drive the rapid advancement of robotics and Embodied AI, j-mex AgileMaster focuses on three key capabilities:

End-to-End Teleoperation
Delivers stable, high-quality operational data with low-latency, smooth remote control. Fully supports real-time control and complex human-robot interaction across multiple robot platforms. Out-of-the-box support for various Cobot Arms and Humanoids, with custom integration services available.
Motion Retargeting
Our proprietary Motion Retargeting algorithm converts human skeletal data to multi-axis robot Joint Control signals in real time with high precision. Built-in Skeleton Topology Switching lets users instantly switch between Humanoid and Cobot Arm modes with a single click.
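Conceptually, skeleton topology switching selects which captured human joints feed which robot joints. The sketch below illustrates the idea with hypothetical joint names and mode tables; the actual j-mex mapping tables are proprietary.

```python
# Illustrative topology tables: each mode maps a subset of the captured human
# skeleton onto the target robot's joint namespace. All names are assumptions.
TOPOLOGIES = {
    "humanoid": {
        "human_left_shoulder": "left_shoulder_pitch",
        "human_left_elbow": "left_elbow",
        "human_right_shoulder": "right_shoulder_pitch",
        "human_right_elbow": "right_elbow",
    },
    "cobot_arm": {
        # A single-arm cobot consumes only the operator's right arm.
        "human_right_shoulder": "joint_1",
        "human_right_elbow": "joint_2",
    },
}

def switch_topology(skeleton: dict[str, float], mode: str) -> dict[str, float]:
    """Remap one frame of human joint angles to the active robot topology."""
    mapping = TOPOLOGIES[mode]
    return {robot: skeleton[human] for human, robot in mapping.items()}

frame = {
    "human_left_shoulder": 0.1, "human_left_elbow": 0.5,
    "human_right_shoulder": -0.2, "human_right_elbow": 1.1,
}
humanoid_cmd = switch_topology(frame, "humanoid")  # 4 joint targets
cobot_cmd = switch_topology(frame, "cobot_arm")    # 2 joint targets
```

The same captured frame drives either target; switching modes is just swapping the active table, which is what makes a one-click toggle possible.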
Omniverse Ecosystem Integration
Supports industry standards including URDF and ROS2. Deep integration with Omniverse, Isaac Sim, and Isaac Lab — easily plug into the GR00T Mimic pipeline or even a GR00T VLA training flow.
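Because URDF is a plain XML standard, the same robot description that Isaac Sim imports can also feed a retargeting layer. As a minimal sketch, here is how joint limits could be read from a URDF; the toy two-joint arm below is written purely for illustration.

```python
import xml.etree.ElementTree as ET

# Toy URDF fragment (illustrative only) describing a two-joint revolute arm.
URDF = """
<robot name="toy_arm">
  <joint name="shoulder" type="revolute">
    <limit lower="-1.57" upper="1.57" effort="50" velocity="2.0"/>
  </joint>
  <joint name="elbow" type="revolute">
    <limit lower="0.0" upper="2.4" effort="30" velocity="2.0"/>
  </joint>
</robot>
"""

def joint_limits(urdf_xml: str) -> dict[str, tuple[float, float]]:
    """Extract (lower, upper) position limits for each revolute joint."""
    root = ET.fromstring(urdf_xml)
    limits = {}
    for joint in root.iter("joint"):
        if joint.get("type") == "revolute":
            lim = joint.find("limit")
            limits[joint.get("name")] = (float(lim.get("lower")),
                                         float(lim.get("upper")))
    return limits

print(joint_limits(URDF))  # → {'shoulder': (-1.57, 1.57), 'elbow': (0.0, 2.4)}
```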

j-mex AgileMaster
Humanoid Teleoperation Solution Operational Architecture

[Operational architecture diagram]

Powering Diverse Robot Applications

Teleoperation
Ideal for hazardous environments and high-precision tasks — ensuring operator safety and accurate mission execution.
Embodied AI Training
Rapidly generate high-quality, high-fidelity demonstration data for VLA (Vision-Language-Action) training and Imitation Learning.
Industrial Digital Twin
Combine Teleoperation with Isaac Sim/Lab to synchronize real robot states in a virtual environment in real time — enabling digital twin monitoring, operation validation, and workflow optimization for production lines, while reducing physical testing risks and costs.
Learning from Demonstration
Enable manipulators or collaborative robot arms (Cobots) to learn intuitively from direct human motion demonstrations — dramatically simplifying programming complexity.

From Capture to Autonomous Execution: 4 Steps to Train Your Robot

Four core steps to train robots using j-mex AgileMaster:

Step 1: Motion Data Capture

A human operator wearing the V100-R suit controls the robot in real time, capturing precise motion data.
Step 2: Sensor Data Collection

The robot collects sensor data during teleoperation, enhancing its understanding of motion and the environment.
Step 3: Policy Learning

Motion and sensor data are combined for deep learning, refining the robot’s movement patterns.
Step 4: Autonomous Execution

The robot autonomously performs learned actions under various conditions, powered by its trained policy.
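The data preparation behind step 3 can be sketched simply: teleoperation commands and robot sensor observations arrive as separate timestamped streams and must be aligned into the (observation, action) pairs that imitation learning consumes. This is our own illustrative sketch, not j-mex's actual pipeline, and the tolerance value is an assumption.

```python
def pair_demonstrations(observations, actions, tolerance=0.005):
    """Match each action to the nearest-in-time observation within tolerance.

    observations / actions: lists of (timestamp_seconds, payload),
    both sorted by timestamp. Actions with no observation close enough
    in time are dropped rather than mis-paired.
    """
    pairs = []
    for t_act, act in actions:
        nearest = min(observations, key=lambda obs: abs(obs[0] - t_act))
        if abs(nearest[0] - t_act) <= tolerance:
            pairs.append((nearest[1], act))
    return pairs

obs = [(0.000, "img_0"), (0.010, "img_1"), (0.020, "img_2")]
acts = [(0.001, [0.1, 0.2]), (0.011, [0.1, 0.3]), (0.040, [0.2, 0.3])]
dataset = pair_demonstrations(obs, acts)
# The 0.040 s action has no observation within 5 ms and is dropped,
# leaving two clean training pairs.
```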

Core Hardware: MoCap Suit V100-R Plus


High-Performance Motion Capture for Advanced Robotics

The V100-R Plus is a professional-grade motion capture suit specifically engineered for high-end robotic motion training and Vision-Language-Action (VLA) collaboration. Built on full-body IMU motion capture and human movement science, it transforms complex human dynamics into smooth, kinematics-based robot joint control.
  • Professional Precision: Features 15 nine-axis IMU sensors with integrated high-precision finger capture for detailed motion mapping.
  • Real-Time Performance: Delivers a 100 Hz data output rate, ensuring millisecond-level synchronization for teleoperation and imitation learning.
  • Industrial Stability: Designed with an advanced transponder architecture to provide stable, low-latency data streaming in demanding environments.
  • Seamless Integration: Native support for ROS2 (Humble and Jazzy), NVIDIA Isaac Sim, and URDF import for rapid deployment across humanoid and cobot platforms.
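At the suit's 100 Hz output rate, frames should arrive roughly 10 ms apart, so stream health can be monitored directly from arrival timestamps. The sketch below is our own assumption of such a check, not j-mex firmware; the slack factor is an illustrative choice.

```python
def count_dropped(timestamps, rate_hz=100, slack=1.5):
    """Estimate dropped frames: gaps longer than slack * nominal period."""
    period = 1.0 / rate_hz  # 0.01 s at 100 Hz
    dropped = 0
    for prev, cur in zip(timestamps, timestamps[1:]):
        gap = cur - prev
        if gap > slack * period:
            # A 30 ms gap at 100 Hz means two frames went missing.
            dropped += round(gap / period) - 1
    return dropped

ts = [0.00, 0.01, 0.02, 0.05, 0.06]  # the 0.02 → 0.05 gap hides two lost frames
print(count_dropped(ts))  # → 2
```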

FAQ

How does AgileMaster differ from traditional motion capture systems?
Traditional mocap systems output human skeletal data. j-mex AgileMaster includes our proprietary Motion Retargeting algorithm that directly outputs robot-ready joint signals (orientation and motion) via the ROS2 protocol — no extra conversion tools required.

Which robots are supported?
Out-of-the-box support covers several mainstream Humanoid robots and Cobot Arms, with built-in skeleton topology switching. You can also import custom Robot URDFs.

Does it integrate with NVIDIA Isaac Sim and Isaac Lab?
Yes! j-mex is an NVIDIA Omniverse Ready Partner. The j-mex Teleoperation Starter Kit can stream motion data directly to Isaac Sim/Lab — not only for control simulation, but also for integration with the GR00T Mimic pipeline or a VLA training flow.

Ready to Unlock the Potential of Humanoid Robots and Embodied AI?

Contact us today to experience the millisecond-level seamless conversion from human motion to robot execution firsthand, and accelerate your robotics and AI projects.