Humanoids

Guide humanoids through complex real-world tasks.


Features

Built for humanoid teleoperation at scale.

Full-body mapping & motion capture

Map operator movements to a humanoid's full kinematic chain. Walk, reach, turn, and crouch with intuitive 1:1 motion control via VR headsets or exoskeletons.
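At its simplest, 1:1 retargeting clamps each tracked operator angle to the robot's mechanical range before it becomes a position target. The sketch below is illustrative only; the joint names, limits, and `retarget_pose` function are assumptions, not a real API.

```python
# Hypothetical 1:1 motion retargeting sketch: operator joint angles
# (e.g. from a VR headset or exoskeleton) are clamped to the humanoid's
# joint limits before being sent as position targets.

JOINT_LIMITS = {
    "shoulder_pitch": (-2.0, 2.0),   # radians, example values only
    "elbow_flex":     (0.0, 2.6),
    "knee_flex":      (0.0, 2.3),
}

def retarget_pose(operator_angles: dict[str, float]) -> dict[str, float]:
    """Clamp each tracked operator angle to the robot's joint range."""
    targets = {}
    for joint, angle in operator_angles.items():
        lo, hi = JOINT_LIMITS[joint]
        targets[joint] = min(max(angle, lo), hi)
    return targets

# An over-extended elbow is clamped to the robot's mechanical limit.
print(retarget_pose({"elbow_flex": 3.1, "knee_flex": 0.5}))
```

A production retargeter would also handle velocity limits, collision avoidance, and balance constraints; this shows only the core angle mapping.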

Dexterous hand teleoperation

Precisely control individual fingers and grip force. Pick up fragile objects, turn door handles, or manipulate tools — tasks that demand human-level dexterity.

Stereoscopic VR streaming

Operators see through the robot's eyes with stereoscopic depth perception streamed at latencies as low as 40ms, low enough to avoid motion sickness and perceptible delay.

Force-feedback & haptics

Bidirectional haptic streams transmit contact forces, textures, and resistance from the robot's sensors back to the operator's hands, enabling delicate manipulation tasks.
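The return half of a bidirectional haptic loop can be pictured as a scaling and safety cap between the robot's force sensors and the operator's hands. The constants and names below are assumptions for illustration, not the product's actual parameters.

```python
# Illustrative sketch of the robot-to-operator half of a haptic loop:
# a raw fingertip force reading is scaled, then capped so the haptic
# glove never exceeds its actuator ceiling. Values are assumptions.

MAX_GLOVE_FORCE_N = 8.0   # assumed actuator ceiling on the glove
FORCE_SCALE = 0.5         # assumed robot-to-operator scaling factor

def haptic_command(sensor_force_n: float) -> float:
    """Map a robot fingertip force (newtons) to a safe glove force."""
    return min(sensor_force_n * FORCE_SCALE, MAX_GLOVE_FORCE_N)

print(haptic_command(4.0))   # light contact passes through, scaled
print(haptic_command(30.0))  # a hard impact is capped for safety
```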

Low-latency custom protocol

Our custom streaming protocol — not WebRTC — delivers glass-to-glass latency as low as 40ms, critical for the tight control loop humanoid teleoperation demands.

Imitation learning data pipeline

Every teleoperated motion is recorded as synchronized joint-angle trajectories, force readings, and visual data — ready for your imitation learning and behaviour cloning workflows.
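A single synchronized sample from such a recording might look like the sketch below. The field names and structure are assumptions for illustration; a real pipeline would also carry camera frames, calibration, and metadata.

```python
# Minimal sketch of one synchronized demonstration sample, as consumed
# by imitation learning / behaviour cloning workflows. Field names are
# illustrative assumptions, not the actual recording schema.

from dataclasses import dataclass, asdict

@dataclass
class DemoSample:
    timestamp_ns: int          # shared clock across all sensor streams
    joint_angles: list[float]  # full-body configuration (radians)
    joint_forces: list[float]  # measured torques / contact forces
    image_path: str            # reference to the synced camera frame

sample = DemoSample(
    timestamp_ns=1_700_000_000_000,
    joint_angles=[0.12, -0.4, 1.05],
    joint_forces=[0.0, 2.1, 0.3],
    image_path="frames/000001.png",
)
print(asdict(sample))  # per-timestep records stack into training trajectories
```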

Applications

Where humanoid teleoperation matters

Manufacturing & Assembly

Handle complex assembly steps, quality inspection, and rework tasks that require human judgment — without being physically present on the factory floor.

Healthcare & Assisted Living

Provide the dexterity and judgment needed for tasks like opening doors, fetching objects, and assisting patients — safely and in real time.

Data Collection for Autonomy

Generate the high-fidelity demonstration data your autonomy stack needs. Every human-guided motion becomes a training example for your models.

The teleoperation engine for humanoid robots