Implementation: Isaac Sim IsaacGymEnvs MotionLib
| Knowledge Sources | |
|---|---|
| Domains | Motion_Capture, Character_Animation |
| Last Updated | 2026-02-15 11:00 GMT |
Overview
MotionLib manages a library of motion capture clips for AMP training, providing efficient sampling, time-based interpolation, and extraction of per-frame reference states (root pose, joint DOFs, key body positions) needed by the discriminator and environment reset logic.
Description
The MotionLib class in isaacgymenvs/tasks/amp/utils_amp/motion_lib.py serves as the central motion data manager for the AMP training pipeline. It loads one or more motion capture files (NPZ format via SkeletonMotion.from_file()) and provides methods to sample motions, query arbitrary time points, and extract the full kinematic state needed for computing AMP observations and resetting environments to reference poses.
The _load_motions() method reads a motion specification file (YAML) that lists motion clip paths and optional weights, then loads each clip as a SkeletonMotion object. It pre-computes per-clip metadata including frame count, duration, FPS, and per-frame DOF velocities. Motion weights are normalized to form a probability distribution for sampling.
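The weight normalization described above can be sketched as follows. The spec entries and field names here are illustrative stand-ins for the YAML layout, not the exact file format:

```python
import numpy as np

# Hypothetical motion spec, mirroring the layout described above
# (clip paths with optional weights); field names are illustrative.
motion_spec = [
    {"file": "amp_humanoid_walk.npz", "weight": 0.5},
    {"file": "amp_humanoid_run.npz", "weight": 1.0},
    {"file": "amp_humanoid_jog.npz", "weight": 0.5},
]

# Normalize per-clip weights into a sampling distribution, as
# _load_motions() does to support weighted sampling later.
weights = np.array([m["weight"] for m in motion_spec], dtype=np.float64)
probs = weights / weights.sum()

# Weighted sampling of clip indices
rng = np.random.default_rng(0)
motion_ids = rng.choice(len(motion_spec), size=8, p=probs)
```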
The key query method get_motion_state() takes arrays of motion IDs and time values and returns interpolated kinematic state by blending between adjacent frames. It computes frame indices and blend factors via _calc_frame_blend(), then performs linear interpolation for positions and velocities and spherical linear interpolation (slerp) for rotations. The returned state includes root position, root rotation, DOF positions, root velocity, root angular velocity, DOF velocities, and key body positions -- all converted to GPU tensors on the configured device. For selecting reference frames during training, sample_motions() draws clip indices from the normalized weight distribution, and sample_time() draws uniform random time points within each selected clip's duration.
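The frame-blend scheme described above can be sketched in numpy. This is a simplified stand-alone version of the logic, not the library's exact code:

```python
import numpy as np

def calc_frame_blend(times, motion_len, num_frames, dt):
    # Clamp the phase to [0, 1], then split each query time into a pair
    # of adjacent frame indices plus a fractional blend factor.
    phase = np.clip(times / motion_len, 0.0, 1.0)
    frame_idx0 = (phase * (num_frames - 1)).astype(int)
    frame_idx1 = np.minimum(frame_idx0 + 1, num_frames - 1)
    blend = (times - frame_idx0 * dt) / dt
    return frame_idx0, frame_idx1, blend

# Example: a 2 s clip at 30 FPS (61 frames, dt = 1/30 s), queried at t = 0.05 s
dt = 1.0 / 30.0
f0, f1, b = calc_frame_blend(np.array([0.05]), 2.0, 61, dt)

# Linear blend for positions and velocities; rotations would use slerp
# on the two frames' quaternions instead.
pos0, pos1 = np.zeros(3), np.ones(3)
pos = (1.0 - b[:, None]) * pos0 + b[:, None] * pos1
```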
Usage
Use MotionLib in AMP humanoid tasks to provide reference motion data for the discriminator's demonstration buffer and for initializing environments with poses from motion capture clips. It is instantiated by HumanoidAMP and its variants during task setup.
Code Reference
Source Location
- Repository: IsaacGymEnvs
- File: isaacgymenvs/tasks/amp/utils_amp/motion_lib.py
- Lines: 1-323
Signature
class MotionLib:
    def __init__(self, motion_file, num_dofs, key_body_ids, device):
        """Load motion clips from file, compute metadata and DOF velocities."""
    def num_motions(self) -> int:
        """Return the number of loaded motion clips."""
    def get_total_length(self) -> float:
        """Return sum of all motion clip durations."""
    def get_motion(self, motion_id):
        """Return the raw SkeletonMotion for a given clip index."""
    def sample_motions(self, n) -> np.ndarray:
        """Randomly sample n motion IDs from the normalized weight distribution."""
    def sample_time(self, motion_ids, truncate_time=None) -> np.ndarray:
        """Sample random time points within each motion's duration."""
    def get_motion_length(self, motion_ids) -> np.ndarray:
        """Return duration of specified motion clips."""
    def get_motion_state(self, motion_ids, motion_times):
        """Interpolate full kinematic state at arbitrary time points.
        Returns: root_pos, root_rot, dof_pos, root_vel, root_ang_vel, dof_vel, key_pos"""
    def _load_motions(self, motion_file):
        """Load motion clips from YAML spec, compute fps/dt/num_frames/dof_vels."""
    def _calc_frame_blend(self, motion_times, motion_len, num_frames, dt):
        """Compute frame indices and blend factor for time-based interpolation."""
Import
from isaacgymenvs.tasks.amp.utils_amp.motion_lib import MotionLib
I/O Contract
Inputs
| Name | Type | Required | Description |
|---|---|---|---|
| motion_file | str | Yes | Path to YAML file listing motion clip paths and optional weights |
| num_dofs | int | Yes | Number of DOFs in the character model (e.g., 28 for AMP humanoid) |
| key_body_ids | Tensor | Yes | Indices of key bodies (hands, feet) for tracking positions |
| device | str | Yes | Torch device for tensor allocation (e.g., "cuda:0") |
| motion_ids | np.ndarray | Yes | Array of motion clip indices to query (argument to get_motion_state() and sample_time()) |
| motion_times | np.ndarray | Yes | Array of time values in seconds within each clip (argument to get_motion_state()) |
| truncate_time | float | No | Optional time subtracted from each clip's duration when sampling in sample_time() |
Outputs
| Name | Type | Description |
|---|---|---|
| root_pos | Tensor(n, 3) | Interpolated root positions in world space |
| root_rot | Tensor(n, 4) | Interpolated root rotations as quaternions (x, y, z, w) |
| dof_pos | Tensor(n, num_dofs) | Interpolated DOF positions (joint angles) |
| root_vel | Tensor(n, 3) | Root linear velocities |
| root_ang_vel | Tensor(n, 3) | Root angular velocities |
| dof_vel | Tensor(n, num_dofs) | DOF velocities (joint angular velocities) |
| key_pos | Tensor(n, num_key_bodies, 3) | Interpolated positions of key bodies (hands, feet) |
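The root-state outputs above map naturally onto Isaac Gym's per-actor root state layout of position (3), quaternion (4, xyzw), linear velocity (3), and angular velocity (3). A minimal sketch of assembling that 13-dimensional buffer, using numpy stand-ins for the returned tensors:

```python
import numpy as np

# Stand-ins for get_motion_state() outputs for n sampled frames
# (in practice these are torch tensors on the configured device).
n = 4
root_pos = np.zeros((n, 3))
root_rot = np.tile(np.array([0.0, 0.0, 0.0, 1.0]), (n, 1))  # identity quaternions
root_vel = np.zeros((n, 3))
root_ang_vel = np.zeros((n, 3))

# Per-actor root state: [pos(3), rot(4, xyzw), lin vel(3), ang vel(3)]
root_states = np.concatenate(
    [root_pos, root_rot, root_vel, root_ang_vel], axis=1
)
```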
Usage Examples
import numpy as np
import torch
from isaacgymenvs.tasks.amp.utils_amp.motion_lib import MotionLib

# Initialize the motion library
device = "cuda:0"
num_dofs = 28
key_body_ids = torch.tensor([3, 7, 11, 14], device=device)  # hands and feet
motion_lib = MotionLib(
    motion_file="assets/amp/motions/humanoid_motions.yaml",
    num_dofs=num_dofs,
    key_body_ids=key_body_ids,
    device=device,
)

print(f"Loaded {motion_lib.num_motions()} motions")
print(f"Total duration: {motion_lib.get_total_length():.1f}s")

# Sample random motion clips and time points
num_samples = 512
motion_ids = motion_lib.sample_motions(num_samples)
motion_times = motion_lib.sample_time(motion_ids)

# Get interpolated reference state
root_pos, root_rot, dof_pos, root_vel, root_ang_vel, dof_vel, key_pos = \
    motion_lib.get_motion_state(motion_ids, motion_times)

# Use for AMP demo observations or environment reset
# root_pos.shape: (512, 3)
# dof_pos.shape: (512, 28)
# key_pos.shape: (512, 4, 3)
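For the discriminator's demonstration buffer, the sampled state is flattened into one observation vector per frame. The sketch below shows only a naive concatenation with numpy stand-ins; the actual AMP observation additionally normalizes the pose into the character's local, heading-aligned frame:

```python
import numpy as np

# Stand-ins matching the shapes from the example above; real code would
# use the torch tensors returned by get_motion_state().
n, num_dofs, num_keys = 512, 28, 4
root_pos = np.zeros((n, 3))
root_rot = np.zeros((n, 4))
dof_pos = np.zeros((n, num_dofs))
root_vel = np.zeros((n, 3))
root_ang_vel = np.zeros((n, 3))
dof_vel = np.zeros((n, num_dofs))
key_pos = np.zeros((n, num_keys, 3))

# Naive flattening of the per-frame reference state into one vector
# per sample for the demonstration buffer.
demo_obs = np.concatenate(
    [root_pos, root_rot, root_vel, root_ang_vel,
     dof_pos, dof_vel, key_pos.reshape(n, -1)],
    axis=1,
)
```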