Data Collection

Recording dexterous manipulation data with the Orca Hand is fundamentally different from simple gripper data collection — 17 finger joints, optional tactile streams, and glove teleoperation all require careful synchronization. This guide covers the full workflow.

Before Recording

Hardware Setup for Dexterous Data Collection

The Orca Hand recording setup has more streams than a simple gripper arm — finger joints, optional tactile, cameras, and arm joints all need to be synchronized.

Orca Hand (USB serial)

17-DOF finger state at up to 100 Hz. Verify: python -c "from orca_core import OrcaHand; h=OrcaHand('/dev/ttyUSB0'); h.connect(); print(h.get_positions())"

Teleoperation Device

Juqiao Glove (recommended) or VR hand tracking. Maps operator finger poses to Orca Hand joint targets in real time.

Palm / Wrist Camera

Small USB camera inside the palm pointing at the fingertips and object. Critical for contact-rich tasks where fingertip-object contact determines grasp success.

Tactile Sensors (optional)

Paxini or compatible fingertip sensors. Separate USB connection. Add contact force stream to your dataset for contact-guided policy training.

Recording Workflow

Step-by-Step Recording Workflow

Step 1: Bring up arm + hand + glove

# Terminal 1: Arm (if using OpenArm)
source ~/openarm_ws/install/setup.bash
ros2 launch openarm_ros2 openarm.launch.py use_fake_hardware:=false can_interface:=can0

# Terminal 2: Orca Hand
source ~/orca_ws/install/setup.bash
ros2 launch orca_ros2 orca_hand.launch.py port:=/dev/ttyUSB0 handedness:=right

# Terminal 3: Juqiao Glove (see Juqiao Glove software page)
python -m juqiao_glove.stream --port /dev/ttyUSB1

Step 2: Verify all streams are live

# Check finger joint states
ros2 topic hz /orca_hand/joint_states   # expect ~100 Hz

# Check glove stream
ros2 topic hz /juqiao_glove/finger_angles   # expect ~100 Hz

# Check camera
python -c "import cv2; cap=cv2.VideoCapture(0); print('Camera OK:', cap.isOpened())"

Step 3: Calibrate glove-to-hand mapping

Run the glove calibration to map your finger dimensions to the Orca Hand's joint range. This step ensures natural teleoperation: your open hand maps to an open Orca hand, and your closed fist maps to a closed Orca fist.

python -m orca_core.scripts.calibrate_teleop \
  --hand_port /dev/ttyUSB0 \
  --glove_port /dev/ttyUSB1 \
  --handedness right
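
Conceptually, the calibration fits a per-joint mapping from your glove's observed range of motion to the corresponding Orca joint's range. A minimal linear-mapping sketch (the example ranges and function names are illustrative, not the actual orca_core calibration):

```python
def fit_linear_map(glove_min, glove_max, hand_min, hand_max):
    """Return a function mapping a raw glove angle to a hand joint target."""
    span = glove_max - glove_min

    def to_hand(angle):
        # Normalize into [0, 1], clamp so out-of-range glove readings
        # can't command the joint past its limits, then rescale.
        t = (angle - glove_min) / span
        t = min(max(t, 0.0), 1.0)
        return hand_min + t * (hand_max - hand_min)

    return to_hand

# Example: operator's index MCP sweeps 5-85 deg; Orca joint range is 0-90 deg.
index_mcp = fit_linear_map(5.0, 85.0, 0.0, 90.0)
print(index_mcp(45.0))   # mid-sweep maps to mid-range: 45.0
print(index_mcp(100.0))  # beyond the glove range clamps to 90.0
```

The clamp is the important part: without it, a glove sensor glitch would drive a joint past its hard limit mid-episode.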

Step 4: Set up the task scene

Place objects in consistent starting positions. Dexterous grasping is highly sensitive to object pose — use fixturing (tape, putty) to maintain consistent object placement across episodes.

Step 5: Start recording session

python -m orca_core.scripts.record_episodes \
  --hand_port /dev/ttyUSB0 \
  --glove_port /dev/ttyUSB1 \
  --camera_id 0 \
  --fps 30 \
  --output_dir ~/datasets/orca-grasp-v1 \
  --num_episodes 50 \
  --task "Pick up the pen using a precision pinch grasp"

The recorder saves synchronized finger joint positions, glove angles, camera frames, and (if connected) tactile readings to a Parquet dataset.
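
Synchronizing multi-rate streams like this typically reduces to nearest-timestamp matching: the joint stream runs near 100 Hz, the camera at 30 fps, and each camera frame gets the closest joint reading attached. A minimal sketch of that matching (illustrative, not the orca_core recorder's internals):

```python
import bisect

def nearest_sample(timestamps, query_t):
    """Index of the sample whose timestamp is closest to query_t.

    timestamps must be sorted ascending (true for a live sensor stream).
    """
    i = bisect.bisect_left(timestamps, query_t)
    if i == 0:
        return 0
    if i == len(timestamps):
        return len(timestamps) - 1
    # Pick whichever neighbor is closer in time.
    return i if timestamps[i] - query_t < query_t - timestamps[i - 1] else i - 1

# Joint states at 100 Hz; a camera frame stamped at t=0.033 s pairs with
# the joint sample at t=0.03 s.
joint_t = [0.00, 0.01, 0.02, 0.03, 0.04]
print(nearest_sample(joint_t, 0.033))  # -> 3
```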

Step 6: Review grasp quality

Replay each episode. Look for: slippage events (sudden position jumps), incomplete grasps, tendon saturation (joint at hard limit during grasp), inconsistent approach paths.

python -m orca_core.scripts.visualize_episode \
  --dataset_dir ~/datasets/orca-grasp-v1 \
  --episode_index 0
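
The slippage events mentioned above can also be screened for automatically: a joint that moves faster than the servos plausibly can between frames usually means the object slipped and the tendon snapped the joint to a new position. A pure-Python sketch (the 300 deg/s threshold and helper name are illustrative, not part of orca_core):

```python
def find_slips(positions, fps=30, max_speed_deg_s=300.0):
    """Return frame indices where any joint moved implausibly fast.

    positions: per-frame lists of joint angles in degrees (T x 17).
    """
    flagged = []
    for t in range(1, len(positions)):
        # Fastest joint's apparent speed between consecutive frames.
        speed = max(abs(a - b) for a, b in zip(positions[t], positions[t - 1])) * fps
        if speed > max_speed_deg_s:
            flagged.append(t)
    return flagged

# 5 frames x 17 joints, all still except joint 2 jumping 20 deg at frame 3
# (20 deg in one 33 ms frame is 600 deg/s -- far beyond normal motion):
traj = [[0.0] * 17 for _ in range(5)]
traj[3][2] = traj[4][2] = 20.0
print(find_slips(traj))  # -> [3]
```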

Step 7: Push to HuggingFace Hub

python -m orca_core.scripts.push_dataset \
  --dataset_dir ~/datasets/orca-grasp-v1 \
  --repo_id your-username/orca-grasp-v1

Dataset Format

Orca Hand Dataset Schema

The Orca Hand dataset format extends the standard LeRobot schema with additional finger-specific and tactile streams.

Fields in each episode Parquet file:

  observation.hand_state      float32[17]   All 17 finger joint positions in degrees
  observation.hand_velocity   float32[17]   Finger joint velocities in deg/s
  observation.tactile         float32[5]    Contact force per fingertip in N (if tactile sensors connected)
  observation.images.*        video path    Camera frames: palm view, wrist view, workspace view
  action                      float32[17]   Target finger joint positions from glove teleoperation
  grasp_type                  string        Label for the grasp primitive used (e.g., "precision_pinch", "power_grasp")
  timestamp                   float64       Unix timestamp in seconds
  next.done                   bool          True on the last frame of each episode
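
Before pushing, it is cheap to validate each frame against the fixed-width fields above. A hypothetical lightweight checker (SCHEMA and check_frame are illustrative names, not an official orca_core validator):

```python
# Expected lengths of the fixed-width array fields from the schema above.
SCHEMA = {
    "observation.hand_state": 17,
    "observation.hand_velocity": 17,
    "observation.tactile": 5,
    "action": 17,
}

def check_frame(frame):
    """Return a list of schema violations for one episode frame."""
    errors = []
    for field, length in SCHEMA.items():
        values = frame.get(field)
        if values is None:
            # Tactile is optional when no sensors are connected.
            if field != "observation.tactile":
                errors.append(f"missing field: {field}")
        elif len(values) != length:
            errors.append(f"{field}: expected {length} values, got {len(values)}")
    return errors

frame = {"observation.hand_state": [0.0] * 17,
         "observation.hand_velocity": [0.0] * 17,
         "action": [0.0] * 17}
print(check_frame(frame))  # -> []
```

A wrong-width action vector (e.g., a 16-element list from a dropped joint) shows up immediately instead of surfacing later as a training-time shape error.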
Quality Assurance

Dexterous Data Quality Checklist

Dexterous manipulation data has more failure modes than simple pick-and-place. Run through this before pushing to the Hub.

  1. All 17 joints have valid position readings throughout the episode. NaN or stuck values indicate a servo dropout; inspect observation.hand_state for constant values in any joint across all frames.
  2. Glove-to-hand latency is below 20 ms. Check timestamp alignment between action (from the glove) and observation.hand_state. High latency causes the policy to learn from causally inconsistent pairs.
  3. Grasp type is consistent within a task. A policy for precision pinch will not generalize if some demonstrations used power grasps. Keep each dataset to one primary grasp strategy, or label episodes by grasp type.
  4. Contact events are visible in joint positions. When the hand makes contact with an object, finger joints should show clear deceleration and compliance deflection. Episodes where fingers snap to position without contact deformation may have missed the grasp.
  5. The palm camera shows the object during the entire manipulation phase. The object must be visible in the palm camera from approach to release. Check that the camera mount was not bumped and that the FOV covers the task workspace.
  6. Tactile streams show contact events where expected. During a pinch grasp, both the thumb and index tactile sensors should show a force increase. Missing force signals during confirmed contact indicate a disconnected or miscalibrated sensor.
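
Checklist item 2, the latency budget, can be estimated directly from recorded timestamps: for each glove sample, find the first hand-state sample at or after it. A rough sketch (assumes both streams carry Unix timestamps in seconds; this is not an orca_core utility):

```python
import bisect

def mean_latency_ms(glove_ts, hand_ts):
    """Mean delay (ms) from each glove command to the first hand state
    observed at or after it. Both inputs are sorted ascending."""
    delays = []
    for t in glove_ts:
        i = bisect.bisect_left(hand_ts, t)
        if i < len(hand_ts):
            delays.append((hand_ts[i] - t) * 1000.0)
    return sum(delays) / len(delays)

# Three glove commands, each reflected in the hand state ~8 ms later:
glove = [0.000, 0.010, 0.020]
hand = [0.008, 0.018, 0.028]
print(round(mean_latency_ms(glove, hand), 1))  # -> 8.0, within the 20 ms budget
```

This only measures stream-to-stream timestamp offset; it does not capture mechanical lag in the tendons, so treat it as a lower bound on true teleoperation latency.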

Dataset Ready? Start Training.

Push your dexterous manipulation dataset to HuggingFace and explore compatible models.