Quickstart: Before You Begin

Read this first. This page tells you exactly what hardware you need, what software to install, how long everything takes, and what you will be able to do when you finish the full path.

How Long Does This Take?

From box to recording your first training episode.

  • Hardware setup (~2h): unboxing, mounting, CAN wiring, safety check
  • Software install (~1.5h): Ubuntu, ROS2, SDK, CAN drivers
  • First motion (~30m): fake hardware mode, then real hardware bringup
  • Calibration (~45m): joint homing, workspace verification
  • Teleoperation (~1h): operator device setup, first recording session
  • First dataset (~1h): LeRobot format, quality checks, upload

Total first-day time: roughly 6–7 hours for a complete first session. Spreading the work across two days is comfortable for beginners.

Hardware Checklist

Everything that needs to be on your desk before you start. Items marked "in the box" ship with OpenArm 101.

  • OpenArm 101 arm unit In the box — 8-DOF arm, ~3 kg, aluminum & carbon fiber. Buy from store →
  • 24V DC power supply (150W min) In the box — included with OpenArm 101.
  • CAN USB adapter (CANable 2.0 or equivalent) In the box — connects your host PC to the arm's CAN bus.
  • Mounting plate + hardware In the box — M6 bolts, base plate. You need a stable surface or workbench to bolt to.
  • Host PC running Ubuntu 22.04 Not included — must be a native Ubuntu install (not VM or WSL) for reliable SocketCAN timing. Minimum: 8 GB RAM, 50 GB disk. Download Ubuntu →
  • USB-A to USB-B cable (for CAN adapter) Not included — standard USB cable, ~1 m. Most electronics retailers.
  • Camera (for data collection) Optional for setup, required for training data — USB webcam or RealSense D435i recommended. View accessories →
  • 1 m × 1 m clear workspace around arm base Required — the arm's workspace envelope is 650 mm radius. Clear the area of people and obstacles before powering on. See Safety page →
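As a sanity check on the workspace item above, the 650 mm reach figure can be used to verify that a planned target point stays inside the arm's envelope. A minimal sketch in plain Python — the function name and 50 mm safety margin are ours, not part of the SDK:

```python
import math

ARM_REACH_MM = 650  # workspace envelope radius from the checklist above

def within_workspace(x_mm: float, y_mm: float, z_mm: float,
                     margin_mm: float = 50) -> bool:
    """True if a target point (relative to the arm base) lies inside
    the reach envelope, minus a conservative safety margin."""
    return math.sqrt(x_mm**2 + y_mm**2 + z_mm**2) <= ARM_REACH_MM - margin_mm

print(within_workspace(300, 200, 100))  # well inside the envelope
print(within_workspace(700, 0, 0))      # beyond the 650 mm reach
```

A check like this is no substitute for the Safety page, but it catches obviously out-of-reach targets before you command a motion.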

What to Install Before You Start

The Setup Guide walks through each step in detail. This is the summary so you can prepare in advance.

Operating System

Ubuntu 22.04 LTS (Jammy). Ubuntu 20.04 also works, but 22.04 is recommended. macOS and Windows are not supported for real hardware because SocketCAN requires a Linux kernel. You must use a native install, not Docker or WSL2, for reliable CAN timing.
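If you are not sure whether an inherited machine is a native install, one common heuristic is that WSL kernels include "microsoft" in /proc/version. An illustrative check — the helper names are ours, not part of any OpenArm tooling:

```python
from pathlib import Path

def looks_like_wsl(version_text: str) -> bool:
    """WSL1/WSL2 kernels identify themselves in the kernel version string."""
    return "microsoft" in version_text.lower()

def running_under_wsl() -> bool:
    """Read /proc/version on Linux; absent on non-Linux hosts."""
    proc = Path("/proc/version")
    return proc.exists() and looks_like_wsl(proc.read_text())

if running_under_wsl():
    print("WSL detected: SocketCAN timing will not be reliable here.")
```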

Python

Python 3.10 (bundled with Ubuntu 22.04). The SDK targets Python 3.10+. Install pip and venv before starting:

sudo apt update
sudo apt install python3-pip python3-venv -y
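Before installing the SDK, you can confirm the interpreter meets the 3.10+ requirement with a quick check along these lines (illustrative, not part of the SDK):

```python
import sys

def python_ok(version_info=sys.version_info) -> bool:
    """The OpenArm SDK targets Python 3.10+."""
    return tuple(version_info[:2]) >= (3, 10)

if not python_ok():
    print("Warning: Python 3.10+ is required for the SDK; found",
          sys.version.split()[0])
```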

ROS2

ROS2 Humble Hawksbill (Ubuntu 22.04) or ROS2 Iron Irwini (Ubuntu 22.04/23.04). Humble is the LTS version and is recommended for stability. Full installation takes ~15 minutes:

# Add ROS2 apt repo (abbreviated — full steps in setup guide)
sudo apt install ros-humble-desktop ros-humble-ros2-control \
  ros-humble-ros2-controllers -y
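After installing, sourcing `/opt/ros/humble/setup.bash` sets the `ROS_DISTRO` environment variable, which gives a quick way to confirm the environment is active. A small illustrative check (the function name is ours):

```python
import os

def ros2_distro(env=os.environ):
    """Return the ROS distro name if a ROS2 environment is sourced, else None."""
    return env.get("ROS_DISTRO")

distro = ros2_distro()
if distro != "humble":
    print(f"Expected 'humble', got {distro!r}; "
          "run: source /opt/ros/humble/setup.bash")
```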

SocketCAN Drivers

Built into the Linux kernel. You only need to load the kernel modules and configure your CAN interface:

sudo modprobe can
sudo modprobe can_raw
sudo modprobe slcan   # for serial-line (slcan) USB CAN adapters
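You can confirm the modules loaded by inspecting /proc/modules (what `lsmod` reads). A small illustrative helper that parses that listing — the function names are ours:

```python
from pathlib import Path

REQUIRED = ("can", "can_raw", "slcan")

def loaded_modules(text: str) -> set:
    """Parse /proc/modules content: the module name is the first field."""
    return {line.split()[0] for line in text.splitlines() if line.strip()}

def missing_can_modules(text: str) -> list:
    """Return any required CAN modules absent from the listing."""
    loaded = loaded_modules(text)
    return [m for m in REQUIRED if m not in loaded]

# On a configured machine:
# missing = missing_can_modules(Path("/proc/modules").read_text())
```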

OpenArm SDK

pip install roboticscenter

Installs openarm_can and all Python dependencies. See the Software page for full SDK setup and the SDK Quickstart in the wiki.

LeRobot (for data collection)

pip install lerobot

Required for recording and converting episodes to the LeRobot / HuggingFace dataset format. See the Data Collection page for the full workflow.
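The quality checks mentioned above typically include verifying that recorded timestamps increase monotonically and roughly match the target frame rate. A minimal illustrative check in plain Python — this is our sketch, not the LeRobot API:

```python
def timestamps_ok(stamps, fps: float, tol: float = 0.25) -> bool:
    """True if timestamps increase strictly and each gap is within
    tol (as a fraction) of the expected 1/fps frame period."""
    expected = 1.0 / fps
    for a, b in zip(stamps, stamps[1:]):
        gap = b - a
        if gap <= 0 or abs(gap - expected) > tol * expected:
            return False
    return True

print(timestamps_ok([0.0, 0.033, 0.066, 0.100], fps=30))  # a clean ~30 Hz episode
```

Running a check like this before upload catches dropped frames and clock glitches early, when re-recording an episode is still cheap.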

Simulation (optional — no hardware needed)

✓ Simulation available

You can run the entire software stack without hardware using ROS2 fake hardware mode:

ros2 launch openarm_ros2 openarm.launch.py \
  use_fake_hardware:=true

MuJoCo and Isaac Sim are also supported for physics simulation and synthetic data generation. See the Software → Simulation section for setup.

What You Can Do After the Full Path

After completing all 7 steps — unboxing through first training dataset — you will be able to:

  • Control all 8 joints in real time over SocketCAN using MIT position / velocity / torque modes
  • Run ROS2 Humble with the openarm_ros2 package for trajectory planning and state publishing
  • Teleop the arm with an operator device and record synchronized joint state + camera episodes
  • Export datasets in LeRobot / HuggingFace format, ready for ACT and Diffusion Policy training
  • Run the same policy in MuJoCo simulation before deploying to real hardware
  • Publish datasets to the SVRC dataset registry and train shared models

Ready? Start the Setup Guide.

Once you have your hardware and Ubuntu installed, the Setup Guide walks you through every step.