Cobot Magic: A Mobile ALOHA System Built on the AgileX Robotics Platform

Introduction

AgileX Cobot Magic is a Mobile ALOHA-based system that can remotely control the Tracer mobile base and its robotic arms at the same time.

(Video: watering flowers)

Story

Mobile ALOHA is a whole-body teleoperation data collection system developed by Zipeng Fu, Tony Z. Zhao, and Chelsea Finn at Stanford University. Its hardware consists of two ViperX 300 robotic arms, equipped with two wrist cameras and one top camera, mounted on a Tracer differential-drive mobile base from AgileX Robotics. Data collected with Mobile ALOHA, combined with supervised behavior cloning and joint training with existing static ALOHA datasets, improves performance on mobile manipulation tasks: with 50 demonstrations per task, joint training can increase success rates by up to 90%, and Mobile ALOHA can autonomously perform complex mobile manipulation tasks such as cooking and opening doors. Special thanks to the Stanford research team of Zipeng Fu, Tony Z. Zhao, and Chelsea Finn for their research on Mobile ALOHA and for releasing it fully open source. For more details about this project, please check the link.

Based on Mobile ALOHA, AgileX developed Cobot Magic, which runs the complete Mobile ALOHA code with a higher hardware configuration at a lower cost, and is equipped with larger-payload robotic arms and a high-compute industrial computer. For more details about Cobot Magic, please check the AgileX website.

AgileX Cobot Magic is a Mobile ALOHA-based system that can remotely control the Tracer mobile base and its robotic arms at the same time. It is equipped with an indoor differential-drive AGV base, high-performance robotic arms, an industrial-grade computer, and other components. AgileX Cobot Magic helps users make better use of open-source hardware and the Mobile ALOHA deep learning framework for robotics. It covers a wide range of tasks, from simple pick-and-place operations to more intricate actions such as pouring, cooking, riding elevators, and organizing items.

AgileX has now completed the integration of Cobot Magic with the Mobile ALOHA source code. The project covers the entire pipeline of data collection, data replay, data visualization, demonstration mode, model training, inference, and so on. This post introduces AgileX Cobot Magic and will provide ongoing updates on the training progress of mobile manipulation tasks.

Hardware configuration

Here is the list of hardware in AgileX Cobot Magic.

| Component | Item Name | Model |
| --- | --- | --- |
| Standard Configuration | Wheeled Mobile Robot | AgileX Tracer |
| | Depth Camera x3 | Orbbec Dabai |
| | USB Hub | 12V power supply, 7-port |
| | 6-DOF Lightweight Robot Arm x4 | Customized by AgileX |
| | Adjustable Velcro x2 | Customized by AgileX |
| | Grip Tape x2 | Customized by AgileX |
| | Power Strip | 4 outlets, 1.8 m |
| | Mobile Power Station | 1000 W |
| | ALOHA Stand | Customized by AgileX |
| Optional Configuration | Nano Development Kit | Jetson Orin Nano Developer Kit (8 GB) |
| | Industrial PC | APQ-X7010 / GPU 4060 / i7-9700 / 32 GB / 4 TB |
| | IMU | CH110 |
| | Display | 11.6" 1080p |

Note: An IPC is required. Users have two options: the Nano Development Kit or the APQ-X7010 IPC.

Software configuration

Local computer:

Ubuntu 20.04, CUDA 11.3.
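
Before creating the environment, you can optionally confirm the local CUDA setup (assuming the NVIDIA driver and the CUDA 11.3 toolkit are already installed):

# Check the driver and the highest CUDA version it supports
nvidia-smi
# Check the installed CUDA toolkit version (expected: 11.3)
nvcc --version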

Environment configuration:

# 1. Create a python virtual environment
conda create -n aloha python=3.8

# 2. Activate the environment
conda activate aloha

# 3. Install torch built for CUDA 11.3
pip install torch==1.11.0+cu113 torchvision==0.12.0+cu113 torchaudio==0.11.0 --extra-index-url https://download.pytorch.org/whl/cu113

# 4. Get the act-plus-plus code
git clone https://github.com/agilexrobotics/act-plus-plus.git
cd act-plus-plus

# 4.1 Install the other dependencies
pip install -r requirements.txt

# 4.2 Install detr
cd detr && pip install -v -e .
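
A quick sanity check of the environment (this only confirms that the CUDA build of torch was installed and that it can see a GPU):

python -c "import torch; print(torch.__version__, torch.cuda.is_available())"
# Expected output: 1.11.0+cu113 True (assuming an NVIDIA GPU is available)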

Simulated environment datasets

You can find all scripted/human demos for the simulated environments here.

After downloading, copy the archives to the act-plus-plus/data directory. The directory structure is as follows:

act-plus-plus/data
    ├── sim_insertion_human
    │   ├── sim_insertion_human-20240110T054847Z-001.zip
    │   ├── ...
    ├── sim_insertion_scripted
    │   ├── sim_insertion_scripted-20240110T054854Z-001.zip
    │   ├── ...
    ├── sim_transfer_cube_human
    │   ├── sim_transfer_cube_human-20240110T054900Z-001.zip
    │   ├── ...
    └── sim_transfer_cube_scripted
        ├── sim_transfer_cube_scripted-20240110T054901Z-001.zip
        ├── ...
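
To unpack the archives in place, here is a minimal sketch (assuming each zip extracts its episode files directly into its task folder; adjust the paths to your download location):

cd act-plus-plus/data
for task in sim_insertion_human sim_insertion_scripted sim_transfer_cube_human sim_transfer_cube_scripted; do
    # Unpack every archive belonging to this task into its own folder
    (cd "$task" && unzip -o "$task"-*.zip)
done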

Demonstration

By now it is widely accepted that learning a task from scratch, i.e., without any prior knowledge, is a daunting undertaking. Humans, however, rarely attempt to learn from scratch: they extract initial biases as well as strategies for approaching a learning problem from instructions and/or demonstrations given by other humans. This is what we call 'programming by demonstration' or 'imitation learning'.

A demonstration dataset usually contains decision trajectories {T_1, T_2, …, T_m}. Each trajectory T_i is a sequence of states and actions:

$$T_i = \langle s_1^i, a_1^i,\ s_2^i, a_2^i,\ \ldots,\ s_{n_i}^i, a_{n_i}^i \rangle$$

All state-action pairs are then extracted from these trajectories to build a new set:

$$D = \{\, (s, a)\ :\ (s, a) \in T_i,\ i = 1, \ldots, m \,\}$$
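
As a general illustration of how this set is used (the standard behavior-cloning formulation, not necessarily the exact objective used by Mobile ALOHA), a policy π_θ is then trained by supervised learning on D:

$$\min_{\theta}\ \mathbb{E}_{(s,a)\sim D}\big[\,\mathcal{L}(\pi_{\theta}(s),\,a)\,\big], \qquad \text{e.g. } \mathcal{L}(\pi_{\theta}(s),a)=\lVert \pi_{\theta}(s)-a\rVert_{1}$$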

Based on AgileX Cobot Magic, we can currently achieve multiple whole-body action tasks.

Below are demonstrations of different action tasks collected with AgileX Cobot Magic.

Watering flowers

(Video: watering flowers)

Opening a box

(Video: opening a box)

Pouring rice

(Video: pouring rice)

Twisting a bottle cap

(Video: twisting a bottle cap)

Throwing away rubbish

(Video: throwing away rubbish)

Using AgileX Cobot Magic, users can flexibly complete a variety of everyday action tasks by controlling the teaching robot arms, from simple pick-and-place skills to more sophisticated ones such as twisting bottle caps. The mobile chassis opens up more possibilities for the robotic arms, which are no longer restricted to performing actions in a fixed place. The 14 + 2 DOFs provide limitless potential for collecting diverse data.

Data Presentation

Below we display the data collected from one demonstration with the AgileX Cobot Magic arms. The recording contains the positions of the 14 arm joints at different time steps.
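
A minimal sketch for inspecting such a recording, assuming episodes are saved as HDF5 files and h5py is installed in the aloha environment (the file name episode_0.hdf5 is only an example):

python -c "
import h5py
# Walk the file and print every group/dataset stored in the episode
with h5py.File('episode_0.hdf5', 'r') as f:
    f.visit(print)
"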

Summary

Cobot Magic is a whole-body remote data collection device developed by AgileX Robotics based on the Mobile ALOHA project from Stanford University. With Cobot Magic, AgileX Robotics has successfully brought up the Stanford laboratory's open-source Mobile ALOHA code on its own platform. Thanks to the Tracer mobile base, data collection is no longer limited to desktops or specific surfaces, which enhances the richness and diversity of the collected data.

AgileX will continue to collect data from various motion tasks with Cobot Magic for model training and inference. Please stay tuned for updates on GitHub.

About AgileX

Established in 2016, AgileX Robotics is a leading manufacturer of mobile robot platforms and a provider of unmanned system solutions. The company specializes in independently developed multi-mode wheeled and tracked wire-controlled chassis technology and has obtained multiple international certifications. AgileX Robotics offers users self-developed innovative application solutions such as autonomous driving, mobile grasping, and navigation positioning, helping users in various industries achieve automation. Additionally, AgileX Robotics has introduced research and education software and hardware products related to machine learning, embodied intelligence, and visual algorithms. The company works closely with research and educational institutions to promote robotics technology teaching and innovation.

Appendix

ros_astra_camera configuration

Driver repositories: ros_astra_camera (GitHub) | ros_astra_camera (Gitee)

Camera Parameters

| Name | Parameters |
| --- | --- |
| Baseline | 40 mm |
| Depth distance | 0.3-3 m |
| Depth map resolution | 640x400 @ 30 fps, 320x200 @ 30 fps |
| Color image resolution | 1920x1080 @ 30 fps, 1280x720 @ 30 fps, 640x480 @ 30 fps |
| Accuracy | 6 mm @ 1 m (81% of the FOV area used in the accuracy calculation) |
| Depth FOV | H 67.9°, V 45.3° |
| Color FOV | H 71°, V 43.7° @ 1920x1080 |
| Delay | 30-45 ms |
| Data transmission | USB 2.0 or above |
| Working temperature | 10°C-40°C |
| Size | Length 59.5 mm x Width 17.4 mm x Thickness 11.1 mm |

1. OrbbecSDK_ROS1 driver installation
# 1 Install dependencies
sudo apt install libgflags-dev  ros-$ROS_DISTRO-image-geometry ros-$ROS_DISTRO-camera-info-manager ros-$ROS_DISTRO-image-transport ros-$ROS_DISTRO-image-publisher ros-$ROS_DISTRO-libuvc-ros libgoogle-glog-dev libusb-1.0-0-dev libeigen3-dev 

# 2 Download the code
## 2.1 GitHub
git clone https://github.com/orbbec/OrbbecSDK_ROS1.git astra_ws/src
## 2.2 Gitee (for users in mainland China)
git clone https://gitee.com/orbbecdeveloper/OrbbecSDK_ROS1 -b v1.4.6 astra_ws/src

# 3 Compile orbbec_camera
## 3.1 Enter astra_ws workspace
cd astra_ws
## 3.2 Compile orbbec_camera
catkin_make

# 4 Install udev rules.
source devel/setup.bash && rospack list
roscd orbbec_camera/scripts
sudo cp 99-obsensor-libusb.rules /etc/udev/rules.d/99-obsensor-libusb.rules
sudo udevadm control --reload && sudo  udevadm trigger
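
## (Optional) check that the camera is detected over USB after reloading the rules.
## 2bc5 is Orbbec's USB vendor ID as used in the udev rules above; treat the exact ID as an assumption for your unit.
lsusb | grep -i 2bc5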

# 5 Add the ros_astra_camera package environment variables
## 5.1 Enter astra_ws
cd astra_ws
## 5.2 Add the environment variable
echo "source $(pwd)/devel/setup.bash" >> ~/.bashrc
## 5.3 Make the environment variable take effect
source ~/.bashrc

# 6 Launch
## If step 5 was skipped, run the command in 6.2 in every new terminal so the ROS workspace environment takes effect.
## 6.1 Enter astra_ws
cd astra_ws
## 6.2 Source the workspace
source devel/setup.bash
## 6.3 Launch astra.launch
roslaunch orbbec_camera astra.launch
## 6.4 Launch dabai.launch
roslaunch orbbec_camera dabai.launch
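
## 6.5 (Optional) confirm the camera is publishing images.
## The topic name below is an assumption based on the default camera_name "camera"; adjust it if yours differs.
rostopic hz /camera/color/image_raw
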
2. Configure multiple orbbec_camera camera nodes

① Check the device serial number

● After installing the camera driver, run the following command:

rosrun orbbec_camera list_devices_node | grep -i serial

● Example output in the terminal:

[ INFO] [1709728787.207920484]: serial: AU1P32201SA
# Please record this serial number. Each camera has a unique serial number.

② Configure multiple camera nodes

● cobot_magic uses three Orbbec Dabai cameras, so each camera node must be configured with the serial number of the corresponding camera.

● Plug the USB cables of the three cameras into the industrial PC and run the command from step ① to read the serial numbers of all three cameras.

● To make it clear which topics correspond to which camera in later development, fill in the serial numbers in a fixed order.

● Create the multi_dabai.launch file in the astra_ws/src/launch directory with the following content:

<!-- Mainly modify: (1) the camera name prefixes and (2) the serial numbers -->
<launch>
    <arg name="camera_name" default="camera"/>
    <arg name="3d_sensor" default="dabai"/>

    <!-- 1 Camera name prefixes -->
    <arg name="camera1_prefix" default="01"/>
    <arg name="camera2_prefix" default="02"/>
    <arg name="camera3_prefix" default="03"/>

    <!-- 2 Replace each placeholder with the serial number recorded in step ① -->
    <arg name="camera1_usb_port" default="CAMERA1_SERIAL_NUMBER"/>
    <arg name="camera2_usb_port" default="CAMERA2_SERIAL_NUMBER"/>
    <arg name="camera3_usb_port" default="CAMERA3_SERIAL_NUMBER"/>
 
    <arg name="device_num" default="3"/>
    <include file="$(find orbbec_camera)/launch/$(arg 3d_sensor).launch">
        <arg name="camera_name" value="$(arg camera_name)_$(arg camera1_prefix)"/>
        <arg name="usb_port" value="$(arg camera1_usb_port)"/>
        <arg name="device_num" value="$(arg device_num)"/>
    </include>
 
    <include file="$(find orbbec_camera)/launch/$(arg 3d_sensor).launch">
        <arg name="camera_name" value="$(arg camera_name)_$(arg camera2_prefix)"/>
        <arg name="usb_port" value="$(arg camera2_usb_port)"/>
        <arg name="device_num" value="$(arg device_num)"/>
    </include>
    
    <include file="$(find orbbec_camera)/launch/$(arg 3d_sensor).launch">
        <arg name="camera_name" value="$(arg camera_name)_$(arg camera3_prefix)"/>
        <arg name="usb_port" value="$(arg camera3_usb_port)"/>
        <arg name="device_num" value="$(arg device_num)"/>
    </include>
</launch>

● Add permissions

# 1 Enter orbbec_camera/launch/
roscd orbbec_camera/launch/

# 2 Make multi_dabai.launch executable
chmod +x multi_dabai.launch

● Launch the three camera nodes

roslaunch orbbec_camera multi_dabai.launch
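
● Verify that all three cameras are publishing (the topic names below are an assumption based on the camera_name prefixes configured in multi_dabai.launch):

rostopic list | grep camera_0
# Expect color and depth topics for each camera, e.g. /camera_01/color/image_raw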
