
Immersive control of a quadruped robot with Virtual Reality Eye-wear

This work describes an immersive control system for the Spot robot by Boston Dynamics, designed to track the head movements of an operator wearing a Meta Quest 2 headset while using the touch controllers' thumbstick commands for locomotion control.

Authors:

©2024 RICE Lab - DIBRIS, University of Genova

Package Description

This repository provides a modified version of the software available at zed-oculus. Since this work requires the IMU and touch input data, main.cpp is modified so that it reads the angular velocities and the touch input data through the ts.HeadPose.AngularVelocity, InputState.Thumbstick[ovrHand_Right], and InputState.Thumbstick[ovrHand_Left] attributes, and sends them to the process executed by main.py. Additionally, since the ZED camera is not connected to the user PC with a USB cable, main.cpp is also modified to open the ZED camera from the socket input, by changing the init_parameters values passed to zed.open(init_parameters), following the method HERE.

Furthermore, a scripts folder is added to this package, which contains the software developed for the head-tracking task and for locomotion with the thumbsticks. The script files are described as follows:

Script             Description
main.py            Executed by main.cpp. Uses the SpotInterface and Controller class methods for the head-tracking and locomotion tasks.
controller.py      Provides a simple closed-loop controller based on the simple-pid Python module through the method get_hmd_controls(setpoints). Additionally, it computes the locomotion control signal from the touch input reference signals through the method get_touch_controls(setpoints).
spot_interface.py  Initializes the Lease, eStop, Power, RobotState, and RobotCommand clients. Provides set_controls(controls, dt) for sending the control signals to the robot and get_body_vel() for receiving the robot's angular velocities.
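
As an illustration of the closed loop described above, the sketch below tracks per-axis angular-velocity setpoints with the simple-pid module. The class name, gains, output limits, axis ordering, and the explicit body_vel argument are assumptions made for the example, not the repository's exact code:

    from simple_pid import PID

    class HeadTrackingSketch:
        """One PID loop per body axis (roll, pitch, yaw)."""

        def __init__(self):
            # placeholder gains; the real controller tunes these for Spot
            self.pids = [PID(1.0, 0.1, 0.05, output_limits=(-1.0, 1.0))
                         for _ in range(3)]

        def get_hmd_controls(self, setpoints, body_vel):
            # setpoints: HMD angular velocities received from main.cpp
            # body_vel:  robot angular velocities from get_body_vel()
            controls = []
            for pid, sp, fb in zip(self.pids, setpoints, body_vel):
                pid.setpoint = sp          # follow the operator's head motion
                controls.append(pid(fb))   # PID output from the feedback term
            return controls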

System Architecture

The system architecture for this work is shown below:

A Wi-Fi bridge can be implemented on the Raspberry Pi board using this tutorial. Once it is ready, the components of the local network can be configured with the following IP addresses:

Component               IP Address
Raspberry Pi 4 Model B  192.168.220.1 (Server - Ethernet and Wireless)
Jetson Nano             192.168.220.50 (Client - Ethernet)
OMEN PC                 10.42.0.210 (Client - Wireless)
Spot Robot              10.42.0.211 (Client - Wireless)

Interprocess Communication

For the purpose of this work, the IMU data generated by the HMD is transmitted to the main.py control process using a named pipe. Moreover, the stereo camera images are transmitted from the robot to the HMD using a socket, following the method of this tutorial. Sending the control signals and receiving the robot state data is done using gRPC, which the Spot SDK clients wrap.
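
A minimal sketch of the receiving end of that named pipe is given below. The pipe name and the line-oriented, comma-separated framing are hypothetical, chosen only for illustration; the actual names and message format are defined by main.cpp:

    # A Windows named pipe can be opened like a file once main.cpp has
    # created it; PIPE_PATH and the framing below are hypothetical.
    PIPE_PATH = r"\\.\pipe\hmd_data"

    def parse_setpoints(line):
        # assumed framing: "wx,wy,wz,left_x,left_y,right_x,right_y"
        return [float(v) for v in line.strip().split(",")]

    with open(PIPE_PATH, "r") as pipe:
        while True:
            line = pipe.readline()
            if not line:              # writer closed the pipe
                break
            setpoints = parse_setpoints(line)
            # hand the setpoints to the controller and spot_interface here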

Dependencies

Build

Download the sample and follow the instructions below:

  1. Create a folder called "build" in the source folder
  2. Open cmake-gui and select the source and build folders
  3. Generate the Visual Studio Win64 solution
  4. Open the resulting solution and change the configuration to Release. You may have to modify the paths of the dependencies to match your configuration
  5. Build the solution
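
Equivalently, the solution can be generated and built from a command prompt; the generator string below assumes a Visual Studio 2019 x64 toolchain, so adjust it to your installed version:

    mkdir build
    cd build
    cmake -G "Visual Studio 16 2019" -A x64 ..
    cmake --build . --config Release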

Usage

  1. On the robot side (Linux/Jetson), build and run the streaming sender using the method shown HERE.
  2. On the user side (Windows), run the "ZED_Stereo_Passthrough.exe" in a terminal as follows:
    ./ZED_Streaming_Receiver <ip:port>
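
For example, with the sender running on the Jetson Nano at the address listed above and streaming on port 30000 (the ZED SDK's default streaming port; substitute whatever port the sender is configured with):

    ./ZED_Streaming_Receiver 192.168.220.50:30000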

Once it is executed, the stereo passthrough from the ZED camera to the Oculus starts. Moreover, it automatically runs the main.py script, which provides the control system and the interface with the robot.

System hypothesis and future work

For future work, we aim to address the limitations of the current study. This includes implementing a more efficient communication system to control the robot from greater distances, and ensuring a comparable field of view between the HMD and the tablet-based controller.
