Gesture-Based Quadruped Control

Description

In this project, I developed software that uses hand gestures to send motion commands to the Unitree Go1 robot dog.
The go1-gesture-command repository consists of two ROS2 packages, one written in Python and one in C++.


Packages


How It Works

Project Flowchart



Hand Gesture Recognition


Gestures Guide

  1. Open - stop
  2. Close - look forward (normal 0° yaw)
  3. Pointer - recover stand up
  4. OK - look up
  5. Peace - look down
  6. Thumbs Up - walk forward
  7. Thumbs Down - walk backward
  8. Quiet Coyote - lie down


I forked a repository from GitHub user Kinivi that includes a program and a TensorFlow model that use MediaPipe to detect and label hand gestures. In my ros2_hgr package, I restructured their code into a ROS2 Python package whose node publishes the detected gesture. I also added new gestures (gestures 4-7) and retrained the model on new data for both the existing gestures and the new ones.
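As a rough sketch of how a recognition node like this fits into ROS2, the example below publishes the current gesture label as an integer from a Python node. The topic name (`hgr_topic`), message type, and publishing rate are assumptions chosen for illustration, not necessarily what ros2_hgr uses.

```python
import rclpy
from rclpy.node import Node
from std_msgs.msg import Int32  # assumed message type for the gesture label


class GesturePublisher(Node):
    """Minimal sketch of a node that publishes the most recent gesture label."""

    def __init__(self):
        super().__init__('hgr_node')
        # Topic name is an assumption for illustration.
        self.pub = self.create_publisher(Int32, 'hgr_topic', 10)
        self.timer = self.create_timer(0.1, self.timer_callback)
        self.latest_gesture = -1  # -1 = no hand detected yet

    def timer_callback(self):
        # In the real package, latest_gesture would be set by the
        # MediaPipe/TensorFlow classifier running on camera frames.
        msg = Int32()
        msg.data = self.latest_gesture
        self.pub.publish(msg)


def main(args=None):
    rclpy.init(args=args)
    rclpy.spin(GesturePublisher())
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```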


MediaPipe Hand Landmarks

Here, MediaPipe works by detecting 21 landmark points across the hand. The model I use in this project takes the locations of these points and, based on their overall configuration, labels the gesture.
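As a rough example of that pipeline, the sketch below reads webcam frames, runs MediaPipe Hands to get the 21 landmarks, and flattens them into a feature vector that could be passed to a gesture classifier. The wrist-relative normalization here is a simplified stand-in for the preprocessing the forked key-point classifier expects.

```python
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands


def landmarks_to_features(hand_landmarks):
    """Flatten the 21 (x, y) landmarks relative to the wrist (landmark 0)."""
    base_x = hand_landmarks.landmark[0].x
    base_y = hand_landmarks.landmark[0].y
    features = []
    for lm in hand_landmarks.landmark:
        features.extend([lm.x - base_x, lm.y - base_y])
    return features  # length 42


cap = cv2.VideoCapture(0)
with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB images; OpenCV captures BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            features = landmarks_to_features(results.multi_hand_landmarks[0])
            # features would be fed to the trained gesture classifier here.
cap.release()
```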





Commanding the Go1

In another node, I receive the hand gesture labels and use them to send a variety of commands to the Go1, employing the unitree_ros2 and unitree_nav packages mentioned in the prerequisites section, as sketched below.
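The sketch below (written in Python for brevity, with assumed topic names and label indices) shows the general idea: subscribe to the gesture label and map the walking-related gestures to Twist commands on cmd_vel, while the posture-related gestures would instead trigger the Go1's pre-programmed modes through the unitree_ros2/unitree_nav interfaces, which are omitted here.

```python
import rclpy
from rclpy.node import Node
from std_msgs.msg import Int32
from geometry_msgs.msg import Twist


class GestureCommander(Node):
    """Sketch of a node that turns gesture labels into Go1 commands.

    Label numbers follow the gestures guide above; the actual indices and
    topic names used in the package may differ.
    """

    def __init__(self):
        super().__init__('gesture_commander')
        self.cmd_pub = self.create_publisher(Twist, 'cmd_vel', 10)
        self.sub = self.create_subscription(
            Int32, 'hgr_topic', self.gesture_callback, 10)

    def gesture_callback(self, msg):
        cmd = Twist()
        if msg.data == 6:      # Thumbs Up: walk forward
            cmd.linear.x = 0.2
        elif msg.data == 7:    # Thumbs Down: walk backward
            cmd.linear.x = -0.2
        elif msg.data == 1:    # Open: stop
            cmd.linear.x = 0.0
        else:
            # Other gestures (stand up, lie down, look up/down) would call
            # the Go1's pre-programmed modes rather than a velocity command.
            return
        self.cmd_pub.publish(cmd)


def main(args=None):
    rclpy.init(args=args)
    rclpy.spin(GestureCommander())
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```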
The following video shows some of the Go1's pre-programmed movements, which can also be triggered with the provided remote control.



About the Unitree Go1

The Unitree Go1 is a quadruped robot advertised for its high dynamics, intelligence, and companionship abilities.



Notes

A large part of getting the Go1 up and running with Ubuntu 22.04 and ROS2 Humble consisted of the disassembly and updates performed as a group with other students whose projects involved the Go1: Marno Nel, Nick Morales, and Katie Hughes.


go1-gesture-command GitHub Repository