Computer Vision Aided Robotic Arm Manipulation for Grasping Objects
Using OpenCV and the Intel RealSense D435i camera to help a PX100 robot arm grab a purple pen out of the air.
Link to this project’s GitHub repository
Introduction
The goal of this project is to enable the PincherX 100 robot arm to grab a pen held in front of it. The tools used to complete this task are OpenCV, NumPy, a Linux FIFO pipe, Python, an Intel RealSense D435i camera, and a PincherX 100 robot arm.
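Since pen_recognition.py and robot_control.py run as separate processes, the FIFO pipe is presumably what hands the pen’s position from the vision script to the control script. Here is a minimal sketch of that pattern; the pipe path and message format are illustrative assumptions, not necessarily what the repo actually uses:

```python
import os

FIFO_PATH = "/tmp/pen_position"  # hypothetical pipe location

# --- pen_recognition.py side (writer); runs in its own process ---
# Opening a FIFO blocks until the other end is opened too.
if not os.path.exists(FIFO_PATH):
    os.mkfifo(FIFO_PATH)
with open(FIFO_PATH, "w") as pipe:
    pipe.write("0.260,0.150,-0.060\n")  # example x,y,z of the pen, in meters
    pipe.flush()

# --- robot_control.py side (reader); runs in a separate process ---
with open(FIFO_PATH, "r") as pipe:
    x, y, z = map(float, pipe.readline().strip().split(","))
```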
How to Run
- Clone the repository.
- Plug your RealSense D435i camera into a USB3 port on your computer.
- Plug your PincherX 100 robot into any USB port on your computer.
- Place the RealSense and the PincherX 100 at a 90 degree angle to each other. Do your best to place the Pincher so that its end-effector sits around x = 260, y = 150, z = -60 (units: mm; the x axis points in and out of the camera, the y axis runs left and right across the image, and the z axis points in and out of the table). You’ll probably want to tune these values on line 16 of robot_control.py. Sorry for hardcoding! I’ll improve upon this eventually.
- In one terminal window, run:
$ ros2 launch interbotix_xsarm_control xsarm_control.launch.py robot_model:=px100
- In another terminal window, run:
$ python pen_recognition.py
- In another terminal, run:
$ python robot_control.py
- Hold the pen up in front of the camera, within the Pincher’s workspace.
Implementation
Computer Vision
First, I create a color mask using HSV values that isolates the color of the pen in my hand: purple. Next, I find the contours of the shape revealed by the mask and calculate their geometric centroid. Finally, I look up the value at the centroid’s position in the depth image generated by the RealSense camera to recover the location of the pen in 3D space.
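A minimal sketch of this pipeline is shown below, assuming the pyrealsense2 and OpenCV Python bindings. The HSV bounds for purple are illustrative and will need tuning for your pen and lighting:

```python
import cv2
import numpy as np
import pyrealsense2 as rs

# Illustrative HSV range for a purple pen; tune for your lighting.
PURPLE_LO, PURPLE_HI = (120, 80, 40), (160, 255, 255)

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.color, 640, 480, rs.format.bgr8, 30)
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
pipeline.start(config)
align = rs.align(rs.stream.color)  # align depth pixels to the color image

try:
    while True:
        frames = align.process(pipeline.wait_for_frames())
        color = np.asanyarray(frames.get_color_frame().get_data())
        depth_frame = frames.get_depth_frame()

        # 1. Color mask: keep only the purple pixels.
        hsv = cv2.cvtColor(color, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, PURPLE_LO, PURPLE_HI)

        # 2. Contours and centroid of the masked shape.
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            continue
        m = cv2.moments(max(contours, key=cv2.contourArea))
        if m["m00"] == 0:
            continue
        cx, cy = int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])

        # 3. Depth lookup at the centroid, deprojected to a 3D point (meters).
        dist = depth_frame.get_distance(cx, cy)
        intrin = depth_frame.profile.as_video_stream_profile().get_intrinsics()
        x, y, z = rs.rs2_deproject_pixel_to_point(intrin, [cx, cy], dist)
        print(f"pen at ({x:.3f}, {y:.3f}, {z:.3f}) in the camera frame")
finally:
    pipeline.stop()
```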
Robot Manipulations
After finding the 3D position of the pen in the camera’s frame, I then translate this position into the frame of the robot. Once this point has been calculated, I call set_ee_cartesian_trajectory() from the PincherX’s Python library to plan and execute a trajectory to the location of the pen.
Once this trajectory is finished, the gripper will close, and the robot will have grasped the pen.
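Below is a hedged sketch of this step, assuming the ROS 2 version of the Interbotix Python API (interbotix_xs_modules); method names differ slightly between library versions, and newer releases also require the Interbotix startup helpers. The axis swap and offset inside camera_to_robot() are stand-ins for the hardcoded transform in robot_control.py, not the repo’s actual values:

```python
import numpy as np
from interbotix_xs_modules.xs_robot.arm import InterbotixManipulatorXS

# Stand-in for the hardcoded camera-to-robot offset (meters); the real
# values live on line 16 of robot_control.py and depend on your setup.
CAMERA_OFFSET = np.array([0.260, 0.150, -0.060])

def camera_to_robot(point_cam):
    # With the camera mounted at 90 degrees to the arm, the translation is
    # roughly an axis swap plus a fixed offset; this exact mapping is an
    # assumption, not the repo's.
    x_c, y_c, z_c = point_cam
    return np.array([z_c, -x_c, -y_c]) + CAMERA_OFFSET

bot = InterbotixManipulatorXS(robot_model="px100", group_name="arm", gripper_name="gripper")

pen_cam = np.array([0.02, 0.01, 0.35])  # example pen position from the vision script
target = camera_to_robot(pen_cam)
current = bot.arm.get_ee_pose()[:3, 3]  # translation column of the 4x4 end-effector pose
dx, dy, dz = target - current

# set_ee_cartesian_trajectory() takes displacements relative to the current
# end-effector pose; on the 4-DOF PX100 the waist handles left-right motion,
# so only x and z are commanded here.
bot.arm.set_ee_cartesian_trajectory(x=dx, z=dz)
bot.gripper.grasp()  # close() in the ROS 1 version of the API
```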
Future Work
I completed this project before I knew anything about ROS or basic robotics math like transformation matrices. I plan to go back and redo the project in the near future to improve upon it, and to learn more about computer vision in the process. There are some hardcoded values that could be eliminated with AprilTags or perhaps additional color masks.
I plan to eventually go back and make this project a ROS2 Iron package.
Gallery
Project Demo
Computer Vision