
[P] My implementation of object tracking using an Xbox 360 Kinect, a dynamixel Pan/Tilt turret, ROS and YOLOv3

This is a short video clip of a project of mine where I’ve used the 2D bounding box data from PJReddie’s YOLOv3 to guide the joint positions of a pan/tilt servo turret.

https://m.youtube.com/watch?v=_iQWCRToUkA

The simplified version of how this works: I created a ROS node that uses OpenCV to synchronize the depth frame with the RGB camera frame from the Kinect 360. In the callback, it takes the pixel coordinates of the center of a detected object’s bounding box, retrieves the depth value at that point from the synchronized depth image (uv → xyz), and converts that to a PoseStamped message that is sent to a modified “head_tracker.py” module from the original rbx2 code (sourced below).
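Roughly, that callback chain looks something like the sketch below (topic names, encoding handling, and the pinhole back-projection are simplified assumptions, not my exact code; the target pixel (u, v) comes from the YOLO bounding-box callback sketched further down):

```python
#!/usr/bin/env python
# Minimal sketch: sync depth with RGB, read depth at the box centre, publish a PoseStamped.
# Topic names and frame handling are assumptions, not the node's exact values.
import rospy
import message_filters
from cv_bridge import CvBridge
from sensor_msgs.msg import Image, CameraInfo
from geometry_msgs.msg import PoseStamped

class DepthAtPixel(object):
    def __init__(self):
        self.bridge = CvBridge()
        self.target_px = None  # (u, v) centre of the latest YOLO box, set by the darknet_ros callback
        self.pose_pub = rospy.Publisher('/target_pose', PoseStamped, queue_size=1)

        rgb_sub = message_filters.Subscriber('/camera/rgb/image_color', Image)
        depth_sub = message_filters.Subscriber('/camera/depth_registered/image_raw', Image)
        info_sub = message_filters.Subscriber('/camera/rgb/camera_info', CameraInfo)
        sync = message_filters.ApproximateTimeSynchronizer(
            [rgb_sub, depth_sub, info_sub], queue_size=5, slop=0.1)
        sync.registerCallback(self.frames_cb)

    def frames_cb(self, rgb_msg, depth_msg, info_msg):
        if self.target_px is None:
            return
        u, v = self.target_px
        depth = self.bridge.imgmsg_to_cv2(depth_msg)   # passthrough: float32 metres or uint16 mm
        z = float(depth[v, u])
        if depth_msg.encoding == '16UC1':              # raw Kinect depth is usually millimetres
            z /= 1000.0
        if z == 0.0 or z != z:                         # no return / NaN at that pixel
            return
        # Back-project the pixel (u, v) to a 3D point with the pinhole camera model.
        fx, fy = info_msg.K[0], info_msg.K[4]
        cx, cy = info_msg.K[2], info_msg.K[5]
        x = (u - cx) * z / fx
        y = (v - cy) * z / fy

        pose = PoseStamped()
        pose.header = depth_msg.header                 # camera optical frame
        pose.pose.position.x, pose.pose.position.y, pose.pose.position.z = x, y, z
        pose.pose.orientation.w = 1.0
        self.pose_pub.publish(pose)

if __name__ == '__main__':
    rospy.init_node('depth_at_pixel')
    DepthAtPixel()
    rospy.spin()
```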

Some of the prep work and background on this project:

I used Google’s OpenImages V4 to train YOLO and create the pretrained weights for “Human Head” detection that I’m using here. (V5 is out now?.. mmmm juicy!) Here’s the tutorial I made for this process: https://github.com/WyattAutomation/Train-YOLOv3-with-OpenImagesV4
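For context, the label-conversion step of that training prep boils down to something like this toy example (single “Human Head” class mapped to id 0 is just an illustration; see the tutorial above for the real pipeline). OpenImages V4 gives boxes as normalized XMin/XMax/YMin/YMax, while darknet wants one line per box of "class x_center y_center width height", also normalized:

```python
def openimages_to_yolo(xmin, xmax, ymin, ymax, class_id=0):
    """Convert one normalized OpenImages V4 box to a darknet label line."""
    x_center = (xmin + xmax) / 2.0
    y_center = (ymin + ymax) / 2.0
    width = xmax - xmin
    height = ymax - ymin
    return "{} {:.6f} {:.6f} {:.6f} {:.6f}".format(class_id, x_center, y_center, width, height)

# Example: a head box occupying the centre quarter of the image.
print(openimages_to_yolo(0.375, 0.625, 0.375, 0.625))
# -> "0 0.500000 0.500000 0.250000 0.250000"
```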

The version of ROS used here is Melodic, running on Xubuntu 18.04. You have to build the freenect dependencies and the ROS package for the 360 Kinect from source (there doesn’t appear to be an apt install option for Melodic?).
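Once that’s built, a quick sanity check like this confirms the driver is actually publishing RGB and depth (the topic names here are the usual freenect_launch defaults, so adjust if yours differ):

```python
#!/usr/bin/env python
# Wait for one RGB and one depth message from the Kinect driver after a from-source build.
import rospy
from sensor_msgs.msg import Image

rospy.init_node('kinect_check')
for topic in ('/camera/rgb/image_color', '/camera/depth_registered/image_raw'):
    msg = rospy.wait_for_message(topic, Image, timeout=10.0)
    rospy.loginfo('%s: %dx%d (%s)', topic, msg.width, msg.height, msg.encoding)
```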

I also used leggedrobotics’ darknet_ros package for YOLO: https://github.com/leggedrobotics/darknet_ros
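My node listens to darknet_ros roughly like this (the topic and the darknet_ros_msgs layout are the leggedrobotics defaults as I understand them; the class name comes from my custom weights):

```python
#!/usr/bin/env python
# Pull the centre pixel of the most confident "Human Head" detection out of darknet_ros.
import rospy
from darknet_ros_msgs.msg import BoundingBoxes

def boxes_cb(msg):
    heads = [b for b in msg.bounding_boxes if b.Class.lower() == 'human head']
    if not heads:
        return
    best = max(heads, key=lambda b: b.probability)
    u = (best.xmin + best.xmax) // 2
    v = (best.ymin + best.ymax) // 2
    rospy.loginfo('target centre pixel: (%d, %d), p=%.2f', u, v, best.probability)
    # In the full node this (u, v) is handed to the synchronized depth callback
    # shown earlier instead of just being logged.

rospy.init_node('yolo_box_listener')
rospy.Subscriber('/darknet_ros/bounding_boxes', BoundingBoxes, boxes_cb, queue_size=1)
rospy.spin()
```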

The robotics hardware used here is a PhantomX pan/tilt turret from Trossen Robotics with two Dynamixel AX-18A servos: https://www.trossenrobotics.com/p/phantomX-robot-turret.aspx

And probably the most important thing to mention of all: I rewrote the “nearest_pointcloud” ROS node and other code from “ROS By Example, Volume 2”, retrofitting it to track objects in 3D space from the 2D bounding box pixel coordinates published by the YOLOv3 ROS package: https://github.com/pirobot/rbx2
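Stripped of all the rbx2 machinery, the tracking logic amounts to something like this (the command topics, gain, and sign conventions are illustrative only, not the actual modified head_tracker.py):

```python
#!/usr/bin/env python
# Turn the target's 3D position (camera optical frame: x right, y down, z forward)
# into pan/tilt angle errors and nudge the servos toward them.
import math
import rospy
from geometry_msgs.msg import PoseStamped
from std_msgs.msg import Float64

class PanTiltTracker(object):
    def __init__(self):
        self.gain = rospy.get_param('~gain', 0.5)      # proportional gain
        self.pan, self.tilt = 0.0, 0.0                 # current commanded angles (rad)
        self.pan_pub = rospy.Publisher('/head_pan_joint/command', Float64, queue_size=1)
        self.tilt_pub = rospy.Publisher('/head_tilt_joint/command', Float64, queue_size=1)
        rospy.Subscriber('/target_pose', PoseStamped, self.target_cb, queue_size=1)

    def target_cb(self, pose):
        p = pose.pose.position
        if p.z <= 0.0:
            return
        # Angular offset of the target from the camera's optical axis.
        pan_err = -math.atan2(p.x, p.z)    # positive pan turns the camera left
        tilt_err = -math.atan2(p.y, p.z)   # positive tilt looks up (optical y points down)
        self.pan += self.gain * pan_err
        self.tilt += self.gain * tilt_err
        self.pan_pub.publish(Float64(self.pan))
        self.tilt_pub.publish(Float64(self.tilt))

if __name__ == '__main__':
    rospy.init_node('pan_tilt_tracker')
    PanTiltTracker()
    rospy.spin()
```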

I’m happy to answer any questions about how I got this working, so feel free to ask!

I have a version of this that uses an Orbbec Astra Pro, and I am nearly finished setting it up to run on a Raspberry Pi 4, with Darknet/YOLO running remotely on my desktop (using a GTX 1060 for YOLO) and publishing ROS topics over WAN or LAN.

I will post progress as it is made, along with comprehensive documentation and source code for this build on my GitHub account, for whoever it may help. If you like it, feel free to visit my Patreon!

submitted by /u/Oswald_Hydrabot