ROS_faster_rcnn

A ROS wrapper for the Python implementation of Faster R-CNN. This wrapper is based on demo.py, which is included in the Python implementation. It publishes messages containing the class, position, size, and probability of the objects detected in the received images.
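The exact ROS message type is defined by the package itself; purely as an illustration, the published fields (class, position, size, probability) could be modeled and filtered like this. The names `Detection` and `filter_detections` are assumptions for this sketch, not the wrapper's actual message definition:

```python
from dataclasses import dataclass

# Hypothetical structure mirroring the fields the wrapper publishes:
# object class, bounding-box position and size, and detection probability.
@dataclass
class Detection:
    label: str         # detected object class, e.g. "person"
    x: int             # top-left corner of the bounding box
    y: int
    width: int
    height: int
    probability: float # detector confidence in [0, 1]

def filter_detections(detections, threshold=0.8):
    """Keep only detections at or above a confidence threshold."""
    return [d for d in detections if d.probability >= threshold]

dets = [Detection("person", 10, 20, 50, 100, 0.95),
        Detection("dog", 5, 5, 30, 30, 0.42)]
print([d.label for d in filter_detections(dets)])  # ['person']
```

A subscriber to the wrapper's topic would typically apply a threshold like this before acting on the detections.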

Faster R-CNN is an object detection framework based on deep convolutional networks, consisting of a Region Proposal Network (RPN) and an Object Detection Network. The two networks are trained to share convolutional layers, enabling fast testing.

Faster R-CNN was initially described in an arXiv tech report.

Installation

  • Clone the repository
    • git clone https://github.com/ChielBruin/ros_faster_rcnn.git --recursive
    • Run git submodule update --init --recursive if the submodules were not cloned correctly
  • Install py-faster-rcnn, located in the libraries folder
    • Follow the guide provided here
    • If you are running Ubuntu 15 or 16, check this or this guide (respectively) for installing the caffe dependency
  • Install all the required ROS dependencies
    • Run rosdep install -y --from-paths src --ignore-src --rosdistro $ROS_DISTRO, where $ROS_DISTRO is your ROS version
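Taken together, the steps above amount to roughly the following shell session (a sketch, not a verified script; adjust paths and substitute your ROS distribution for $ROS_DISTRO):

```shell
# Clone the wrapper together with its submodules
# (py-faster-rcnn lives in the libraries folder as a submodule).
git clone https://github.com/ChielBruin/ros_faster_rcnn.git --recursive
cd ros_faster_rcnn

# If the submodules did not come down correctly, fetch them explicitly.
git submodule update --init --recursive

# Install the ROS dependencies for the packages under src/;
# e.g. ROS_DISTRO=kinetic.
rosdep install -y --from-paths src --ignore-src --rosdistro $ROS_DISTRO
```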

Development notes

This ROS node is being developed as part of this repository. The wrapper functions correctly, but some features are still missing:

  • A ROS service to send a single image and receive the detections
    • Due to this issue, the service would have to execute on the CPU, which makes it tricky to implement correctly
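Such a service could, for instance, be described by a .srv file along these lines. This is a hypothetical sketch only; the field names and types are assumptions, and no such file exists in the repository:

```
# Hypothetical DetectObjects.srv (not part of the repository)
sensor_msgs/Image image      # image to run detection on
---
string[] classes             # class label per detection
float32[] probabilities     # confidence per detection
int32[] bounding_boxes      # x, y, width, height per detection, flattened
```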
