This ROS package enables the creation of a hierarchical 3D scene graph at runtime. In contrast to other solutions, rooms are segmented, classified, and added as nodes as well. A robot using this package must be equipped with a camera that provides both an RGB stream and a depth stream. Ideally, the robot also carries a 360° laser scanner. The package generates a 3D scene graph and comes with tools to visualize it using RViz.
The ros_yolo_node.py node synchronizes the RGB, depth, and odometry streams, segments the current RGB image using YOLOv9, and publishes 2D object bounding boxes. A minimal implementation sketch follows the topic lists below.
Subscribed topics:
- RGB input stream: `/camera/color/image_raw` using `sensor_msgs/Image` messages
- Depth input stream: `/camera/depth/points` using `sensor_msgs/PointCloud2` messages
- Odometry input: `/camera/color/odom` using `nav_msgs/Odometry` messages
Published topics:
- Synchronized RGB stream: `/scene_graph/color/image_raw` using `sensor_msgs/Image` messages
- Synchronized depth stream: `/scene_graph/depth/points` using `sensor_msgs/PointCloud2` messages
- Synchronized odometry stream: `/scene_graph/odom` using `nav_msgs/Odometry` messages
- Detected objects in current frame: `/scene_graph/detected_objects` using `scene_graph/DetectedObjects` messages
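The following is a minimal sketch of how the synchronization and detection step could be implemented. The topic names match the lists above, but the weights file and the way a `scene_graph/DetectedObjects` message is filled are assumptions, not the package's actual code.

```python
#!/usr/bin/env python3
# Sketch: synchronize RGB/depth/odometry, run YOLO, republish the inputs.
import rospy
import message_filters
from sensor_msgs.msg import Image, PointCloud2
from nav_msgs.msg import Odometry
from cv_bridge import CvBridge
from ultralytics import YOLO

bridge = CvBridge()
model = YOLO("yolov9c.pt")  # assumed weights file

def callback(rgb, depth, odom):
    # Republish the time-synchronized inputs for the downstream nodes.
    rgb_pub.publish(rgb)
    depth_pub.publish(depth)
    odom_pub.publish(odom)

    # Run the detector on the current RGB frame.
    frame = bridge.imgmsg_to_cv2(rgb, desired_encoding="bgr8")
    results = model(frame, verbose=False)
    for box in results[0].boxes:
        x1, y1, x2, y2 = box.xyxy[0].tolist()
        rospy.logdebug("class %d at (%.0f, %.0f, %.0f, %.0f)",
                       int(box.cls[0]), x1, y1, x2, y2)
    # ...fill a scene_graph/DetectedObjects message here and publish it.

rospy.init_node("ros_yolo_node")
rgb_pub = rospy.Publisher("/scene_graph/color/image_raw", Image, queue_size=1)
depth_pub = rospy.Publisher("/scene_graph/depth/points", PointCloud2, queue_size=1)
odom_pub = rospy.Publisher("/scene_graph/odom", Odometry, queue_size=1)

subs = [message_filters.Subscriber("/camera/color/image_raw", Image),
        message_filters.Subscriber("/camera/depth/points", PointCloud2),
        message_filters.Subscriber("/camera/color/odom", Odometry)]
message_filters.ApproximateTimeSynchronizer(subs, queue_size=10, slop=0.1).registerCallback(callback)
rospy.spin()
```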
The object_localization_node places the detected objects in global 3D space using the depth stream and the robot's odometry. A localization sketch follows the topic lists below.
Subscribed topics:
- Synchronized RGB stream: `/scene_graph/color/image_raw` using `sensor_msgs/Image` messages
- Synchronized depth stream: `/scene_graph/depth/points` using `sensor_msgs/PointCloud2` messages
- Synchronized odometry stream: `/scene_graph/odom` using `nav_msgs/Odometry` messages
- Detected objects in current frame: `/scene_graph/detected_objects` using `scene_graph/DetectedObjects` messages
Published topics:
- 3D bounding boxes of seen objects in the frame: `/scene_graph/seen_graph_objects` using `scene_graph/GraphObjects` messages
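A rough sketch of the localization idea: back-project a detection into the camera frame and transform it with the robot pose. The pinhole intrinsics and the matrix-based pose below are illustrative assumptions; the actual node works on the `sensor_msgs/PointCloud2` stream and `nav_msgs/Odometry` messages.

```python
# Sketch: lift a 2D bounding-box center into the global frame.
import numpy as np

def pixel_to_camera(u, v, depth, fx, fy, cx, cy):
    """Back-project a pixel with known depth into the camera frame."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.array([x, y, depth, 1.0])

def camera_to_world(point_cam, robot_pose):
    """Transform a camera-frame point into the global frame using the
    robot pose (4x4 homogeneous matrix derived from the odometry)."""
    return (robot_pose @ point_cam)[:3]

# Example: object center at pixel (320, 240), 2.5 m away, identity pose.
T = np.eye(4)
p = pixel_to_camera(320, 240, 2.5, fx=525.0, fy=525.0, cx=319.5, cy=239.5)
print(camera_to_world(p, T))
```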
The room_classification_node classifies a single room with a Random Forest Classifier based on the objects detected in that room. A classification sketch follows the topic lists below.
Subscribed topics:
- Objects in given room: `/scene/graph/room_with_objects` using `scene_graph/RoomWithObjects` messages
Published topics:
- Classified room label: `/scene_graph/classified_room` using `scene_graph/ClassifiedRoom` messages
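A minimal sketch of how such a Random Forest classification could look with scikit-learn. The object classes, room labels, and training data below are made up for illustration and do not reflect the classifier shipped with the package.

```python
# Sketch: classify a room from an object-count feature vector.
from sklearn.ensemble import RandomForestClassifier

OBJECT_CLASSES = ["bed", "chair", "oven", "sink", "sofa", "tv"]  # assumed
ROOM_LABELS = ["bedroom", "kitchen", "living room"]              # assumed

def to_feature_vector(objects):
    """Count how often each known object class appears in the room."""
    return [objects.count(c) for c in OBJECT_CLASSES]

# Tiny illustrative training set: object lists per room and their labels.
rooms = [["bed", "chair"], ["oven", "sink"], ["sofa", "tv", "chair"]]
labels = [0, 1, 2]

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit([to_feature_vector(r) for r in rooms], labels)

# Classify a new room that contains a sofa and a tv.
prediction = clf.predict([to_feature_vector(["sofa", "tv"])])[0]
print(ROOM_LABELS[prediction])  # with this toy data, typically "living room"
```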
The graph_management_node iteratively builds the 3D scene graph at runtime and publishes messages to visualize it in RViz. A visualization sketch follows the topic lists below.
Subscribed topics:
- 3D bounding boxes of seen objects in the frame: `/scene_graph/seen_graph_objects` using `scene_graph/GraphObjects` messages
- Polygons of segmented rooms: `/scene_graph/rooms` using `scene_graph/RoomPolygonList` messages
- Label of classified room: `/scene_graph/classified_room` using `scene_graph/ClassifiedRoom` messages
Published topics:
- Objects in the graph: `scene_graph/graph_objects` using `scene_graph/GraphObjects` messages
- Objects in a given room to be classified: `/scene/graph/room_with_objects` using `scene_graph/RoomWithObjects` messages
- Visualization of object bounding boxes: `/scene_graph/viz/object_bbox_marker` using `visualization_msgs/MarkerArray` messages
- Visualization of room nodes: `/scene_graph/viz/room_markers` using `visualization_msgs/MarkerArray` messages
- Visualization of building nodes: `/scene_graph/viz/building_markers` using `visualization_msgs/MarkerArray` messages
- Visualization of edges: `/scene_graph/viz/line_markers` using `visualization_msgs/MarkerArray` messages
- Visualization of node labels: `/scene_graph/viz/text_markers` using `visualization_msgs/MarkerArray` messages
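As a usage hint, the following sketch publishes a single room node as an RViz marker on one of the visualization topics listed above. The frame id, marker shape, and colors are arbitrary illustration values, not the package's actual styling.

```python
# Sketch: publish one graph node as a latched MarkerArray for RViz.
import rospy
from visualization_msgs.msg import Marker, MarkerArray

rospy.init_node("marker_sketch")
pub = rospy.Publisher("/scene_graph/viz/room_markers", MarkerArray,
                      queue_size=1, latch=True)

marker = Marker()
marker.header.frame_id = "map"          # assumed global frame
marker.header.stamp = rospy.Time.now()
marker.ns = "rooms"
marker.id = 0
marker.type = Marker.SPHERE
marker.action = Marker.ADD
marker.pose.position.x = 1.0
marker.pose.position.y = 2.0
marker.pose.position.z = 2.5
marker.pose.orientation.w = 1.0
marker.scale.x = marker.scale.y = marker.scale.z = 0.4
marker.color.r, marker.color.g, marker.color.b, marker.color.a = 0.2, 0.6, 1.0, 1.0

pub.publish(MarkerArray(markers=[marker]))
rospy.sleep(1.0)  # give RViz time to receive the latched message
```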
- ROS Noetic on Ubuntu 20.04 LTS
- Camera that provides RGB and depth streams (e.g., an Orbbec Astra)
- Laser scanner with a high field of view
- ROS-compatible robot equipped with an IMU
To install this package, first install the CGAL library and the ultralytics package. After that, clone and build this package in your catkin workspace:
```bash
pip install ultralytics
sudo apt-get install libcgal-dev
cd <your_catkin_workspace>/src
git clone https://github.com/sijanz/scene_graph
cd .. && catkin_make
```
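Before running the nodes, source the workspace so ROS can find the newly built package (standard catkin workflow; the exact path depends on your workspace layout):

```bash
source <your_catkin_workspace>/devel/setup.bash
```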
Run the whole scene graph creation pipeline as follows. Use a new terminal window for each command.
```bash
rosrun scene_graph ros_yolo_node.py
rosrun scene_graph object_localization_node
rosrun scene_graph room_classification_node
rosrun scene_graph graph_management_node
```
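To check the pipeline output, you can echo one of the detection topics and inspect the graph in RViz by adding a `MarkerArray` display for each of the `/scene_graph/viz/*` topics listed above (standard ROS tooling, not part of this package):

```bash
rostopic echo /scene_graph/detected_objects
rosrun rviz rviz
```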