A repository containing the codebase for a Python interface to control the Fetch robot for navigation in NVIDIA Isaac Sim.
- Initialization of Fetch to move its base on a ground plane in Isaac Sim. The arm and the RGB camera are both initialized.
- Capability of moving the base to a given global position (x, y) within a given position tolerance (current best is 0.01 metres).
- Receiving a stream of images from the RGB camera attached to the prim `head_tilt_link/head_camera_link`.
- Synthetic data generation, such as 2D and 3D bounding box detection and rudimentary segmentation, using Isaac Sim's built-in libraries (see the sketch below).
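Isaac Sim exposes this kind of synthetic data through its Replicator annotators. The following is a minimal sketch assuming the `omni.replicator.core` API; the resolution and the choice of annotators are illustrative assumptions, not necessarily what this repository uses.

```python
# Minimal sketch, assuming Isaac Sim's omni.replicator.core API.
# The resolution and annotator names are illustrative assumptions.
import omni.replicator.core as rep

# Render product backed by the head camera prim used elsewhere in this README.
render_product = rep.create.render_product(
    "/World/fetch/head_tilt_link/head_camera_link/camera", resolution=(640, 480)
)

# Built-in annotators for 2D bounding boxes and semantic segmentation.
bbox_2d = rep.AnnotatorRegistry.get_annotator("bounding_box_2d_tight")
seg = rep.AnnotatorRegistry.get_annotator("semantic_segmentation")
bbox_2d.attach([render_product])
seg.attach([render_product])

rep.orchestrator.step()      # advance one frame so the annotators have data
boxes = bbox_2d.get_data()   # bounding box coordinates and semantic labels
masks = seg.get_data()       # segmentation image plus id-to-label mapping
```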
- You will need to install Omniverse, within which you install NVIDIA Isaac Sim and its cache, and set up a Nucleus server.
- The Nucleus server must be running at all times while you are working with Isaac Sim, even for the standalone scripts.
- All configuration parameters are stored in the `/config/config.ini` file. They can be changed as the application requires; make sure that all paths and file names are correct.
The path to the Isaac Sim Python wrapper has the following format:
`~/.local/share/ov/pkg/isaac_sim-2023.1.1/python.sh`
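As an illustration of how such parameters might be read, here is a minimal sketch using the standard-library `configparser`; the `[camera]` section and its keys are hypothetical placeholders, not this repository's actual config layout.

```python
# Minimal sketch: reading config/config.ini with the standard library.
# The [camera] section and key names are hypothetical placeholders.
import configparser

config = configparser.ConfigParser()
config.read("config/config.ini")

width = config.getint("camera", "width", fallback=640)
height = config.getint("camera", "height", fallback=480)
frame_rate = config.getfloat("camera", "frame_rate", fallback=30.0)
```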
Execute the following command to launch the driver that navigates Fetch:
`<path to isaac sim python> -m src.launch.fetch_nav_driver`
Execute the following command to launch the sensor data handler's client endpoint for sensor simulation:
`<path to isaac sim python or other python> -m src.sensor_data.sensor_data_handler`
- Enter the global target position coordinates (x, y) in the terminal after executing the Python script mentioned above. Enter (q, q) to exit.
- The script calculates corrected coordinates, reducing the error of about 0.06 metres from the built-in Differential Controller to less than 0.01 metres. (Update: a bug in the coordinate correction logic was fixed in the latest release.)
- Wait for our friend to reach its destination.
- Repeat as many times as you like.
Congratulations, you are now commanding a robot.
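For intuition, the drive loop described above can be pictured as a tolerance-checked step function around Isaac Sim's `DifferentialController`. This is a hedged sketch only: the wheel parameters, gains, and speeds are illustrative assumptions, not this repository's actual implementation.

```python
# Hedged sketch of a drive-to-goal step around Isaac Sim's
# DifferentialController; wheel_radius, wheel_base, and the gains are
# illustrative values, not necessarily Fetch's real parameters.
import numpy as np
from omni.isaac.wheeled_robots.controllers.differential_controller import (
    DifferentialController,
)

controller = DifferentialController(
    name="diff_ctrl", wheel_radius=0.0613, wheel_base=0.375
)

def step_towards(robot, goal_xy, tol=0.01):
    """Drive one step towards goal_xy; return True once within tolerance."""
    position, orientation = robot.get_world_pose()
    error = np.asarray(goal_xy) - position[:2]
    if np.linalg.norm(error) < tol:
        robot.apply_wheel_actions(controller.forward([0.0, 0.0]))  # stop
        return True
    # Yaw from the (w, x, y, z) quaternion returned by get_world_pose().
    w, x, y, z = orientation
    yaw = np.arctan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
    heading = np.arctan2(error[1], error[0])
    ang_err = np.arctan2(np.sin(heading - yaw), np.cos(heading - yaw))
    # Command is [linear m/s, angular rad/s]: turn towards the goal, drive.
    robot.apply_wheel_actions(controller.forward([0.3, 2.0 * ang_err]))
    return False
```

Calling `step_towards(robot, (x, y))` once per physics step until it returns True mirrors the wait-until-arrival behaviour described above.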
- Multiple redundant rotations while travelling long distances along the -x axis.
- Logic should be added to stop rendering either when the robot's position is within the tolerance limit or when it has stopped moving (see the sketch after this list).
- May be prone to errors when travelling long distances or along very large or very small slopes (y/x).
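A minimal sketch of the proposed stop condition, assuming the standard Isaac Sim robot methods `get_world_pose()` and `get_linear_velocity()`; the threshold values are hypothetical.

```python
# Hypothetical stop condition: halt rendering once the robot is within
# the position tolerance or has effectively stopped moving.
import numpy as np

def should_stop_rendering(robot, goal_xy, tol=0.01, vel_eps=1e-3):
    position, _ = robot.get_world_pose()
    within_tol = np.linalg.norm(np.asarray(goal_xy) - position[:2]) < tol
    stalled = np.linalg.norm(robot.get_linear_velocity()[:2]) < vel_eps
    return within_tol or stalled
```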
- An RGB camera is attached to the robot's head at the prim `/World/fetch/head_tilt_link/head_camera_link/camera`; its configuration parameters, such as resolution and frame rate, can be set in the config file.
- Using multithreading, frames are continuously sent from `src/sensor_data.py` to an endpoint in `src/sensor_data/sensor_data_handler.py` using POST requests (see the sketch after this list).
- The client endpoint thus receives the raw, unprocessed images and visualizes them. It could also be used to perform perception and other related tasks.
The output could be sent back to `src/launch/fetch_nav_driver.py` for the controller to take relevant actions.
- This could be extended to simulate other sensors such as LiDAR.
- Multiprocessing with shared memory could be used to improve the frame rate, which is currently limited by the compute constraints of multithreading.
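To make the transport concrete, here is a hedged sketch of the sender side of the frame stream; the endpoint URL, the JPEG encoding, and the stand-in `get_rgb()` function are illustrative assumptions rather than this repository's actual protocol.

```python
# Illustrative sketch of multithreaded frame streaming over HTTP POST.
# The endpoint URL and payload format are hypothetical assumptions.
import threading

import cv2
import numpy as np
import requests

ENDPOINT = "http://localhost:5000/frame"  # hypothetical handler endpoint

def get_rgb():
    """Stand-in for the Isaac Sim camera's RGB getter (hypothetical)."""
    return np.zeros((480, 640, 3), dtype=np.uint8)

def stream_frames(stop_event):
    """Continuously POST JPEG-encoded camera frames to the handler."""
    while not stop_event.is_set():
        ok, buf = cv2.imencode(".jpg", get_rgb())  # compress before sending
        if ok:
            requests.post(
                ENDPOINT,
                data=buf.tobytes(),
                headers={"Content-Type": "image/jpeg"},
            )

# Run the sender on a daemon thread so simulation stepping in the main
# thread is not blocked by network I/O.
stop = threading.Event()
threading.Thread(target=stream_frames, args=(stop,), daemon=True).start()
```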
- In case of a 150 (GLX) error, execute the following command:
`export MESA_GL_VERSION_OVERRIDE=4.6`
- In case of a `Detected blocking function call` error and a consequent slowdown of the program, check for a valid Nucleus server installation in Omniverse and ensure that the server is running.