This discussion was converted from issue #433 on December 12, 2020 12:44.
Brief description of the announcement
Hello @robotology/everyone 👋
iCub-Tech is happy to announce our Software Distro 2020.08 📦
This is a major release that comprises binary packages as well. Therefore, follow the instructions available in the wiki to upgrade your system and/or download the new dependencies.
You can find the tags composing this distro in our Software Versioning Table.
The binaries cover the following platforms:
Detailed context
This distro is shipped with several updates and fixes, including the ones listed hereafter:
🆕 Updates
✨ Robot bazaar website 🌐
We made public our brand new website, designed by Team CODE (@vtikha, @valegagge, @mbrunettini, @aerydna, @AlexAntn, @ilaria-carlini, @lauracavaliere, @vvasco) of iCub-Tech ✨
It contains:
In particular, for time-based releases (e.g. `2020.05`, `2020.08`) we added a HIGHLIGHTS section, where you can find a detailed explanation of this release's work on manipulation in the Gazebo environment.
We invite you to check it out and see it with your own eyes 👀
✨ icub-main CMake modernization 🔨
We deeply refactored the CMake structure of the `icub-main` libraries (see robotology/icub-main#652 for details), introducing the COMPONENTS (`iCubDev`, `ctrlLib`, `skinDynLib`, `iKin`, `iDyn`, `learningMachine`, `perceptiveModels`, `actionPrimitives`, `optimization`) and making the CMake config relocatable.

✨ iCub Gazebo manipulation ready 🖐️
Thanks to the great work of @xEnVrE of the HSP@IIT group and @pattacini, with this latest distro we have an iCub Gazebo model usable for visuo-manipulation.
It is called `iCubGazeboV2_5_visuomanip`, and we created a sandbox to experiment with iCub performing basic object grasping within the Gazebo simulator.
Moreover, you can try deploying it directly from the brand new robot-bazaar, just click here and have fun! 😉
🔘 Click here for more details about manipulation in Gazebo
📖 Overview
With Distro 2020.08 we provide a new URDF model of the iCub robot, specifically tailored for manipulation tasks, as part of the `icub-models` repository. The new `iCubGazeboV2_5_visuomanip` model inherits from the standard `iCubGazeboV2_5` and enables the use of the hands and the eyes, with full support for gazing. Also, the eyes implement vision functionalities via RGB and depth cameras.
Below, you can find a visual comparison of `iCubGazeboV2_5` vs. `iCubGazeboV2_5_visuomanip`.
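As a quick illustration, once `icub-models` is installed and its models directory is listed in `GAZEBO_MODEL_PATH`, the new model can be spawned from a Gazebo world file with a standard SDF include. This is a minimal sketch under those assumptions; the world name and pose below are illustrative, not taken from the release:

```xml
<?xml version="1.0"?>
<sdf version="1.7">
  <world name="icub_visuomanip_world">
    <!-- standard Gazebo scene boilerplate -->
    <include><uri>model://sun</uri></include>
    <include><uri>model://ground_plane</uri></include>
    <!-- spawn the new visuo-manipulation model; the model:// URI resolves
         only if the icub-models install is on GAZEBO_MODEL_PATH -->
    <include>
      <uri>model://iCubGazeboV2_5_visuomanip</uri>
      <pose>0 0 0.6 0 0 0</pose>
    </include>
  </world>
</sdf>
```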
⚙ Implementation details
The new model is NOT automatically generated yet; rather, it is obtained by adding manual modifications on top of `iCubGazeboV2_5`. In the long term, our goal is certainly to complete these steps and produce the URDF entirely automatically.
The kinematic description of the hands and the meshes are derived from Simox, whereas the kinematics of the eyes stems from the official iCub libraries.
Here's a list of PRs that allowed us to achieve this important milestone:

- … `iCubGazeboV2_5_visuomanip` model from within the `icub-models` repository.
- … `icub-models` repository for manually edited robots.
- … `iCubGazeboV2_5_visuomanip` model.
- … `gazebo-yarp-plugins` control board plugin.
- … `gazebo-yarp-plugins` control board plugin.
- Fix the output of the analog encoders of the fingers in order to match that of the real robot within the `gazebo-yarp-plugins` `maissensor` plugin.

🙏🏻 Acknowledgments
The development of the new model was carried out by @xEnVrE of the HSP@IIT research line.
🤖 Applications
The new model has already been adopted in several projects. Here are a couple of examples below.
A sandbox for grasping with iCub
The repository `icub-gazebo-grasping-sandbox` is a self-contained, dockerized sandbox that implements basic object grasping using the new iCub URDF within the Gazebo simulator.

Pouring experiments with iCub
The new model has also been used in an ongoing research project where iCub tracks a moving container while pouring into it.