Whether the program is compatible with HTC Vive devices #98
Hi @SSSSamZhu, thanks for opening the issue. The pipeline we used in https://www.youtube.com/watch?v=r6bFwUPStOA is still in progress, and it is not thoroughly documented yet. In that case, we are using https://github.com/robotology/human-dynamics-estimation/tree/master to estimate the operator's body posture, and we then retarget that estimate onto the robot.
Regarding the use of the HTC Vive, we exploit this device: https://github.com/ami-iit/yarp-device-openxrheadset. Also in this case, its use is not well documented yet (but I am working on a README: https://github.com/ami-iit/yarp-device-openxrheadset/tree/readme).
#71 was a first rough attempt to use the VIVE joysticks to command the robot arms. On the other hand, we are heading in a different direction, and that PR will probably not be merged.
I think that the easiest thing you can try would be to control just the robot neck with the HTC VIVE. The steps, more or less, would be:
Again, apologies for the missing documentation about the whole pipeline, but it is an active research area at the moment and many things are changing. Let me know if you need additional details. |
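As a purely illustrative sketch of what the first of those steps could look like, here is how a YARP device plugin is usually launched with yarpdev. The device name openxrheadset and the port prefix /headset are assumptions, not confirmed in this thread; check the yarp-device-openxrheadset README for the actual parameters.

```sh
# In one terminal: start a YARP name server, unless one is already running on the network
yarp server --write

# In a second terminal: launch the headset device from yarp-device-openxrheadset.
# "openxrheadset" and "/headset" are assumed names; verify them in the
# plugin's documentation before use.
yarpdev --device openxrheadset --name /headset
```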
Hi @S-Dafarra ,
Thank you for your answer and I'm sorry for taking up your time. |
Hi @S-Dafarra , Thank you so much for helping us. |
Apologies for the late answer, it has been a busy period. Let me try to go through your questions.
Yes, now the
Yes, you would need to set up the Windows system using the
Yes, to use |
Hi @S-Dafarra ,
Based on what you did before, can I assume that I can use the HTC Vive to reproduce some of the other functions of your project? I ask because I saw the HTC Vive used in the video. Maybe that part of the functionality streams the images from the iCub robot's eyes back to the HTC Vive? We would like that functionality as well!
Finally, thank you for your patience. |
For the control of the gaze, you can check the
I would suggest referring to https://www.yarp.it/latest/ and https://github.com/robotology/robotology-superbuild
This probably means that the |
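Regarding the YARP and robotology-superbuild references above, here is a rough sketch of how the superbuild is typically cloned and configured on Linux. The profile option ROBOTOLOGY_ENABLE_TELEOPERATION is my assumption for pulling in the teleoperation software; please verify it against the superbuild README.

```sh
# Clone and configure the superbuild
git clone https://github.com/robotology/robotology-superbuild.git
cd robotology-superbuild
mkdir build && cd build

# ROBOTOLOGY_ENABLE_TELEOPERATION is an assumed profile name: check the
# superbuild documentation for the exact option that builds walking-teleoperation.
cmake -DROBOTOLOGY_ENABLE_TELEOPERATION=ON ..
cmake --build .
```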
Hi @S-Dafarra , As I mentioned earlier, we want to add solid presentation and interaction capabilities to the iCub we already have, since it may be needed for more interactive entertainment in the future. I noticed that in your video published in 2019, https://www.youtube.com/watch?v=yELyMYkCyNE, there are segments using the Oculus. I think this would be a stable technical solution, and it shows the effect of using Joypad + Oculus. Could you tell us the matching version and model you are using? To keep this issue clean and the communication efficient, maybe we can move the follow-up discussion to https://github.com/robotology/community/discussions. Finally, thank you very much for your patience. |
No problem at all, do not worry 😉
We were using an old Oculus Rift (one of the first). It is very old, and it cannot be bought anymore. In that case, we were using this other device, but since we moved to the use of the VIVE we are not using it anymore (hence it is not maintained). Notice that also in this way we needed
In order to avoid using
As mentioned above, we are still working on a simpler retargeting application using just the controllers, but this is still a work in progress. |
Do you mean the Oculus Rift CV1? There seems to be some stock on the Chinese market. It would be great to know the exact brand and model you used.
In the superbuild introduction, I saw the oculus module's function description.
All in all, does this mean that using Joypad + Oculus alone is not feasible? https://github.com/robotology/walking-teleoperation/wiki I saw it on a wiki, and it seems a little old.
However, I remember you said that using the VIVE controllers to control the robot hands is not fully supported yet. Do you mean that you use the VIVE as an expression recognition module, and XSens for full-body retargeting, rather than the VIVE or Oculus? |
Back then, we did use the Oculus joysticks to control the robot (though
Anyhow, I guess we are getting a bit off-topic. If your aim is to use a VIVE headset to control the robot, right now the easiest thing you can try is to work with the head and gaze retargeting only, as mentioned in #98 (comment). This requires you to install the |
So sorry, maybe my first question was misleading. What we want to do is reproduce an interactive technology solution on the iCub, like you did before, using Joypad + Oculus to control the movements of the robot's upper body. The first solution I came up with was to use the VIVE, because we happen to have one in our lab. Thanks. |
That's clear. The systems you have seen in the videos have been very complex to set up, and they require a lot of configuration and testing. In all those cases, we were controlling all the robot joints through the
Right now I don't have any how-to guide, nor any steps to follow. We do not have anything ready at the moment to control just the upper body using the VIVE or any other joystick, but it should be possible. In any case, this will require running a series of different modules, and many things can go wrong in the middle. Hence, as a first step, I was suggesting to start by controlling only the neck, so that at least we can start testing the system. I understand this is not exactly what you want, but I don't know how to help you otherwise. |
Morning, @S-Dafarra
I also noticed that you used trackers for retargeting. Is this scheme still supported and maintained? It seems like a great idea. |
Yes, we do use trackers. They are also supported in https://github.com/ami-iit/yarp-device-openxrheadset. With the same pipeline it is possible to get the position and orientation of the joysticks. Controlling the robot accordingly is another story. |
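As a rough way to inspect what such a device publishes, standard YARP command-line tools can be used; the port name below is purely illustrative, and the real names depend on the device configuration.

```sh
# List all ports currently registered on the YARP name server
yarp name list

# Dump what a given pose port is streaming. "/headset/tracker_0/pose" is a
# made-up name: replace it with one of the ports found above.
yarp read /reader /headset/tracker_0/pose
```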
Hi, @S-Dafarra
|
Hi @SSSSamZhu, I am afraid this would be much harder than expected. I have been discussing this with some colleagues in the lab and we will try to release an application like those in https://robot-bazaar.iit.it/applications to perform upper-body retargeting. Nonetheless, this might take several weeks. Hope it is not a problem for you. |
Hi, @S-Dafarra, Thanks for your quick reply.
|
In principle, it should be possible to use every headset compatible with OpenXR. |
Hi, @S-Dafarra,
If you don't mind, I look forward to taking part in your VIVE testing when the prototype application is completed, because this issue is taking a little too long and I can't wait to see the iCub in action. Thanks, |
Hi @SSSSamZhu, I have recently put together the following file to run neck and eyes retargeting. Please use the following
The app to run upper body retargeting will take some more time to be developed. |
Hi, @S-Dafarra,
I think it will work.
There are two types of files in the package; one is .xml, which I put in the application folder of the Superbuild
I've been running
Thanks. |
Hi @SSSSamZhu, that seems to be an issue with your YARP installation, and I would suggest opening an issue in the relevant repository. Otherwise, I would suggest waiting until we release the app I mentioned above. |
Hi, @S-Dafarra,
|
You did not need to copy those files anywhere, it is sufficient to launch |
Hi, @S-Dafarra,
|
Hi @SSSSamZhu, make sure to also connect the ports in the Teleoperation application. If that does not work, make sure that you are using the external graphics card. You can check https://www.dell.com/support/kbdoc/it-it/000190229/how-to-set-nvidia-video-as-the-default-with-computers-that-have-integrated-and-discrete-video-cards?lang=en for how to do that. |
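In case it helps, the connections listed in the Teleoperation application can also be made by hand from a terminal; the port names below are placeholders, and the real ones are shown in yarpmanager.

```sh
# Check that both endpoints are visible on the network (placeholder names)
yarp exists /icub/head/state:o
yarp exists /walking-teleoperation/neck/input:i

# Connect them with the default tcp carrier
yarp connect /icub/head/state:o /walking-teleoperation/neck/input:i
```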
Hi @S-Dafarra ,
I tried to connect the ports in the Teleoperation application again, but it failed. So far the problem seems to be with the port connections. Does this have anything to do with my configuration on Linux? I notice that the names on both the "from" and "to" ends have turned green. Moreover, when I tried to select the discrete graphics card as the main device, I found that my computer only detected the discrete graphics card and did not detect the integrated graphics on the motherboard, so it was probably using the discrete card in the first place. Sorry for any confusion caused by my system being in Chinese. Thanks. |
Hi @S-Dafarra ,
Finally, I changed the |
Great! You can check if the mjpeg carrier is correctly installed on both the Windows machine and on the iCub head: https://www.yarp.it/git-master/group__carrier__config.html#carrier_config_mjpeg |
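A quick, rough way to test this, assuming the robot camera port has the usual name /icub/cam/left (adjust it to your setup): if the mjpeg plugin is missing on either machine, the connection below should fail with a carrier-related error.

```sh
# In one terminal: open a viewer port on the machine that should receive the images
yarpview --name /viewer

# In another terminal: request the connection with the mjpeg carrier; a failure
# here usually means the mjpeg carrier plugin is missing on one of the two sides.
yarp connect /icub/cam/left /viewer mjpeg
```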
Hi, @S-Dafarra , The solution I tried:
Then I think I need to turn on YARP_COMPILE_CARRIER_PLUGINS and then ENABLE_yarpcar_mjpeg_carrier in CMake, as mentioned in the link. Awkwardly, because I use the SuperBuild, I cannot find a standalone YARP folder to configure separately. Please tell me what to do next.
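For what it's worth, here is a sketch of how the YARP CMake options can usually be changed from inside the superbuild build tree. The path below follows the usual superbuild layout but may differ on your machine, and the carrier option may be named ENABLE_yarpcar_mjpeg or ENABLE_yarpcar_mjpeg_carrier depending on the YARP version.

```sh
# Each package built by the robotology-superbuild has its own build directory;
# this is the usual location of YARP's, but double-check it on your system.
cd robotology-superbuild/build/src/YARP

# Enable carrier plugins and the mjpeg carrier, then rebuild only YARP.
# The option may be ENABLE_yarpcar_mjpeg or ENABLE_yarpcar_mjpeg_carrier
# depending on the YARP version.
cmake -DYARP_COMPILE_CARRIER_PLUGINS=ON -DENABLE_yarpcar_mjpeg=ON .
cmake --build .
```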
Let me add some details about the errors the machine reports.
Finally, I would like to know when the application will be available. I hope this question doesn't come across as too abrupt.
Hi @SSSSamZhu, sorry but the problems you are listing above are not specific to this repo. I would like to avoid cluttering this issue too much. I would suggest you open specific issues in the relevant repos. The idea behind the application was specifically meant to avoid these kinds of configuration issues. On the other hand, its development might take some time. I will keep you posted in case there is any update. |
Hi,
For robot demonstration and interaction purposes, we hope to reproduce the work with a VR head-mounted display (teleoperating the robot's upper body through the Vive). But our VR headset doesn't seem to be the Oculus you're using. I have noted your relevant discussions. Does this mean I can reproduce your results on the Vive at present?
#60
#71
from iCub Shenzhen01
Thanks!