
Do I have to buy a VR device to test the demo? #10

Open
YiChenCityU opened this issue Jan 4, 2023 · 4 comments
Labels
question Further information is requested

Comments

@YiChenCityU

As above.

@keli95566
Member

Yes, the main focus of this repository is immersive applications with a VR headset + SteamVR. Please feel free to fork this repo and build non-VR desktop applications.

@keli95566 keli95566 added the question Further information is requested label Jan 5, 2023
@rocco-haro

Hi @keli95566 , we are interested in following your recommendation on forking this repo to build a desktop application. Could you give us some pointers on what would have to change to make this repo work in non-VR settings? Or perhaps a better question/request is some architectural documentation on how this repo currently operates?

At the moment we have a rigid tech stack where we use OpenGL and URP to perform server-side rendering and stream it to the browser. Is your repo running instant-ngp alongside Unity, or are you importing a mesh converted from the NeRF into Unity?

And as always, thank you very much for your contribution to the open source community. You're a blessing!

@keli95566
Member

Hi @rocco-haro , thank you for the questions!

Q: "Is your repo running instant-ngp alongside Unity, or are you importing a mesh converted from the NeRF into Unity?"
A: The repo runs on top of instant-ngp through a Unity native rendering plugin we developed. This manuscript documents how Unity native rendering works together with instant-ngp. It is actually quite simple: since instant-ngp uses an OpenGL backend, we only need to pass texture handle pointers from Unity to instant-ngp, and instant-ngp renders images directly into those Unity textures.

Q: "Could you give us some pointers on what would have to change to make this repo work in non-VR settings?"
A: To develop for non-VR settings, feel free to refer to the StereoNerfRenderer.cpp and unity.cu scripts. For the VR application, the renderer uses two render buffers to create stereoscopic images. For a desktop application, I think you could remove one render buffer and convert the Unity camera transform into a camera view matrix, as done in StereoNerfRenderer.cpp.

It would certainly be interesting to see a desktop application built from Unity, and I hope the quick explanation helps!

@rocco-haro

Fantastic explanation, thank you!
