
Is it possible to define which GPU an indirect display attaches to? #43

Open
DistractionRectangle opened this issue Feb 11, 2024 · 10 comments
Labels: enhancement (New feature or request), question (Further information is requested)

Comments

@DistractionRectangle

The present behavior is to attach to whatever GPU was used during POST. This can be problematic if the POST GPU is integrated graphics or a dGPU different from the one Sunshine uses for video capture. While a workaround is simple (just change the preferred display adapter used for POST in the BIOS), there are valid use cases and workflows where one might want the indirect adapter to attach to a different GPU than the POST GPU.

It's touched upon in issues #6, #9, and #41.

~ Sent from a remote desktop rendered on the VDD driver ❤️

@bud3699
Collaborator

bud3699 commented Feb 16, 2024

Is this more of a feature you want added than an issue?

@bud3699 added the enhancement (New feature or request) and question (Further information is requested) labels Feb 16, 2024
@DistractionRectangle
Author

It's mostly a question of whether the feature already exists. From what I gather, it wasn't possible in the past, but I don't know if that has changed recently, as was the case with HDR. Further, given that the official documentation isn't great, I figured it would be easier to ask those already familiar with it than to try to chase it down solo. Based on your response, I take it this currently isn't supported.

If it is possible, I'd be happy to try to contribute the code for that if I can get a nudge towards the relevant documentation.

@franklin050187

Also looking into what to change to specify the GPU.
Use case: I use Parsec to connect to a headless server that has two GPUs, but when I load one GPU for AI inference, Parsec crashes. I am looking for a way to have Parsec stream with the second GPU.
I did read somewhere that it attaches to the GPU used during POST, so it should be feasible.
Does it use WDDM?

@abhidash29

abhidash29 commented Mar 4, 2024

I'm also looking for a way to handle this issue.

I currently have VDD set up to stream with Moonlight/Sunshine on my local network, so I can use an old Surface Pro as a secondary display for my laptop. VDD assigned the virtual display to the Intel iGPU instead of the Nvidia dGPU. That's a bit of a problem when I'm also streaming a 720p+ YouTube video at the same time, since Sunshine is forced to use the encoder of the GPU it thinks the display is connected to. I guess the simultaneous encode/decode of two H.264 streams is too much for it to handle, because CPU temps shoot up pretty quickly, making other applications more sluggish while the dGPU just idles.

It would be great if there were a way to specify which GPU to use during installation, maybe through the option.txt file.

// --- Line 242, Driver.cpp ---
// Find the specified render adapter
hr = DxgiFactory->EnumAdapterByLuid(AdapterLuid, IID_PPV_ARGS(&Adapter));

I'm guessing that something needs to be changed/added here, but I don't really know what that would entail.

EDIT:
Looks like this could be a solution: https://stackoverflow.com/questions/49702059/dxgi-integred-adapter
It's not as convenient as explicitly specifying a GPU by name, but I feel like ordering adapters from highest to lowest performance and selecting the first or last in the list would be sufficient for most people running into this issue.
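A rough sketch of what that approach might look like, assuming the driver's existing DXGI factory (untested; PickHighPerformanceAdapter is a hypothetical helper, not existing driver code, and IDXGIFactory6 requires Windows 10 1803+, so the current EnumAdapterByLuid path would still be needed as a fallback):

// Instead of EnumAdapterByLuid with the LUID Windows hands the driver,
// ask DXGI for adapters ordered by performance and take the first one.
#include <dxgi1_6.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

ComPtr<IDXGIAdapter4> PickHighPerformanceAdapter(IDXGIFactory* BaseFactory)
{
    ComPtr<IDXGIFactory6> Factory6;
    ComPtr<IDXGIAdapter4> Adapter;

    // IDXGIFactory6 is only available on newer Windows 10 builds.
    if (SUCCEEDED(BaseFactory->QueryInterface(IID_PPV_ARGS(&Factory6))))
    {
        // Index 0 with DXGI_GPU_PREFERENCE_HIGH_PERFORMANCE is normally the
        // dGPU on hybrid systems; on failure the caller falls back to the LUID path.
        Factory6->EnumAdapterByGpuPreference(
            0, DXGI_GPU_PREFERENCE_HIGH_PERFORMANCE, IID_PPV_ARGS(&Adapter));
    }
    return Adapter;
}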

EDIT 2:
I attempted to work around this by disabling and uninstalling the iGPU in Device Manager, so VDD is forced to select the dGPU when being installed. Unfortunately, the virtual display somehow gets re-assigned to the iGPU as soon as I enable it again. I don't really know enough about DXGI and the IDD to understand why this occurs. :(

@Goldenfreddy0703

Hey, so I'm sorry to say this, but I don't think this is possible. I've been doing a lot of research on this, and ever since the Windows 10 April 2018 update it has caused problems on a lot of computers, especially mine, due to Microsoft's graphics settings. We don't have the option to change which adapter drives the display; it just defaults to the iGPU, which absolutely sucks. I tried everything while researching, and there are honestly no solutions out there. Maybe that Stack Overflow post may help, but I'm not too sure. If you do find a solution, please notify me, since it would definitely help if you figure out how to have the dGPU drive an external display.

Thank you

@franklin050187

Found a workaround, as follows (specs: AMD RX 570 used for the streaming display with Parsec, in slot 2; RTX 3090 used for GPU compute, in slot 1):

1. Plug an HDMI dongle into the GPU used to display or cast the screen (here, the RX 570).
2. Use this script: https://github.com/nethe-GitHub/select_default_GPU - it enables the ability to select a high GPU and a low GPU.
3. Set Windows to use the low GPU for the Parsec app.
4. Set Windows to use the high GPU for any game.

Hope this helps; it's not an all-in-one solution, but it works (see the sketch below for what the script effectively changes).
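As far as I can tell, that kind of script toggles the per-app GPU preference that Windows 10 1803+ stores in the registry under HKCU\Software\Microsoft\DirectX\UserGpuPreferences. A minimal sketch of setting it programmatically, assuming that key (the helper name and the example exe path are placeholders, not anything the script actually ships):

#include <windows.h>
#include <string>

// Sets the Windows "graphics performance preference" for one executable.
// 0 = let Windows decide, 1 = power saving (low GPU), 2 = high performance (high GPU).
bool SetAppGpuPreference(const std::wstring& exePath, int preference)
{
    std::wstring value = L"GpuPreference=" + std::to_wstring(preference) + L";";
    LSTATUS status = RegSetKeyValueW(
        HKEY_CURRENT_USER,
        L"Software\\Microsoft\\DirectX\\UserGpuPreferences",
        exePath.c_str(),          // the value name is the full path to the .exe
        REG_SZ,
        value.c_str(),
        static_cast<DWORD>((value.size() + 1) * sizeof(wchar_t)));
    return status == ERROR_SUCCESS;
}

// e.g. SetAppGpuPreference(L"C:\\path\\to\\parsecd.exe", 1);  // placeholder path

The same per-app preference can also be set by hand through the Windows graphics settings UI.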

@Goldenfreddy0703

That actually might help, thank you. I may test this tomorrow.

@franklin050187

(Quoting @abhidash29's comment above in full.)

Did you try to enumerate the adapters?

// Enumerate all available GPU devices (EnumerateAdapters here is a hypothetical helper)
std::vector<Microsoft::WRL::ComPtr<IDXGIAdapter1>> adapters = EnumerateAdapters(DxgiFactory);

My programming skills aren't enough to go beyond that, but I bet you could patch it so that, when installing, it enumerates the adapters and then picks the one to attach the monitor to (see the sketch below).
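Something like this might be the shape of it (untested; FindAdapterByName and the idea of reading a GPU name from option.txt are assumptions, not existing driver code):

#include <dxgi.h>
#include <wrl/client.h>
#include <string>
using Microsoft::WRL::ComPtr;

// Returns the first adapter whose description contains namePart
// (e.g. L"RTX" or L"RX 570"), or nullptr if nothing matches.
ComPtr<IDXGIAdapter1> FindAdapterByName(IDXGIFactory1* Factory, const std::wstring& namePart)
{
    ComPtr<IDXGIAdapter1> Adapter;
    for (UINT i = 0; Factory->EnumAdapters1(i, &Adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 Desc;
        if (SUCCEEDED(Adapter->GetDesc1(&Desc)) &&
            std::wstring(Desc.Description).find(namePart) != std::wstring::npos)
        {
            // Desc.AdapterLuid could then be fed to the existing EnumAdapterByLuid call.
            return Adapter;
        }
    }
    return nullptr;
}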

@anonymous-gy

This would be very useful for some old laptops.
An old laptop will use the iGPU (the GPU built into the CPU, not good for gaming) when no extra display is connected, forcing the video stream to fall back to H.264 and wasting CPU performance.

@matheusfaustino

Yes, see this: #110 (comment)
