I've recently been using the OBS Virtual Camera, and was hoping to do so at 4K60.
Up until now I've been using a GTX 1080, but I quickly realized it wasn't quite powerful enough to run OBS with a 4K60 base canvas while the Virtual Camera was being pulled by 3 applications at once, so I decided to upgrade to an RTX 2080 Ti. Before I did, I checked with a friend who already had an RTX 2080 Ti to verify that it could handle what I wanted, and it could: with a 4K60 canvas and the Virtual Camera running, his 3D usage was below 30%. However, now that I have an RTX 2080 Ti in hand, I'm seeing much worse performance, not only compared to his system but even compared to my GTX 1080... What?
As you can see, the GTX 1080 is under a 69% load while the RTX 2080 Ti is under an 82% load when running an instance of OBS with identical settings. To choose which GPU I'm using, I plug my main display into the GPU I want and then restart my PC; Windows sets the primary GPU to whichever one is driving the primary display at boot.
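In case anyone wants to compare against their own setup, here's roughly how I'd sample the load on both cards side by side while the same OBS scene is running. This is just a minimal sketch, not what produced the numbers above; it assumes nvidia-smi is on the PATH, and it reports overall GPU utilization rather than the Task Manager "3D" engine reading specifically:

[CODE=python]
# Sketch: sample per-GPU utilization once per second via nvidia-smi,
# so the GTX 1080 and RTX 2080 Ti can be compared under the same load.
# Assumes nvidia-smi is installed and on the PATH.
import subprocess
import time

def log_gpu_utilization(interval_s: float = 1.0, samples: int = 30) -> None:
    for _ in range(samples):
        out = subprocess.run(
            ["nvidia-smi",
             "--query-gpu=index,name,utilization.gpu",
             "--format=csv,noheader"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
        print(out)  # e.g. "0, NVIDIA GeForce RTX 2080 Ti, 82 %"
        time.sleep(interval_s)

if __name__ == "__main__":
    log_gpu_utilization()
[/CODE]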
I thought the issue might be that there are 2 GPUs in the system, so I completely disabled the GTX 1080, but no dice... It's also worth noting that both GPUs are in x16 slots, and I'm running them in a system with an AMD Threadripper 1950X. Next I thought that maybe the 2080 Ti I got is defective, but when I run a basic benchmark (UserBenchmark) outside of OBS, it shows the 2080 Ti performing 70% better than my GTX 1080.
I've attached two log files: the first is with the GTX 1080 as the primary GPU, and the second is with the RTX 2080 Ti as the primary GPU. I'm completely baffled; any idea how this could be happening?